
America “Burning”: Facebook watched as Trump ignited hatred

Columbus, Ohio — Reports of hateful and violent posts on Facebook began flooding in on the night of May 28 last year, shortly after President Donald Trump sent a warning on social media that looters in Minneapolis would be shot.

It had been three days since Minneapolis police officer Derek Chauvin knelt on George Floyd’s neck for more than eight minutes, until the 46-year-old Black man lost consciousness and showed no signs of life. Video shot by bystanders had been viewed millions of times online. Protests had gripped Minnesota’s largest city and were quickly spreading across the United States.

But after Trump posted about Floyd’s death, reports of violence and hate speech grew “rapidly” on Facebook across the country, an internal company analysis of the former president’s social media post reveals.

“These THUGS are dishonoring the memory of George Floyd, and I won’t let that happen,” Trump wrote from his Twitter and Facebook accounts at 9:53 p.m. on May 28. “Any difficulty and we will assume control but, when the looting starts, the shooting starts!”

After that, the former president was suspended from both Twitter and Facebook.

The leaked Facebook documents provide a firsthand look at how Trump’s social media posts ignited more anger in an already deeply divided country that was ultimately set “on fire” with reports of hate speech and violence across the platform. Facebook’s own internal, automated controls, meant to catch posts that violate its rules, predicted with almost 90% certainty that Trump’s message broke the tech company’s rules against inciting violence.

Still, the tech giant took no action in response to Trump’s message.

Offline the next day, protests, some of which turned violent, engulfed nearly every U.S. city, big and small.

“When people look back at the role Facebook played, I wouldn’t say Facebook caused it, but Facebook was certainly the megaphone,” said Lanier Holt, a professor of communications at Ohio State University. “I don’t think there’s any way they can get out of saying they made the situation worse.”

Meanwhile, social media rival Twitter responded swiftly at the time, covering Trump’s tweet with a warning and prohibiting users from sharing it further.

Facebook’s internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of Frances Haugen, a former Facebook employee turned whistleblower. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

The Wall Street Journal previously reported that Trump was among many prominent users, including politicians and celebrities, who were exempted from some or all of the company’s normal enforcement policies.

Reports of hate speech and violence were mostly confined to the Minneapolis area after Floyd’s death, the documents say.

“But after Trump’s post on May 28, the situation really escalated nationwide,” according to a memo from June 5 of last year.

According to the internal analysis, reports of violence on Facebook increased fivefold and complaints of hate speech tripled in the days following Trump’s post. Reports of false news on the platform doubled. Reshares of Trump’s message generated “a fair amount of hateful and violent comments,” many of which Facebook worked to remove. Some of those comments included calls to “start shooting these thugs” and “f — white.”

By June 2, a Facebook employee wrote in the June 5 memo, the growing reports of hate speech and violence made it clear that the entire country was basically “on fire.”

Facebook says it is impossible to separate how many of the hate speech reports were driven by Trump’s post itself and how many by the broader controversy over Floyd’s death.

“This spike in user reports resulted from a pivotal moment in the racial justice movement, not from a single Donald Trump post about it,” a Facebook spokesperson said in a statement. “Facebook often reflects what’s happening in society, and the only way to prevent spikes in user reports during these moments would be to not allow them to be discussed on our platform at all, which we would never do.”

But the internal findings also call into question the public statements Facebook CEO Mark Zuckerberg made at the time in defense of his decision to leave Trump’s post in place.

For example, on May 29 Zuckerberg said he had looked closely at whether Trump’s words violated the company’s policies and concluded that they did not. Zuckerberg also said he left the post up because it warned people about Trump’s plan to deploy troops.

“I know many people are upset that we’ve left the President’s posts up, but our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies,” Zuckerberg wrote on his Facebook account late on May 29, as protests broke out across the country.

Nevertheless, Facebook’s own automated enforcement controls determined that the post likely broke its rules.

“Our violence and incitement classifier was almost 90% sure that this (Trump) post violated Facebook’s … policy,” the June 5 analysis said.

That contradicts what Zuckerberg told civil rights leaders last year to ease concerns that Trump’s post was a specific threat to Black people protesting Floyd’s death, said Rashad Robinson, president of the civil rights advocacy group Color Of Change. The group also helped lead a boycott of Facebook in the weeks following Trump’s post.

“To be clear, I had a direct conversation with Zuckerberg days after that post, and he gaslighted me; he specifically pushed back on the idea that it violated their rules,” Robinson said in an interview with the AP last week.

To curb the former president’s ability to stir up ugly reactions on the platform, Facebook employees last year proposed limiting the resharing of similar posts that might violate Facebook’s rules.

But Trump continued to use his Facebook account, followed by more than 32 million people, to fire up his supporters through much of the remainder of his presidency. In the days leading up to the deadly January 6 siege in Washington, he regularly promoted the false claim that widespread voter fraud had cost him the White House, and hundreds of his supporters stormed the U.S. Capitol demanding that the election be overturned.

Facebook finally pulled him off the platform in January, as Trump was leaving the White House after the Capitol riot, and later announced that his account would remain suspended until at least 2023.

Jennifer Mercieca, a Texas A&M University professor who has closely studied the former president’s rhetoric, said there is a reason Facebook waited so long to act.

“Facebook really benefited from Trump and his ability to attract attention and engagement through anger,” Mercieca said. “They wanted Trump to keep going.”
