Meta has been told its treatment of high-profile users, such as former US President Donald Trump, left dangerous content online, serving business interests at the expense of its human rights obligations.
A damning report published on Tuesday from the company's oversight board—a "Supreme Court"-style body created by the parent company of Facebook, Instagram, and WhatsApp to rule on sensitive moderation issues—has urged the social media giant to make "significant" changes to its internal system for reviewing content from politicians, celebrities, and its business partners.
The board, which started assessing cases last year, issues independent judgments on high-profile moderation cases, as well as policy recommendations, to Meta's leadership, including its policy chief and former UK deputy prime minister Sir Nick Clegg.
The board was asked to look into the system after The Wall Street Journal and whistleblower Frances Haugen revealed its existence last year, raising concerns that Meta was giving preferential treatment to elite figures.
Clegg also has until January 7 to decide whether to allow Trump back on to the platform following a separate recommendation by the board.
After a lengthy investigation spanning more than a year, the board has demanded that Meta more closely audit who is on the so-called "cross-check" list and be more transparent about its review procedures.
The report is one of the most in-depth probes yet into moderation issues at Meta, as the independent body—comprising 20 journalists, academics, and politicians—has grappled with concerns that it has little power to hold the company accountable.
It piles further pressure on chief executive Mark Zuckerberg, who last month announced plans to cut 11,000 staff amid declining revenues and growth, to ensure Meta's content is policed fairly.
Meta has already begun to revamp the system. In a blog post on Tuesday, Clegg said it was originally developed to "double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe." He added that the company had now developed a more standardized system, with further controls and annual reviews.
It remains unclear how many people are on the secretive list. The Wall Street Journal, which first reported the list, estimated that by 2020, there were 5.8 million users listed. Meta has previously said there were 666,000 as of October 2021.
The system meant that content posted by well-known personalities, such as Trump and Elizabeth Warren, would remain on the platforms until human moderators had reviewed it, even if the same posts would have been automatically removed had they come from an ordinary user.
On average, this human review took five days, during which the content stayed on the platform; in one case, the report found, the delay stretched to seven months.
Meta's "own understanding of the practical implications of the program was lacking," the board said, adding that the company had failed to assess whether the system worked as intended.
The board also accused the company of giving "insufficient" responses to the investigation, sometimes taking months to respond.
The board referenced a Wall Street Journal report that detailed how Brazilian footballer Neymar posted non-consensual intimate imagery of another person on to his Facebook and Instagram accounts, which was viewed more than 50 million times before removal. According to Meta, this was because of a "delay in reviewing the content due to a backlog at the time."
Thomas Hughes, director of the oversight board, said the Neymar incident was one example of how business partnerships could impact moderation processes.
"It opens up concerns… about relationships between individuals in the company and whether that might influence decision-making," he said.
"There was probably a conflation of different interests within this cross-check process," he added.
The report follows earlier public tensions between the board and Meta: in September 2021, the board accused the company of withholding information about the system. Many observers see the board as an attempt to put distance between Meta's executives and difficult decisions about free speech.
Meta now has 90 days to respond to the recommendations.