Meta, the owner of Facebook and Instagram, was heavily criticized by a company-appointed oversight board on Tuesday for policies that give celebrities, politicians and business associates special treatment compared to the vast majority of its users.
The criticism came from the Oversight Board, which Meta created to adjudicate thorny policy questions related to free speech, human rights and content moderation.
“The board is concerned about how Meta has prioritized business interests when moderating content,” the board said in a report. The cross-check program, it said, “provides additional protection for certain users’ expression.”
The oversight board recommended that Meta overhaul the cross-check system by “radically” increasing transparency about who is on the program’s list of V.I.P.s and by hiding their posts while they are under review. Meta should also prioritize moderating speech that is “of particular public importance,” it added. The recommendations of the board, which consists of about 20 academics, human rights experts and lawyers, are not binding.
The report was a reminder of the power social networks have to decide which posts to keep, which to delete and how to treat specific accounts. Twitter, Facebook, TikTok and others have long faced scrutiny for making unilateral decisions about content on their platforms that can influence political debates and social issues.
Elon Musk, the new owner of Twitter, is now in the spotlight over how his social media service will moderate content. Twitter had policies in place to bar disinformation and hate speech from the platform, but Mr. Musk has said he believes in unfettered discourse and has dropped enforcement of some of those policies.
Meta has de-emphasized its social networking activities in recent months following criticism of toxic content on those platforms. Mark Zuckerberg, the company’s CEO, has instead prioritized entering the immersive digital world of the metaverse. Meta has spent billions of dollars on the shift, though it’s unclear whether consumers will embrace metaverse-related products. The company recently laid off more than 11,000 employees, or about 13 percent of its workforce.
“The board’s call for a major overhaul of content moderation rules will create more fairness and a level playing field, holding V.I.P. users to the same rigorous standards as regular users,” said Brian Uzzi, a professor at the Kellogg School of Management at Northwestern University. “To avoid chaos, there should be one rule to rule them all.”
Nick Clegg, Meta’s president of global affairs, said Tuesday that Meta had created the cross-check system to prevent the mistaken removal of posts from having an outsized impact. He said the company would respond to the board’s report within 90 days.
The oversight board began investigating the cross-check program last year after its existence was reported by The Wall Street Journal, drawing on documents provided by the whistleblower Frances Haugen. The board sharply criticized the company last year for not being transparent about the program.
On Tuesday, the oversight board found that the cross-check program gave high-profile users an additional layer of human review before their posts could be removed for violating the company’s terms of service. The board criticized the company for its lack of transparency and for the “unequal treatment” that favored the most influential and powerful users of Facebook and Instagram at the expense of human rights and the company’s stated values. In one case, the report said, Meta took seven months to reach a final decision on a piece of content posted by an account in the cross-check program.
Mr. Zuckerberg had pushed for the oversight board to be created so that his company would not be the only entity making content moderation decisions. Since the board began hearing cases in the fall of 2020, it has raised a number of substantive objections to Meta’s actions.
In 2021, the board recommended that Meta restore photos of post-operative breasts that the company’s automated systems had removed for nudity. The photos, which Meta restored, had been posted by a Brazilian Instagram user as part of a breast cancer awareness campaign. The board criticized Meta’s reliance on automated systems to remove posts.
The board also considered Meta’s suspension of former President Donald J. Trump from Facebook and Instagram after the January 2021 U.S. Capitol riot. In May 2021, the board said that Meta should reconsider its decision to bar Mr. Trump indefinitely, adding that the company did not have proper systems in place for permanently suspending the former president.
Mr. Trump was part of the cross-check program. The board chided Meta for not being “fully candid” in its disclosures about cross-check, including which accounts were part of it.
Mr. Clegg has since said that Meta will decide by January whether Mr. Trump’s accounts can be reinstated.
Thomas Hughes, the director of the Oversight Board, said Tuesday’s report was “an important step in the board’s ongoing efforts to bring greater accountability, consistency and fairness to Meta’s platforms.”
Other social media companies have considered replicating Meta’s oversight board model. After Mr. Musk took over Twitter, he said he planned to create a “content moderation council.” He has not carried out that plan, and has blamed activists and investors for pressuring him to follow Meta’s model.
Meta also faces the prospect of being barred from showing personalized ads in the European Union without users’ prior consent. Decisions approved this week by a European data protection authority would require the company to allow users of Facebook and Instagram to opt out of advertising based on the personal data Meta collects, according to a person familiar with the decisions.
A final ruling, which can be appealed, is expected to be announced next month by Irish authorities, who serve as Meta’s lead data privacy regulator in Europe because the company’s European headquarters is in Dublin.