Meta, the owner of Facebook and Instagram, took an unusual step last week: it suspended some of the quality controls that ensure posts from users in Russia, Ukraine and other Eastern European countries comply with its rules.
Under the change, Meta temporarily stopped tracking whether the workers who monitor Facebook and Instagram posts from those areas were accurately enforcing its content guidelines, six people with knowledge of the situation said. That was because the workers could not keep up with shifting rules about what kinds of posts about the war in Ukraine were allowed, they said.
Meta has made more than half a dozen content policy revisions since Russia invaded Ukraine last month. The company has allowed posts about the conflict that it would normally have taken down — including some calling for the death of President Vladimir V. Putin of Russia and violence against Russian soldiers — before changing its mind or drafting new guidelines, the people said.
The result is internal confusion, especially among the content moderators who patrol Facebook and Instagram for text and images showing gore, hate speech and incitement to violence. Meta has sometimes shifted its rules daily, causing whiplash, said the people, who were not authorized to speak publicly.
The bewilderment over the content guidelines is just one way Meta has been rattled by the war in Ukraine. The company has also faced pressure from Russian and Ukrainian authorities over the information battle surrounding the conflict. And internally, it has dealt with dismay over its decisions, including from Russian workers who are concerned for their safety and Ukrainian workers who want the company to crack down on Kremlin-affiliated organizations online, three people said.
Meta has endured international strife before — including the genocide of a Muslim minority in Myanmar over the past decade and clashes between India and Pakistan — with mixed success. Now the biggest conflict on the European continent since World War II has become a litmus test of whether the company has learned to police its platforms during major global crises. So far, it appears to remain a work in progress.
“All the ingredients of the Russia-Ukraine conflict have been around for a long time: the calls for violence, the disinformation, the state media propaganda,” said David Kaye, a law professor at the University of California, Irvine, and a former United Nations special rapporteur. “What I find baffling is that they didn’t have a game plan for dealing with it.”
Dani Lever, a spokeswoman for Meta, declined to comment directly on the company’s handling of content decisions and employee concerns during the war.
After Russia invaded Ukraine, Meta said, it set up a round-the-clock special operations team staffed by native Russian and Ukrainian speakers. It also updated its products to aid civilians in the war, including features that direct Ukrainians toward reliable, verified information to find housing and refugee assistance.
Mark Zuckerberg, Meta’s chief executive, and Sheryl Sandberg, its chief operating officer, have been directly involved in the response to the war, said two people with knowledge of the effort. But as Mr. Zuckerberg focuses on transforming Meta into a company that will lead the digital worlds of the so-called metaverse, many of the responsibilities surrounding the conflict have fallen — at least publicly — to Nick Clegg, the president of global affairs.
Last month, Mr. Clegg announced that Meta would restrict access within the European Union to the pages of Russia Today and Sputnik, the Russian state-controlled media outlets, at the request of Ukraine and other European governments. Russia retaliated by cutting off access to Facebook inside the country, claiming that the company discriminated against Russian media, and then by blocking Instagram.
This month, President Volodymyr Zelensky of Ukraine praised Meta for acting quickly to limit Russian war propaganda on its platforms. Meta also moved fast to remove an edited “deepfake” video from its platforms in which Mr. Zelensky falsely appeared to surrender to Russian troops.
The company has also made high-profile mistakes. It allowed a group called the Ukrainian Legion to run ads on its platforms this month recruiting “foreigners” for the Ukrainian military, a violation of international laws. It later removed the ads — which had been shown to people in the United States, Ireland, Germany and elsewhere — because, Meta said, the group may have misrepresented its ties to the Ukrainian government.
Internally, Meta also began changing its content policies to deal with the fast-moving nature of wartime posts. The company has long barred posts that could incite violence. But on February 26, two days after Russia invaded Ukraine, Meta informed its content moderators — who are typically contractors — that it would allow calls for Mr. Putin’s death and “calls for violence against Russians and Russian soldiers in the context of the Ukraine invasion,” according to the policy changes, which were reviewed by The New York Times.
This month, Reuters reported on Meta’s policy shifts with a headline that suggested posts calling for violence against all Russians would be tolerated. In response, Russian authorities labeled Meta’s activities “extremist.”
Soon after, Meta reversed course and said it would not let its users call for the deaths of heads of state.
“Conditions in Ukraine are moving quickly,” Mr. Clegg wrote in an internal memo reviewed by The Times and first reported by Bloomberg. “We try to think through all the consequences, and we constantly monitor our guidance because the context is always evolving.”
Meta has adjusted other policies as well. This month, it made a temporary exception to its hate speech guidelines, allowing users to post about the “removal of Russians” and the “explicit exclusion of Russians” in 12 Eastern European countries, according to internal documents. But within a week, Meta amended the rule to note that it applied only to users in Ukraine.
The constant adjustments caused confusion among moderators who monitor users in Central and Eastern European countries, the six people with knowledge of the situation said.
The policy changes were tricky because moderators were generally given less than 90 seconds to decide whether images of dead bodies, videos of limbs being blown off, or outright calls to violence violated Meta’s rules, they said. In some cases, they added, moderators were shown posts about the war in Chechen, Kazakh or Kyrgyz, despite not knowing those languages.
Ms. Lever declined to comment on whether Meta had hired content moderators specializing in those languages.
Emerson T. Brooking, a senior fellow at the Atlantic Council’s Digital Forensic Research Lab, which studies the spread of disinformation online, said Meta faced a dilemma with wartime content.
“Usually, content moderation policy is intended to limit violent content,” he said. “But war is an exercise in violence. There is no way to sanitize war or pretend it is something else.”
Meta has also faced complaints from employees about its policy shifts. At a meeting this month for workers with ties to Ukraine, employees asked why the company had waited until the war to take action against Russia Today and Sputnik, two attendees said. The Russian state’s activities were central to Facebook’s failure to protect the 2016 U.S. presidential election, they said, and it did not make sense for those outlets to continue operating on Meta’s platforms.
Although Meta has no employees in Russia, the company held a separate meeting this month for workers with Russian connections. Those employees said they feared that Moscow’s actions against the company would affect them, according to an internal document.
In discussions on Meta’s internal forums, which were reviewed by The Times, some Russian employees said they had deleted their workplace from their online profiles. Others wondered what would happen if they worked in the company’s offices in countries that had extradition treaties with Russia and “what risks will be associated with working at Meta, not only for us, but also for our families.”
Ms. Lever said Meta’s heart went out to all of its employees affected by the war in Ukraine, and that its teams were working to make sure they and their families received the support they needed.
At a separate company meeting this month, some employees expressed dismay at the changes to the speech policies during the war, according to an internal poll. Some asked whether the new rules were necessary, calling the changes “a slippery slope” that “was used as proof that Westerners hate Russians.”
Others asked about the effect on Meta’s business. “Will the Russian ban affect our earnings for the quarter? Future quarters?” one question read. “What is our recovery strategy?”