Meta, which owns Facebook and Instagram, took an unusual step last week: It suspended some of the quality controls that ensure that posts from users in Russia, Ukraine and other Eastern European countries meet its rules.
Under the change, Meta temporarily stopped tracking whether its workers who monitor Facebook and Instagram posts from those areas were accurately enforcing its content guidelines, six people with knowledge of the situation said. That’s because the workers could not keep up with shifting rules about what kinds of posts were allowed about the war in Ukraine, they said.
Meta has made more than half a dozen content policy revisions since Russia invaded Ukraine last month. The company has permitted posts about the conflict that it would normally have taken down — including some calling for the death of President Vladimir V. Putin of Russia and violence against Russian soldiers — before changing its mind or drawing up new guidelines, the people said.
The result has been internal confusion, especially among the content moderators who patrol Facebook and Instagram for text and images with gore, hate speech and incitements to violence. Meta has sometimes shifted its rules on a daily basis, causing whiplash, said the people, who were not authorized to speak publicly.
The bewilderment over the content guidelines is just one way that Meta has been roiled by the war in Ukraine. The company has also contended with pressure from Russian and Ukrainian authorities over the information battle about the conflict. And internally, it has dealt with discontent about its decisions, including from Russian employees concerned for their safety and Ukrainian workers who want the company to be tougher on Kremlin-affiliated organizations online, three people said.
Meta has weathered international strife before — including the genocide of a Muslim minority in Myanmar last decade and skirmishes between India and Pakistan — with varying degrees of success. Now the largest conflict on the European continent since World War II has become a litmus test of whether the company has learned to police its platforms during major global crises — and so far, it appears to remain a work in progress.
“All the ingredients of the Russia-Ukraine conflict have been around for a long time: the calls for violence, the disinformation, the propaganda from state media,” said David Kaye, a law professor at the University of California, Irvine, and a former special rapporteur to the United Nations. “What I find mystifying was that they didn’t have a game plan to deal with it.”
Dani Lever, a Meta spokeswoman, declined to directly address how the company was handling content decisions and employee concerns during the war.
After Russia invaded Ukraine, Meta said it established a round-the-clock special operations team staffed by employees who are native Russian and Ukrainian speakers. It also updated its products to aid civilians in the war, including features that direct Ukrainians toward reliable, verified information to locate housing and refugee assistance.
Mark Zuckerberg, Meta’s chief executive, and Sheryl Sandberg, the chief operating officer, have been directly involved in the response to the war, said two people with knowledge of the efforts. But as Mr. Zuckerberg focuses on transforming Meta into a company that will lead the digital worlds of the so-called metaverse, many responsibilities around the conflict have fallen — at least publicly — to Nick Clegg, the president for global affairs.
Last month, Mr. Clegg announced that Meta would restrict access within the European Union to the pages of Russia Today and Sputnik, which are Russian state-controlled media, following requests by Ukraine and other European governments. Russia retaliated by cutting off access to Facebook inside the country, claiming the company discriminated against Russian media, and then blocking Instagram.
This month, President Volodymyr Zelensky of Ukraine praised Meta for moving quickly to limit Russian war propaganda on its platforms. Meta also acted rapidly to remove an edited “deepfake” video from its platforms that falsely featured Mr. Zelensky yielding to Russian forces.
The company has made high-profile mistakes as well. It permitted a group called the Ukrainian Legion to run ads on its platforms this month to recruit “foreigners” for the Ukrainian army, in violation of international law. It later removed the ads — which were shown to people in the United States, Ireland, Germany and elsewhere — because the group may have misrepresented ties to the Ukrainian government, according to Meta.
Internally, Meta had also started changing its content policies to deal with the fast-moving nature of posts about the war. The company has long forbidden posts that might incite violence. But on Feb. 26, two days after Russia invaded Ukraine, Meta informed its content moderators — who are typically contractors — that it would allow calls for the death of Mr. Putin and “calls for violence against Russians and Russian soldiers in the context of the Ukraine invasion,” according to the policy changes, which were reviewed by The New York Times.
This month, Reuters reported on Meta’s shifts with a headline that suggested that posts calling for violence against all Russians would be tolerated. In response, Russian authorities labeled Meta’s activities as “extremist.”
Shortly thereafter, Meta reversed course and said it would not let its users call for the deaths of heads of state.
“Circumstances in Ukraine are fast moving,” Mr. Clegg wrote in an internal memo that was reviewed by The Times and first reported by Bloomberg. “We try to think through all the consequences, and we keep our guidance under constant review because the context is always evolving.”
Meta amended other policies. This month, it made a temporary exception to its hate speech guidelines so users could post about the “removal of Russians” and “explicit exclusion against Russians” in 12 Eastern European countries, according to internal documents. But within a week, Meta tweaked the rule to note that it should be applied only to users in Ukraine.
The constant adjustments left moderators who oversee users in Central and Eastern European countries confused, the six people with knowledge of the situation said.
The policy changes were onerous because moderators were generally given less than 90 seconds to decide whether images of dead bodies, videos of limbs being blown off, or outright calls to violence violated Meta’s rules, they said. In some instances, they added, moderators were shown posts about the war in Chechen, Kazakh or Kyrgyz, despite not knowing those languages.
Ms. Lever declined to comment on whether Meta had hired content moderators who specialize in those languages.
Emerson T. Brooking, a senior fellow at the Digital Forensic Research Lab of the Atlantic Council, which studies the spread of online disinformation, said Meta faced a quandary with war content.
“Usually, content moderation policy is intended to limit violent content,” he said. “But war is an exercise in violence. There is no way to sanitize war or to pretend that it is anything different.”
Meta has also faced employee complaints over its policy shifts. At a meeting this month for workers with ties to Ukraine, employees asked why the company had waited until the war to take action against Russia Today and Sputnik, said two people who attended. Russian state activity was at the center of Facebook’s failure to protect the 2016 U.S. presidential election, they said, and it didn’t make sense that those outlets had continued to operate on Meta’s platforms.
While Meta has no employees in Russia, the company held a separate meeting this month for workers with Russian connections. Those employees said they were concerned that Moscow’s actions against the company would affect them, according to an internal document.
In discussions on Meta’s internal forums, which were viewed by The Times, some Russian employees said they had erased their place of work from their online profiles. Others wondered what would happen if they worked in the company’s offices in places with extradition treaties to Russia and “what kind of risks will be associated with working at Meta not just for us but our families.”
Ms. Lever said Meta’s “hearts go out to all of our employees who are affected by the war in Ukraine, and our teams are working to make sure they and their families have the support they need.”
At a separate company meeting this month, some employees voiced unhappiness with the changes to the speech policies during the war, according to an internal poll. Some asked if the new rules were necessary, calling the changes “a slippery slope” that were “being used as proof that Westerners hate Russians.”
Others asked about the effect on Meta’s business. “Will Russian ban affect our revenue for the quarter? Future quarters?” read one question. “What’s our recovery strategy?”