From CJR: War in Ukraine Is Latest Platform Moderation Challenge

From CJR’s The Media Today by Mathew Ingram:

War in Ukraine is the latest platform moderation challenge

ON MARCH 10, A REUTERS HEADLINE announced that Facebook would temporarily change its content rules to allow users to post calls for the death of Vladimir Putin, Russia’s president, as well as “calls for violence against Russians.” (Reuters later modified its headline to specify “invading Russians.”) Such posts would normally qualify as what Meta calls “T1 violent speech,” which is removed without question. A few days later, Nick Clegg, head of global affairs for Meta, the parent company of Facebook, said the new rules would not allow users to call for the death of Putin or Alexander Lukashenko, the president of Belarus. Clegg also said that calling for violence against Russians would only be allowed for users in Ukraine, and only when “the context is the Russian invasion.”

Yesterday, Ryan Mac, Mike Isaac, and Sheera Frenkel reported in the New York Times that the rules about allowing calls for violence against Putin and Lukashenko were actually changed on February 26, two days after Russian troops first entered Ukraine. Following the Reuters story, which was widely shared, “Russian authorities labeled Meta’s activities as ‘extremist,’” the Times wrote. “Shortly thereafter, Meta reversed course and said it would not let its users call for the deaths of heads of state.” (At the Washington Post, Will Oremus noted that Facebook’s changes also enabled users to praise a “Ukrainian neo-Nazi militia that has been resisting the Russian invasion.”) Clegg told staff in a memo that “circumstances in Ukraine are fast moving. We try to think through all the consequences, and we keep our guidance under constant review because the context is always evolving.”

According to the Times, the many changes to content rules have caused confusion inside Facebook and Instagram, which is also owned by Meta. In an unusual step for the company, the Times reported, Meta “suspended some of the quality controls that ensure that posts from users in Russia, Ukraine and other Eastern European countries meet its rules.” Sources told the Times that Meta temporarily stopped tracking whether its workers who monitor Facebook and Instagram posts were accurately enforcing its content guidelines because those workers couldn’t keep up with the shifting rules about what kinds of posts were allowed about the war in Ukraine. The result of the rule changes “has been internal confusion, especially among the content moderators who patrol Facebook and Instagram for text and images with gore, hate speech and incitements to violence,” the Times wrote.

While some content is removed automatically by algorithms and other software, much of it is left to human moderators, who are on contract—and who, according to the Times story, are often given less than 90 seconds “to decide whether images of dead bodies, videos of limbs being blown off, or outright calls to violence violate the platform’s rules.” Moderators also said they were shown posts about the war in the Chechen, Kazakh, and Kyrgyz languages, which they did not know and so could not properly review.

The rule books governing platform speech and content permissions “need a new chapter on geopolitical conflicts,” Oremus wrote for the Post. Companies such as Meta may feel that their approach to Ukraine is the correct one, but “they haven’t clearly articulated the basis on which they’ve taken that stand, or how that might apply in other settings, from Kashmir to Nagorno-Karabakh, Yemen and the West Bank.” Katie Harbath, a former public policy director at Facebook, told Oremus, “We’re all thinking about the short term” in Ukraine. Harbath added that she would prefer to see the platforms “building out the capacity for more long-term thinking,” noting, “The world keeps careening from crisis to crisis. They need a group of people who are not going to be consumed by the day-to-day.”

Emerson Brooking, a fellow at the Atlantic Council’s Digital Forensic Research Lab, wrote in a piece for Tech Policy Press that moderation is supposed to stem the spread of violent content, but “wars are exercises in violence, fueled by cycles of hate. Accordingly, social media companies will never be able to write a sufficiently nuanced wartime content policy that somehow elides violence, hate, and death.” Meta’s struggles, he argued, “demonstrate an irreconcilable tension in trying to adapt content moderation policy to major conflict.”

Here’s more on the platforms and war:

  • Scale: Contributing to the moderation challenges at a company like Meta or Google is the vast scale of these platforms. Evelyn Douek, a lecturer at Harvard Law School and research scholar at the Knight First Amendment Institute, gave a talk last year at Stanford called “The Administrative State of Content Moderation.” In the 30 minutes it took for her to give the presentation, Douek noted, Facebook would have taken down 615,417 pieces of content, and YouTube about 271,440 videos and channels.
  • Blunders: In addition to its moderation challenges, Meta has made some high-profile mistakes as well, the Times noted. For example, it allowed a group called the Ukrainian Legion to run ads on its platforms this month in an attempt to recruit foreign fighters to assist the Ukrainian army, in violation of international law. Meta later removed the ads—which were shown to people in the United States, Ireland, Germany and elsewhere—because the group may have misrepresented its ties to the Ukrainian government.
  • Orders: Google allegedly told translators not to use the word “war” to describe what’s happening in Ukraine, according to The Intercept—another example of how the legal requirements dictated by operating in a country can hamstring platforms and lead to censorship. “An internal email sent by management at a firm that translates corporate texts and app interfaces for Google and other clients said that the attack on Ukraine could no longer be referred to as a war but rather only vaguely as ‘extraordinary circumstances,’” wrote Sam Biddle and Tatiana Dias.
  • Erasure: In 2014, The Atlantic wrote about how Facebook’s decision to shut down dozens of pages set up by dissidents in Syria “dealt a significant blow to peaceful activists who have grown reliant on Facebook for communication and uncensored—if bloody and graphic—reporting on the war’s atrocities.” At the time, Eliot Higgins, a former blogger who founded Bellingcat, an open-source investigative journalism site, also complained that Facebook was making it difficult to document atrocities in Syria because it kept removing the evidence.
