Social Media, the “Big Lie,” and the Coming Elections

From CJR’s The Media Today by Mathew Ingram headlined “The social media platforms, the ‘Big Lie,’ and the coming elections”:

IN AUGUST, Twitter, Google, TikTok, and Meta, the parent company of Facebook, released statements about how they intended to handle election-related misinformation on their platforms in advance of the midterms. For the most part, it seemed they weren’t planning to change much. Now, with the November 8 midterms drawing closer, Change the Terms, a coalition of about 60 civil rights organizations, says the social platforms have not done nearly enough to stop continued misinformation about “the Big Lie”—that is, the unfounded claim that the 2020 election was somehow fraudulent. “There’s a question of: Are we going to have a democracy?” Jessica González, a Free Press executive involved with the coalition, recently told the Washington Post. “And yet, I don’t think they are taking that question seriously. We can’t keep playing the same games over and over again, because the stakes are really high.”

González and other members of Change the Terms say they have spent months trying to convince the major platforms to do something to combat election-related disinformation, but their lobbying campaigns have had little or no impact. Naomi Nix reported for the Post last week that coalition members have raised their concerns with platform executives in letters and meetings, but have seen little action as a result. In April, Change the Terms called on the platforms to “Fix the Feed” before the elections, requesting that the same companies change their algorithms in order to “stop promoting the most incendiary, hateful content”; “protect people equally,” regardless of what language they speak; and share details of their business models and approaches to moderation.

“The ‘big lie’ has become embedded in our political discourse, and it’s become a talking point for election-deniers to preemptively declare that the midterm elections are going to be stolen or filled with voter fraud,” Yosef Getachew, a media and democracy program director at Common Cause, a government watchdog, told the Post in August. “What we’ve seen is that Facebook and Twitter aren’t really doing the best job, or any job, in terms of removing and combating disinformation that’s around the ‘big lie.’” According to an Associated Press report in August, Facebook “quietly curtailed” some of the internal safeguards designed to smother voting misinformation. “They’re not talking about it,” Katie Harbath, a former Facebook policy director who is now CEO of Anchor Change, a technology policy advisory firm, told the AP. “Best case scenario: They’re still doing a lot behind the scenes. Worst case scenario: They pull back, and we don’t know how that’s going to manifest itself for the midterms on the platforms.”

Change the Terms first called on the platforms to reduce online hate-speech and disinformation following the deadly 2017 neo-Nazi march in Charlottesville, Virginia; since then, the coalition notes, “some technology companies and social-media platforms remain hotbeds” of such activity, offering the January 6 Capitol insurrection as a prime example. The coalition tried to keep up the pressure on the platforms throughout the past six months to “avoid what is the pitfall that inevitably has happened every election cycle, of their stringing together their efforts late in the game and without the awareness that both hate and disinformation are constants on their platforms,” Nora Benavidez, director of digital justice at Free Press, told the Post.

As Nix notes, the coalition’s pressure on the social-media platforms was fueled in part by revelations from Frances Haugen, the former member of Facebook’s integrity team who leaked thousands of internal documents last year. Haugen testified before Congress that, shortly after the 2020 election, the company had rolled back many of the election-integrity measures that were designed to stamp out misinformation. An investigation by the Post and ProPublica last year showed that a number of Facebook groups became hotbeds of misinformation about the allegedly fraudulent election in the days and weeks leading up to the attack on the Capitol. Efforts to police such content, the investigation found, “were ineffective and started too late to quell the surge of angry, hateful misinformation coursing through Facebook groups—some of it explicitly calling for violent confrontation with government officials.” (A spokesman for Meta said in a statement to the Post and ProPublica that “the notion that the January 6 insurrection would not have happened but for Facebook is absurd.”)

A recent report showed that misinformation about the election helped create an entire ecosystem of disinformation-peddling social accounts whose growth the platforms seem to have done little to stop. In May, the Post wrote about how Joe Kent, a Republican congressional candidate, had claimed “rampant voter fraud” in the 2020 election in an ad on Facebook. The ad was reportedly just one of several similar ads that went undetected by internal systems.

YouTube told the Post recently that the company “continuously” enforces its policies, and had removed “a number of videos related to the midterms.” TikTok said it supports the Change the Terms coalition because “we share goals of protecting election integrity and combating misinformation.” Facebook declined to comment, and referred to an August news release listing the ways the company said it planned to promote accurate information about the midterms. Twitter said it would be “vigilantly enforcing” its content policies. Earlier this year, however, Twitter said it had stopped taking steps to limit misinformation about the 2020 election. Elizabeth Busby, a spokesperson, told CNN at the time that the company hadn’t been enforcing its integrity policy related to the election since March 2021. Busby said the policy was designed to be used “during the duration” of an election, and since the 2020 election was over, it was no longer necessary.

More on the platforms:

  • Whiffing it: In Protocol’s Policy newsletter, Ben Brody writes that the election misinformation problem is about more than just the US. “Take Brazil,” he says. “President Jair Bolsonaro appears to be poised to lose his reelection bid, which he kicked off by preemptively questioning the integrity of the country’s vote.” Facebook has already missed a lot of misinformation in Brazil, critics say. In addition, Brody notes, there are potentially contentious elections elsewhere, including in nations “with civic turmoil or tenuous freedom, such as Turkey, Pakistan, and Myanmar. If we want to fix this, we need to acknowledge the problem is bigger than Big Tech whiffing it on content moderation, especially in the U.S.”
  • The time of Nick: Nick Clegg, president of global affairs at Meta, said he will be the one to decide whether to reinstate former president Donald Trump’s account in January of next year, according to Politico. Trump was banned from Facebook for two years in the wake of the January 6 attack on the Capitol. At an event in Washington put on by Semafor, the news startup from former Times media reporter Ben Smith, Clegg said whether to extend Trump’s suspension is “a decision I oversee and I drive,” although he said he would consult with Mark Zuckerberg, Meta’s CEO. “We’ll talk to the experts, we’ll talk to third parties, we will try to assess what we think the implications will be,” Clegg said.
  • Predator and prey: More than 70 lawsuits have been filed this year against Meta, Snap, TikTok, and Google claiming that adolescents and young adults have suffered anxiety, depression, eating disorders, and sleeplessness as a result of their addiction to social media, Bloomberg reports. In at least seven cases, the plaintiffs are the parents of children who’ve died by suicide. Bloomberg said the cases were likely spurred in part by testimony from Facebook whistleblower Haugen, who said the company knowingly preyed on vulnerable young people to boost profits, and shared an internal study that found some adolescent girls using Instagram suffered from body-image issues.