The Most Damning Facebook Story Yet

From The Poynter Report with Tom Jones: “The most damning Facebook story yet”:

The stream of unsettling stories about Facebook based on internal documents obtained by news organizations continues its steady flow…. There’s just so much that it’s hard to know where to start, and there is a concern that the impact will be lost just by the sheer volume of it all.

Then comes a story such as the one by Jeremy B. Merrill and Will Oremus of The Washington Post. It produced what I think is the most jolting and infuriating paragraph I’ve read so far in the dozens of articles based on The Facebook Papers.

Five years ago, Facebook gave users different ways to react to something posted on its site. You’re familiar with them — things such as the classic thumbs-up “like,” as well as “love,” “haha,” “wow,” “sad” and “angry.”

Now for the jaw-dropping paragraph from The Post:

Behind the scenes, Facebook programmed the algorithm that decides what people see in their news feeds to use the reaction emoji as signals to push more emotional and provocative content — including content likely to make them angry. Starting in 2017, Facebook’s ranking algorithm treated emoji reactions as five times more valuable than “likes,” internal documents reveal. The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook’s business.
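To make the weighting concrete, here is a minimal toy sketch of the scoring rule the Post describes; the function name, weights as literal constants, and structure are illustrative assumptions, not Facebook’s actual code:

```python
# Toy illustration of the ranking rule the Post describes:
# each emoji reaction ("love", "haha", "wow", "sad", "angry")
# counts five times as much as a plain "like" toward engagement.
# Names and structure here are hypothetical.

LIKE_WEIGHT = 1
EMOJI_REACTION_WEIGHT = 5

def engagement_score(likes: int, emoji_reactions: int) -> int:
    """Score a post: emoji reactions weigh 5x a plain like."""
    return likes * LIKE_WEIGHT + emoji_reactions * EMOJI_REACTION_WEIGHT

# Under this rule, a post with 100 likes and 40 angry reactions
# outranks one with 250 likes and no emoji reactions:
print(engagement_score(100, 40))  # 300
print(engagement_score(250, 0))   # 250
```

This shows why the choice matters: a modest number of strong reactions, including anger, can outweigh a much larger count of ordinary likes in determining what the feed promotes.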

Internal documents showed that some within Facebook realized the hornet’s nest that could be whacked. One staffer said that favoring these controversial posts could open “the door to more spam/abuse/clickbait inadvertently.” A colleague responded, “It’s possible.”

The Post wrote, “The warning proved prescient. The company’s data scientists confirmed in 2019 that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity and low-quality news. That means Facebook for three years systematically amped up some of the worst of its platform, making it more prominent in users’ feeds and spreading it to a much wider audience.”…

It’s the backbone of the reason former Facebook employee Frances Haugen blew the whistle. Haugen testified before British lawmakers this week that “Anger and hate is the easiest way to grow on Facebook.” She went on to say, “Bad actors have an incentive to play the algorithm. The current system is biased towards bad actors, and people who push people to the extremes.”

Oremus, one of the reporters on the Post story, tweeted Tuesday, “To me this is not a story of Facebook intentionally fanning anger for profit. It’s a story of how arbitrary initial decisions, set by humans for business reasons, become reified as the status quo, even as evidence mounts that they’re fueling harms.”

He added, “The initial choice to weight reaction emojis as 5x likes was aggressive, but not indefensible: They signaled greater engagement. Yet that arbitrary choice became part of the firmament; the burden shifted to integrity staff to prove it could be changed without denting engagement. Anger isn’t bad *per se.* When attached to real harm, it can deter bad behavior, drive change. But anger also isn’t good *per se.* When systemically incentivized on a network that lacks publication controls or fact checks, it rewards misinfo, hate, & rage-bait.”