Facebook Execs Reportedly Gave Up on Solutions to Decrease Partisan Divide on Platform

According to a report published by 'The Wall Street Journal,' Facebook executives nixed or lost interest in proposed solutions to tamp down the partisan divide on the platform.

Image via Getty/picture alliance (pictured: Mark Zuckerberg)

According to a report published on Tuesday by The Wall Street Journal, top executives at Facebook either weakened or completely threw out solutions that had been proposed after an internal investigation found evidence that the social media juggernaut had been intensifying divisiveness and polarization among its users.

This issue (which you probably already suspected existed even without a formal investigation, though the investigation certainly helps buttress the claim) surfaced through a research project launched in 2017 under the company's former Chief Product Officer, Chris Cox. The study, which came about in response to the Cambridge Analytica scandal, was intended to determine how Facebook's algorithms were amplifying content that can be both divisive and dangerous. That's pretty ironic, of course, coming from a company whose stated mission is to bring people together.

Facebook employees reportedly found that while some groups united people from varying backgrounds, others promoted division through conflict and inaccurate info.

An internal 2018 document produced after the company made these findings is reported to have backed Mark Zuckerberg's view that Facebook should be about "free expression," and the company ultimately decided that it wouldn't "build products that attempt to change people’s beliefs."

Like all tech companies, Facebook has been hit with a torrent of criticism over conspiracy theories, misinformation, and echo chamber garbage being housed and promoted on its platform. That criticism has only intensified in 2020 thanks to both the COVID-19 pandemic and the fact that this is a presidential election year.

The Wall Street Journal's report goes on to state that a 2016 presentation informed executives at Facebook that "64 percent of all extremist group joins are due to our recommendation tools," and that a majority of those joins were pushed by Facebook's algorithms for 'Groups You Should Join' and 'Discover.'

“Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide from a presentation on the company's own research. “If left unchecked,” the presentation warned, the platform would serve users “more and more divisive content in an effort to gain user attention & increase time on the platform.”

Some of the proposed solutions are said to have included limiting the influence of groups' most partisan users, suggesting a wider collection of groups that people might not otherwise come across, and creating subgroups where members could have their more heated arguments without derailing the entire group.

The WSJ went on to say that these proposals were either dismissed or greatly watered down at the direction of Zuckerberg and Joel Kaplan, Facebook's policy chief. Eventually, Zuckerberg is said to have lost interest in trying to deal with the problem.

You can read the entire report here.
