The Facebook Papers: 7 Key Takeaways From Release of Whistleblower Documents

Mark Zuckerberg and company have been on the defensive in recent weeks amid the onslaught of news coverage surrounding whistleblower Frances Haugen.

Image via Getty/Sven Hoppe/picture alliance

In recent days, a slew of potentially damaging Facebook insights have been published by a variety of outlets, all under the title of “the Facebook Papers.”

In short, these “Facebook Papers”—now at the center of an investigative project including extensive reporting from at least 17 U.S. outlets—mark a more thorough look into the internal documents and related criticism that first made headlines earlier this year in connection with whistleblower Frances Haugen.

Speaking with 60 Minutes for an interview shared in early October, Haugen—a data scientist who later testified before Congress—highlighted the company’s “conflicts of interest” and warned that the current version of Facebook is “tearing our societies apart.” Facebook CEO Mark Zuckerberg, in response, shared a lengthy post to Facebook in which he argued that news coverage of the testimony and surrounding issues “misrepresents our work and our motives.”

Fast forward a few weeks, and the general public—through the release of coordinated reports on redacted documents—has a better opportunity to sift through the findings surrounding the oft-criticized social media behemoth. Below, we’ve rounded up a few highlights from the “Facebook Papers” project. 

Complex also reached out to a Facebook spokesperson, who offered the following on Monday:


“At the heart of these stories is a premise which is false. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie. The truth is we’ve invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook.”

The mechanics of the Facebook platform, as conceded in an August 2019 internal memo that’s mentioned in a New York Times report, “are not neutral.” Put another way, researchers were noting that the basic functionality of the platform was itself allowing the spread of misinformation and hate speech.

Here’s more, straight from the Times-cited memo:


“If integrity takes a hands-off stance for these problems, whether for technical (precision) or philosophical reasons, then the net result is that Facebook, taken as a whole, will be actively (if not necessarily consciously) promoting these types of activities. The mechanics of our platform are not neutral.”

In preparation for the 2020 presidential election, which Donald Trump lost to former Vice President Joe Biden, Facebook put in place so-called “break the glass” measures and other feature limitations. Still, as detailed here by NPR, the site wasn’t successful in fully halting the spread of Stop the Steal and Stop the Steal-adjacent content.

Notably, and as touched on in earlier coverage of the Haugen-provided documents focused on the election, similarly aligned groups were able to spread on the platform despite an eventual halt on the main Stop the Steal page.

The company has responded to the Capitol riot-focused aspect of the Facebook Papers rollout. Read more from Guy Rosen, Facebook’s VP of Integrity, here.

In SEC complaints, Haugen—per the Washington Post—makes multiple mentions of Zuckerberg’s public comments while remarking on his level of influence over the company. 

Haugen points to several examples of this, with particular attention since placed on statistics about hate speech.

Zuckerberg testified in 2020 that Facebook removes 94 percent of identified hate speech. But internal documents, per the Post, show that researchers had actually estimated that Facebook was removing “less than 5 percent” of hate speech on the platform.

While Haugen has received the bulk of media attention, and—as a whistleblower—understandably so, multiple documents included in the “Facebook Papers” project show that other Facebook employees had expressed frustration with the site’s impact on real-world events and how those events were being handled by leadership.

Per CNN, some shared critical responses to execs’ comments on the fatal Capitol riot. One employee, for example, wondered why the company hadn’t yet figured out how to exist without “enabling violence” through its management of discourse.

“We’ve been fueling this fire for a long time,” that employee said.

Another employee, as spotted by The Atlantic, pointed to the larger historical importance of the moment. Namely, they argued that future historical dissections of the events of Jan. 6 wouldn’t (and shouldn’t) include flattering depictions of the social media company.

“History will not judge us kindly,” that employee said.

One of the most challenging issues facing the U.S. and beyond during the pandemic era has been the prevalence of misinformation, particularly with regards to vaccines (which are safe and effective and easy to get).

In a pandemic-focused deep dive from the Associated Press, examples of Facebook researchers proposing quick fixes for the brewing misinformation problem are mentioned, including an instance in which employees suggested “altering how posts about vaccines are ranked” in newsfeeds. Doing so, they said, could help slow the spread of misleading information and instead promote legitimate info from trusted sources.

Some suggestions from that study, however, were “shelved,” while others weren’t put in place until April. Another researcher suggestion, this time focused on temporarily disabling comments on vaccine-related posts, “was ignored.”

Per the Associated Press, Apple threatened to remove Facebook (as well as the Facebook-owned Instagram) from the App Store in connection with “concerns” that the social media platform was being utilized to “trade and sell maids in the Mideast.”

Facebook, a rep for which told the AP in a recent statement that it prohibits human exploitation “in no uncertain terms,” ultimately vowed to crack down on such content, averting any App Store removal. Analysis documents cited in the latest reports, meanwhile, are said to show that Facebook was aware of a “domestic servitude” problem as far back as 2018.

While aspects of the Apple threat made headlines earlier this year, the Facebook Papers project provides even more insight into the situation.

Not only has Facebook “been selective” in its curbing of hate speech and misinformation in India, as the Associated Press reported, but the company has also been criticized for failing to match its expansion efforts with appropriate protections in regions outside its home country.

A 2020 summary document, as mentioned in another Washington Post report, noted that 84 percent of Facebook’s “global remit/language coverage” budget was reserved for the U.S. For the rest of the world, that allocation was set at 16 percent. 

Meanwhile, in an Oct. 23-dated newsroom release, Facebook said it takes a “comprehensive approach in countries that are experiencing or at risk for conflict or violence.”
