Banned on the Internet: How Facebook Monitors and Enforces Appropriate Content Standards

The business of online gatekeeping is not what you think.

Written by Michael Thomsen (@mike_thomsen)

 

There are no standards on the Internet, but the illusion must be maintained. For corporations that host their own communities, the task of defining and enforcing standards is a workaday job, done with few and mostly unclear rules at a pace that guarantees improper enforcement. In an investigation into Internet bullying, Emily Bazelon writes about a Facebook page, Let's Start Drama, created by an anonymous student at Woodrow Wilson Middle School in Connecticut. The page became a trading post for rumors about who had been exchanging sexts and who had lost their virginity, and it ran polls about which student was prettier.

Let's Start Drama drew more than 500 followers from a school of 750 and was soon discovered by the Middletown Youth Services Bureau, which filed several complaints with Facebook. Facebook didn't respond to any of the Bureau's reports, and the page remained live for months. Bazelon visited Facebook's harassment team, which is responsible for reviewing reports like the ones sent about Let's Start Drama, and discovered that its primary concern is not the protection of Facebook's users but the speed with which reports can be evaluated.

 

The need for standards on the Internet is a desire to fight against the wild and terrible phenomena of our culture in a medium that merely documents phenomena.

 

"We optimize for half a second," Nick Sullivan, one Facebook's monitors, told Bazelon. "Your average decision time is a second or two, so 30 seconds would be a really long time." In the case of Let's Start Drama, it turns out the page had been deemed okay and two separate evaluators had indicated that future reports about the page should be ignored. Facebook is aware of the importance of appearing to be responsible for social standards, yet the company is more interested in there being an efficient structure for enforcement more than it is ensuring its standards make sense. Alexis Madrigal cited an acknowledgement of this weakness in the company's annual report, where Facebook admitted it "may fail to provide adequate customer service, which could erode confidence in our brand. Our brand may also be negatively affected by the actions of users that are deemed to be hostile or inappropriate to other users, or by users acting under false or inauthentic identities."

There is always some transference of responsibility when new technologies make old social maladies visible again. While Facebook has the power to amplify our worst tendencies, it is also a trap to think we can address the impulse to bully and antagonize by creating a set of enforceable standards that merely make antagonistic behavior less visible. The process of monitoring what people will do when presented with an open-seeming platform to publish their thoughts, curiosities, or beliefs is inextricably bound to the fact that many of our thoughts, curiosities, and beliefs are horrific.
