Here’s What Will and What Won’t Get Your Post Taken Down on Facebook

The social media behemoth has published its internal guidelines, with 27 pages outlining the company's stance on various issues.

Since the Cambridge Analytica data scandal, Facebook has faced its most tumultuous period in history. From losing $70 billion in value over 10 days, to CEO Mark Zuckerberg testifying before Congress, to the social media behemoth's effort to banish hate groups from its platform, the company has been quite busy, to say the least. Today, for the first time in Facebook's history, the company has published its internal guidelines on violence, hate speech, and nudity for the public.

According to The Daily Dot, the 27-page document is essentially a blueprint for the company's moderators, making it easier to identify and remove posts containing violence, harassment, pornography, or abuse. Additionally, Facebook is adding a feature that lets users appeal a post's removal if the new strictures were applied incorrectly, with a 24-hour turnaround for reconsideration.

Facebook has offered moderator guidelines to the public before, but never in such detail or at such length. Adding to the new level of clarity for users are guidelines suggesting how to respond to threats, bullying, nudity, or fake news found on the platform, with the rules being translated into 40 languages.

“The vast majority of people who come to Facebook come for very good reasons,” said Monika Bickert, vice president of global product management at Facebook. “But we know there will always be people who will try to post abusive content or engage in abusive behavior. This is our way of saying these things are not tolerated. Report them to us, and we’ll remove them.”

The guidelines are categorized into six distinct classifications: violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property, and content-related requests. Each classification contains subcategories—“hate speech” and “graphic violence” are under the “objectionable content” umbrella, for example. While this certainly seems like a step toward cleaning up the platform’s content and an effort to maintain integrity, the decision to urge users to report on each other seems like an ominous approach that not everyone will appreciate. 

As for the minutiae of said objectionable content, there are some caveats. While the company doesn’t allow content depicting animal or human abuse with captions that endorse it, Facebook does allow photos of “throat-slitting” or “victims of cannibalism” as long as the post carries a warning screen and an age restriction. Surely, this won’t go far enough for some users, while veering too far into censorship for others. It’s a fine line being toed here, with no way to satisfy everyone in sight.

The company plans to add 10,000 employees to its safety, security, and product and community operations teams before the year is up, with weekly audits of moderation decisions intended to refine both the guidelines and how they are applied. While Bickert said she feared terrorists and hate groups would simply adapt to the new guidelines and sidestep any potential censorship or removal, she is hopeful that the efforts being undertaken will lead to net-positive results.

“I think we’re going to learn from that feedback,” she said. “This is not a self-congratulatory exercise. This is an exercise of saying, here’s where we draw the lines, and we understand that people in the world may see these issues differently. And we want to hear about that, so we can build that into our process.”

Hopefully, Facebook will absorb what is useful, reject what is useless, and establish itself as a flawed but constantly improving place for people to share their views.

Read the updated internal guidelines and community standards here.
