
How Facebook Decides What Content to Remove

Facebook on Tuesday released, for the first time, the internal guidelines it uses to moderate content on the platform. Monika Bickert, Facebook’s vice president of global policy management, explained the decision to publish them in a press release: “First, the guidelines will help people understand where we draw the line on nuanced issues.” In the section addressing graphic violence, for example, the guidelines cite images of cannibalism and “visible internal organs” as particular markers of objectionable content. Users have long complained that Facebook’s enforcement policies are inconsistent and opaque, and moderators have had to reverse controversial removal decisions after public uproar. In fact, the House Judiciary Committee will hold a hearing on Thursday on “social media filtering and policing practices,” which will in part examine whether particular viewpoints face censorship on digital platforms.

Slate
