How Facebook Can Prove It Doesn’t Discriminate Against Conservatives
During last week’s congressional hearings investigating Cambridge Analytica’s access to data from 87 million Facebook users, several Republican lawmakers worried that companies like Facebook might have a “liberal bias” in the way they enforce their rules and moderate content posted by users. But the deeper claims of censorship were somewhat misleading: research by ThinkProgress suggests that video content from across the political spectrum was made less visible after recent changes to Facebook’s algorithms.

Facebook, like many other social media platforms, faces increasing scrutiny over potential bias in how it decides what people are allowed to post online and what content becomes visible. Moderators check content against a version of the company’s highly detailed internal policies, which are not made available to the public, specifying, for example, what phrasing targeting which groups would constitute a violation.