
Deep dive into Meta’s algorithms shows that America’s political polarization has no easy fix

WASHINGTON — The powerful algorithms used by Facebook and Instagram to deliver content to users have increasingly been blamed for amplifying misinformation and political polarization.

In response to the research papers, Meta’s president for global affairs, Nick Clegg, said the findings showed “there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization or have any meaningful impact on key political attitudes, beliefs or behaviors.”

Katie Harbath, Facebook’s former director of public policy, said the studies showed the need for more research on social media and challenged assumptions about the role social media plays in American democracy. “To me, it reinforces that when it comes to polarization, or people’s political beliefs, there’s a lot more that goes into this than social media,” she said.

One organization that has been critical of Meta’s role in spreading misinformation about elections and voting called the research “limited” and noted that it was only a snapshot taken in the midst of an election and did not account for the effects of years of social media misinformation. Free Press, a nonprofit that advocates for civil rights in tech and media, called Meta’s use of the research “calculated spin.”

“Meta execs are seizing on limited research as evidence that they shouldn’t share blame for increasing political polarization and violence,” Nora Benavidez, the group’s senior counsel and director of digital justice and civil rights, said in a statement. “Studies that Meta endorses, which look piecemeal at narrow time periods, shouldn’t serve as excuses for allowing lies to spread.”

The four studies also revealed the extent of the ideological differences among Facebook users and the different ways that conservatives and liberals use the platform to get news and information about politics.
