How Facebook exploited our cognitive flaws and biases — for profit

The public has recently gained insight into Facebook's business practices. For example, Facebook's own researchers determined that a fake user tagged as "Conservative, Christian, Trump-supporting" would be hit with a deluge of conspiratorial and racist propaganda within days of joining the platform. Facebook is still not fully transparent about its algorithms, but here is what we do know: the company designed algorithms that play upon a web of human cognitive biases and social dynamics to maximize engagement and derive profit. Before a user views any given piece of information, whether a news report or a post from another person, that information is filtered to maximize the user's engagement. Research has also shown that once a Facebook member joins one extremist group, such as flat-earthers, Facebook will recommend interconnected groups, such as those devoted to anti-vaccine claims or chemtrails.

Raw Story
