Bad news for the somewhat distant relatives in your life who seem to spend all day sharing, shall we say,
suspect political articles and memes on Facebook — the company is
rolling out a new algorithm that will reduce political content in news feeds.
- Data released by Facebook last year showed that during one week in October, 7 of the 10 most-engaged pages on the site were political.
- The change will be tested on a small fraction of users in Canada, Brazil, and Indonesia starting this week.
How it will work: A machine learning model will predict whether a post is political and surface these posts less frequently.
- The source of the post will apparently make no difference, aside from official government agencies, which will be exempt.
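Facebook hasn't published details of its model, but the mechanics described above (classify a post as political, then demote it in ranking, with government pages exempt) can be illustrated with a toy sketch. Everything here is invented for illustration: the keyword weights, the `political_score` heuristic, and the penalty factor are placeholders, not anything Facebook has disclosed.

```python
# Toy illustration of "predict whether a post is political, surface it less".
# All weights, thresholds, and field names are hypothetical.

POLITICAL_KEYWORDS = {"election": 0.9, "senator": 0.8, "vote": 0.7, "policy": 0.6}

def political_score(text: str) -> float:
    """Crude stand-in for an ML classifier: returns a score in [0, 1]
    based on the strongest political keyword present."""
    words = text.lower().split()
    return max((POLITICAL_KEYWORDS.get(w, 0.0) for w in words), default=0.0)

def rank_feed(posts, demote_threshold=0.5, penalty=0.5):
    """Rank posts by engagement, multiplying predicted-political posts by a
    penalty so they appear lower in the feed. Government pages are exempt,
    per the announced policy."""
    def adjusted(post):
        if post.get("official_government"):
            return post["engagement"]  # exemption: no demotion applied
        if political_score(post["text"]) >= demote_threshold:
            return post["engagement"] * penalty
        return post["engagement"]
    return sorted(posts, key=adjusted, reverse=True)

feed = [
    {"text": "Cute cat compilation", "engagement": 0.8},
    {"text": "Senator pushes new vote on policy", "engagement": 0.9},
    {"text": "Official election reminder", "engagement": 0.7,
     "official_government": True},
]
for post in rank_feed(feed):
    print(post["text"])
```

In this sketch the political post starts with the highest raw engagement (0.9) but is demoted below both the cat video and the exempt government post, which is the downranking behavior the announcement describes.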
Why they're doing it: Facebook has long been criticized for allegedly increasing political polarization. That criticism intensified after the Capitol riot, with lawmakers blaming social media platforms, including Facebook, for giving participants a place to plan the attack.
Facebook also says its users have reported disliking the amount of angry political content and arguing in their feeds.
The bottom line: Facebook says political content makes up a small share of what appears in people's newsfeeds, but the question remains whether that content drives significant engagement with the product. If tests show that reducing political content also reduces the time users spend on the app, we may see Facebook take a different approach.