For three months in late 2020, nearly 7,200 US adults on Facebook and 8,800 on Instagram received a radically different experience from that of the services’ billions of other users. When they scrolled through their newsfeeds, Facebook and Instagram showed them the newest posts as determined by the clock, not those judged most relevant by an algorithm. The response was clear: Users served chronological feeds got bored more quickly and were much more likely to decamp to rivals such as YouTube and TikTok.
That result emerged from a multimillion-dollar, Meta-backed science project designed to study how Facebook and Instagram affected people’s political attitudes during the 2020 US presidential election campaign. The experiment’s main purpose was to add empirical data to the ongoing debate about the role of Facebook and other social media in shaping political choices or even partisan-fueled violence. But the ancillary results showing that users were put off by chronological feeds are perhaps more interesting.
Instagram ditched its chronological feed in 2016 over vocal user objections, but it reintroduced a chronological option last year, as did Facebook. Some users prefer a chronological option to keep up with live events, and some lawmakers have promoted it as an antidote to opaque ranking algorithms that can seal people into information bubbles or drive them toward harmful content.
Yet the new data add to at least two internal Meta studies from the past decade, revealed in leaks, that found displaying posts chronologically caused people to log off. The new results also suggest why, despite regulatory and political pressure, Meta has made it difficult to access alternatives to its standard, algorithm-dominated feeds.
Repellent Option
The new data on Meta users’ chronophobia come the same week that Instagram added a reverse-chronological feed option to its new Twitter clone, Threads. That update may appease some Twitter exiles and live-news addicts who have been loudly demanding it, but Meta will surely be watching closely for signs of disengagement.
“If you think about it, the ranked feed is largely optimized for the consumption and engagement of the viewer—how much time they are spending and interacting,” says Dean Eckles, a social scientist and statistician at MIT who has worked for Meta and testified to US senators about feed design. Companies such as Meta and Twitter train their ranking systems to promote content similar to what users have clicked on, dwelled on, liked, or commented on in the past. Because that approach has proved very effective at holding attention, Eckles says, “any intervention is going to reduce engagement.”
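The distinction Eckles draws can be made concrete with a small sketch. The Python code below is not Meta’s system; the Post fields, the engagement weights, and the function names are illustrative assumptions. It simply contrasts a feed ordered by a model’s predicted engagement with one ordered by the clock.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    # Hypothetical predictions a trained ranking model might output
    # for this viewer, based on what they clicked, liked, or dwelled on before.
    p_click: float
    p_like: float
    p_comment: float

def engagement_score(post: Post) -> float:
    # Weighted blend of predicted interactions; the weights are illustrative only.
    return 1.0 * post.p_click + 2.0 * post.p_like + 4.0 * post.p_comment

def ranked_feed(posts: list[Post]) -> list[Post]:
    # Engagement-optimized ordering: posts with the highest predicted interaction first.
    return sorted(posts, key=engagement_score, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # Reverse-chronological ordering: newest posts first, no model involved.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```

In the experiment, the control group saw the first kind of ordering and the treatment group saw the second, which is what made the difference in engagement measurable.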
A spokesperson for Instagram did not respond to a request for comment. Facebook spokesperson Corey Chambliss says the company continually makes changes and improvements to its services.
Meta’s big 2020 election project included 17 separate studies, four of which were written up in peer-reviewed research papers published today. The new data on chronological feeds came from one of them, a study of feed effects published in the journal Science.
The randomized, controlled study was designed to determine whether the machine-learning technology that curates and personalizes a user’s feed affects their political attitudes. Because the algorithms of Facebook and other platforms tend to serve content similar to what a person has engaged with in the past, it’s reasonable to assume they can feed users like-minded content that deepens their existing political convictions. The algorithms can also surface news and events that encourage people to become more politically active.
Mixed Results
A previous study, published in 2021, evaluated Twitter’s feed-ranking algorithm and found that it delivered fewer tweets with links to external websites than a chronological feed did, but that those it did show were more likely to point to “junk news”: biased sources that could harden users’ existing political views.
But in the new Meta study, the results came out OK for the platform. Though the thousands of users served the reverse-chronological feed from September to December 2020 encountered more political and untrustworthy content on Facebook and Instagram than users with the standard feed, the change did not significantly affect those users’ political knowledge, attitudes, or behaviors, such as their likelihood of attending a protest or casting a ballot.
“Our findings rule out even modest effects, tempering expectations that social media feed-ranking algorithms directly cause affective or issue polarization in individuals,” concluded the 29 researchers from Meta and 19 universities in the US and Europe. They cautioned that a bigger shift, like switching all users to a chronological feed or extending the study for longer, could have produced different results. “This study just provides part of the picture,” says Eckles, the MIT researcher, who was not involved in the new study.