The experiments, conducted in the months surrounding the 2020 election and with Meta’s permission, showed that changes to the algorithm within that time period did little to alter people’s beliefs. The studies looked at what happens when “reshared” posts are hidden from view; when Instagram and Facebook feeds are displayed in reverse chronological order rather than curated by the algorithm; and when material from “like-minded” sources is reduced by one-third. Attitudes, it turned out, didn’t change much: The anti-immigrant user largely stayed anti-immigrant, while crusaders for covid-19 restrictions lost none of their gusto. The likelihood that users would vote didn’t change, either.
None of this, however, means social media doesn’t matter in the country’s politics. The experimental tweaks did change, considerably, the content to which users were exposed. And one study discovered that, under the standard setup, Facebook looks dramatically different to liberals and conservatives. Liberals see a mix of sources on the left and right, whereas the news that conservatives see trends ideologically homogeneous and is more likely to be rated false by the platform’s third-party fact-checkers. Conservatives are, as one researcher put it, “far more siloed in their news sources,” a reality that is “driven in part by algorithmic processes.”
Also, in the experiment in which the researchers reduced material from sources that shared users’ views, the scientists found that those users engaged more with the “like-minded” content that remained.
Together, these findings suggest a possible conclusion less favorable to Meta: By the time the data for these studies was collected, after months of election-related conversation and years of online habits building and communities coalescing, it was already too late. People’s political beliefs might already have been baked in, with social media sites among the many actors doing the baking. There’s also no discounting the way living in the world of the like, retweet and viral video has shaped the behavior of politicians and plebeians alike.
The notion that Meta and its peers are solely responsible for national polarization and division has always been a straw man. But the idea that these platforms have nothing to do with our divisions is just as dubious.
What the new studies show is that altering or altogether scrapping today’s algorithms won’t magically fix the country’s problems. The 12 studies still to come from this collaboration might offer some more insight into what will help. But so will continued research — conducted as platforms, ideally, make continued changes. Meta should open its inner workings even wider to academic insight; other platforms, including those such as Twitter and Reddit that have recently restricted research access, should join in.
Researchers haven’t yet seen, for instance, the consequences of disfavoring posts that cause outrage; they haven’t seen the outcomes produced by “bridging systems,” algorithms that boost posts that appeal to diverse audiences. They have more to learn about the impact of third-party fact-checks, community notes on Twitter (now X) and other interventions designed to fight misinformation. And they haven’t seen what occurs when a site gives users themselves more control over what they see — as Meta plans to do with its microblogging app, Threads.
This week’s studies aren’t an excuse for platforms to stop encouraging healthier atmospheres. They’re a reason to keep going, so academics can keep studying. Maybe, with enough effort over enough time, something will actually change.