May 5, 2024
Opinion | One weird trick to solve political polarization? Nope.

New research on Meta, the parent company of Facebook, seemed to absolve social media sites of responsibility for the nation’s fractured politics. In fact, the new research offers only one sure conclusion: There is no “one weird trick” to fix political polarization.

Three papers in Science and one in Nature by independent academics arrived this week with conflicting headlines, depending on who was writing them. Meta found the results exonerating — evidence that the behind-the-scenes machinations of its algorithm, which is responsible for what users see, don’t meaningfully divide the electorate. The scientists who wrote the reports, meanwhile, say it’s complicated. The scientists are right.

The experiments, conducted in the months surrounding the 2020 election and with Meta’s permission, showed that changes to the algorithm within that time period did little to alter people’s beliefs. The studies looked at what happens when “reshared” posts are hidden from view; when Instagram and Facebook feeds are displayed in reverse chronological order rather than curated by the algorithm; and when material from “like-minded” sources is reduced by one-third. Attitudes, it turned out, didn’t change much: The anti-immigrant user largely stayed anti-immigrant, while crusaders for covid-19 restrictions lost none of their gusto. The likelihood that users would vote didn’t change, either.

None of this, however, means social media doesn’t matter in the country’s politics. The experimental tweaks did change, considerably, the content to which users were exposed. And one study discovered that, under the standard setup, Facebook looks dramatically different to liberals and conservatives. Liberals see a mix of sources from the left and right, whereas the news conservatives see trends ideologically homogeneous and is more likely to be rated false by the platform’s third-party fact-checkers. Conservatives are, as one researcher put it, “far more siloed in their news sources,” a reality that is “driven in part by algorithmic processes.”

Also, in the experiment that reduced material from sources sharing users’ views, the researchers found that users engaged more with the “like-minded” content that remained.

Together, these findings suggest a possible conclusion less favorable to Meta: By the time the data for these studies was collected, after months of election-related conversation and years of online habits forming and communities coalescing, it was already too late. People’s political beliefs might already have been baked in, with social media sites among the many actors doing the baking. There’s also no discounting the way living in the world of the like, retweet and viral video has shaped the behavior of politicians and plebeians alike.

The notion that Meta and its peers are solely responsible for national polarization and division has always been a straw man. But the idea that these platforms have nothing to do with our divisions is just as dubious.

What the new studies show is that altering or altogether scrapping today’s algorithms won’t magically fix the country’s problems. The 12 studies still to come from this collaboration might offer more insight into what will help. But so will continued research, conducted, ideally, as platforms keep making changes. Meta should open its inner workings even wider to academic scrutiny; other platforms, including Twitter and Reddit, which have recently restricted research access, should join in.

Researchers haven’t yet seen, for instance, the consequences of disfavoring posts that cause outrage; they haven’t seen the outcomes produced by “bridging systems,” algorithms that boost posts that appeal to diverse audiences. They have more to learn about the impact of third-party fact-checks, community notes on Twitter (now X) and other interventions designed to fight misinformation. And they haven’t seen what occurs when a site gives users themselves more control over what they see — as Meta plans to do with its microblogging app, Threads.

This week’s studies aren’t an excuse for platforms to stop working toward healthier online environments. They’re a reason to keep going, so academics can keep studying. Maybe, with enough effort over enough time, something will actually change.

