When Eli Pariser coined the term "filter bubble" a decade ago, he defined it narrowly: a situation in which algorithms skew the variety of information we get online in favor of stuff we like. At the time, he worried that this might lead to political polarization through less exposure to divergent viewpoints.
Ten years later, America is in the aftermath of a hyperpartisan presidential election in which people not only disagree with those on the other side, they actively hate them. Different parties are operating in what seem like different realities, with different sets of facts, or at least completely different reactions to those facts. Social media seems to be making a bad situation worse.
A divided America was on full display as record numbers of voters turned out, in part to vote their own guy in but, perhaps more trenchantly, to keep the other guy out. Biden won by a close 4.6 million votes (so far) in an election with an expected turnout of 159 million voters. Some of the political lessons we learned recently: conspiracy theories have real currency, fake news can spread faster than real facts on social media, and if we don't agree on a shared reality, more mundane things like political compromise will remain out of reach.
Filter bubbles are undeniably part of the problem, but their causes, consequences, and solutions seem less clear than they used to be. Part of the issue is that it's often difficult to tell which comes first: a polarized situation or the social media that aggravates that situation. Instead, it's become a self-reinforcing system.
"The inputs are also the outputs," Pariser told Recode recently, describing how our differences are magnified online. "Where I live and who my friends are and what media I consume all shape what I see, which then shapes decisions I make about what media I consume and where to live and who to be friends with."
And by a number of accounts, those inputs are getting more extreme.
The inputs
We're living in a time of heightened partisanship, and not in the "we have differences of opinion" kind of way. And those divides have a way of compounding through social media.
Animosity toward members of opposing parties is very high even though our divisions over policy preferences don't appear to have grown, according to new research published in Science magazine. The paper brings together a number of different studies on the topic and was written by scholars from six disciplines, who found that, these days, we're more likely to hate the opposing side and consider its members different, dislikable, and immoral. The result is a "political sectarianism" in which one's party identity seems to come first, before policy, religion, or common ground. Political identity, in turn, shapes our other views instead of the other way around. For example, after seeing a clip of Donald Trump espousing a liberal policy, his followers exhibited a more liberal attitude, according to the paper, which presumes Democrats would do the same for their political leaders.
The results of this kind of alignment are disastrous for a functioning democracy. As the researchers argue, holding opposing partisans in contempt on the basis of their identity alone precludes innovative cross-party solutions and mutually beneficial compromises.
Other researchers believe severe inequality puts America at the brink of political violence. BuzzFeed recently wrote about a so-called political stress indicator, created in part by a sociologist who previously led the CIA's State Failure Task Force, that incorporates a variety of statistics, including wage stagnation, national debt, inequality, and distrust of government. The indicator currently shows American political instability to be in line with the lead-up to the Civil War.
Then there's the distrust, encouraged by the president, of facts and of journalism organizations, both of which are necessary to protect democracy. A series of Pew Research Center polls shows that Republicans rely on and trust fewer news sites for politics than they used to, with Fox News, Trump's mouthpiece and a fount of disinformation, being one of the few sources they regularly read and believe. However, research by Andy Guess, assistant professor of politics and public affairs at Princeton University, looks at web traffic rather than people's survey responses and finds considerable, consistent overlap in media consumption between the parties, except among a smaller set of extremists. This suggests many people might be reading the same sources but coming to totally different conclusions. Wildly divergent interpretations of the same news are a more difficult problem to fix.
The outputs
Hyperpartisanship, tense societal factors, and divergent news diets, or at least divergent interpretations of the news, are then fed back through social media, which is likely amplifying our divisions. We don't know exactly how the algorithms that select what information we see work, because the technology is a black box controlled by the social media company that built it.
What we do know is that Facebook has put less of an emphasis on news and more on engagement, and that posts with strong, emotional language get more engagement. We also know Facebook has continually promoted Groups since 2016, which can function as their own echo chambers, even without algorithmic help. YouTube, whose algorithms, like those of other platforms, were designed to make people spend more time on the site, has been shown to radicalize people through inflammatory messaging. Most recently, it has been awash in election misinformation.
"There's little doubt in my mind that the way our media ecosystem works is inflaming political sectarianism," Eli Finkel, one of the authors of the aforementioned Science paper, told Recode. "Social media is not focused on making the world a better place; it's primarily focused on engagement, so it listens to us and gives us what we want."
We also know that more people are getting their news from social media. The share of Americans who often get their news from social media grew 10 percentage points to 28 percent last year, according to Pew. Those who mainly get their news that way were also less informed about current events and more likely to have been exposed to conspiracy theories.
Funnily enough, despite getting so much news from social media, Americans don't trust it.
"Distrust about social media platforms is one of few places Republicans and Democrats agree," Katerina Eva Matsa, associate director of journalism research at Pew, told Recode.
A new study from the University of Virginia found that increased Facebook usage among conservatives is associated with their reading more conservative sites than they normally would. The effect was less dramatic among liberals.
The study's authors conjectured that the way Facebook works might have something to do with this outcome. In addition to algorithms favoring engagement, the very structure of Facebook limits who we talk to: You have to friend others to see their posts, meaning you're unlikely to see posts from people outside your real-life friends and family, who are more likely to have lives and viewpoints similar to your own. Facebook also tweaked its algorithms after the 2016 election to promote posts from friends and family and show far fewer posts from news outlets, which likely further contributed to filter bubbles and division.
"These platforms at this point are huge, they're mature, they have all sorts of resources, all sorts of ability to figure out ahead of time and certainly monitor afterwards what kind of impacts all of their algorithmic tweaks are having on users' information consumption," study co-author Steven L. Johnson told Recode.
Pariser had originally thought filter bubbles could be deflated by exposure to different viewpoints. But that doesn't appear to be the case.
Research highlighted in the Wall Street Journal suggests that people on social media do see opposing viewpoints. But since sites like Facebook are calibrated to highlight posts that elicit reactions, we're seeing the most acerbic of those opposing views, which can repel people even further. The result is even more entrenched viewpoints and more polarization.
What to do about it
The answer to all this polarization isn't easy. It likely involves huge structural changes in our society to deal with inequality, investments in public institutions like schools and libraries, and serious, truthful, mediated discussion between people of opposing viewpoints.
It doesn't require more social media, at least not in its current iteration.
"It's not just a matter of coming into contact with the other side," Pariser told Recode about how his conception of filter bubbles has changed since he first coined the term. "It's doing so in a way that leads us to greater understanding."
And the way opposing views are presented on social media is not leading to greater understanding. Conversations on Facebook and Twitter, for example, happen through text, not voice, which can be another impediment. Pariser pointed to work by Berkeley professor Juliana Schroeder showing that text conversations fail to garner the empathy evoked by hearing someone's voice, making text an unproductive medium for constructive conversations.
"It's sort of like the 'don't have an argument by text with your partner' [rule], but for society," Pariser said.
Mediating thoughtful in-person conversations, of course, is expensive, and it has been even harder to do during a pandemic, when being face-to-face with people outside your household is actually dangerous.
But its more necessary than ever.
Social media companies are constantly tinkering with how their platforms work, so perhaps that's a way forward. In the lead-up to the election, Twitter and Facebook were able to limit the spread of misinformation on their platforms much more effectively than they had before, by more actively taking down or labeling false content and limiting its reach. But as New York Times tech columnist Kevin Roose pointed out, these platforms did so by curtailing their own major features.
Social media companies may not have a choice if the government steps in and forces them to make changes. They are under increased scrutiny from the government to quash misinformation and to disclose how the information one user sees might differ from what another sees. And advertisers, social media's main source of revenue, can also choose to vote with their spending.
However we get there, it's high time we deal with how social media has made a bad political situation worse.
Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.