The filter bubble is floating in hot water these days. “Your filter bubble is destroying democracy,” Wired tells us. Bill Gates says filter bubbles are “super important” and “more of a problem than I, or many others, would have expected.” Mark Zuckerberg worries about them. Even former President Barack Obama now says, “We need to spend a lot of time thinking about them.”
Since Eli Pariser, the former executive director of MoveOn.org and current president of Good Media Group, published his brilliantly prescient book, The Filter Bubble, in 2011, the drumbeat of concern has reached a crescendo over the filter bubble—the belief-reinforcing cognitive isolation spawned by personalized online experiences, such as customized Google search results and algorithm-driven exposure to news on Facebook and Twitter. That focus is overdue. But we’re using the wrong metaphor: It’s not a filter bubble but a filter shroud, one that blankets each of us and blocks outsiders from seeing what information is being digested underneath. Understanding this difference is critical to understanding what’s really corrupting our political and social dialogue—and how to address it.
A bubble is, of course, transparent; you can see through it. Tuning in to only Fox News if you’re a conservative or watching only MSNBC if you’re a liberal is fairly described as creating something of a filter bubble. Through your channel choice, you’ve filtered out facts and views you might hear on other stations that could cause you to rethink your beliefs; but, critically, you’re in a bubble that the rest of us can peek into. That is, if I want to understand what your belief-reinforcing, skewed worldview is, I can flip to Fox News or MSNBC and peer into the bubble you’ve created for yourself. Choosing a cable television station isn’t, of course, as personalized an experience as what Pariser found emerging online; but it’s a crude form of filtering, and it creates something aptly called a “bubble.”
Your experience online is dramatically more belief-reinforcing because Google, Facebook, Twitter, and others have curated what you see to give you only what you want: search results similar to the ones you’ve previously clicked on, stories shared by the friends and family whose pages you spend the most time on, and so on. That’s a difference in degree, though a critical one. But what’s different in kind is that the rest of us can’t even peek into what you’re experiencing. Unlike with cable news, where anyone can tune in to your channel of choice, we have no way of knowing precisely what’s being filtered in and out for you in your increasingly individualized experience of the world through the Internet. That’s not a filter bubble at all. It’s a filter shroud, which hides from the rest of us even basic access to the cycle of self-satisfying reinforcement hardening your beliefs.
There are at least two reasons for that opacity. The first is the personalized nature of today’s online experience. Because, for example, Google tailors search results to the user based on prior searches, previously selected search results, other webpages visited, location, and a whole host of other factors, your experience of Google—and Facebook, and Twitter, and Amazon, and more—is shrouded from my view. Second, the filtering devices themselves—the algorithms that yield this personalized online experience—are famously complicated and closely held secrets. As Frank Pasquale documented in his book, The Black Box Society, “Authority is increasingly expressed algorithmically,” but “the values and prerogatives that the encoded rules enact are hidden within black boxes”—the undisclosed algorithms themselves. That secrecy shrouds from view even the workings of the very filters that generate other users’ individualized experiences.
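To make the shrouding concrete, here is a minimal, purely hypothetical sketch in Python. Nothing in it reflects any company’s actual system; the signal names, weights, and data are invented for illustration. The point is simply that when a ranker combines secret weights with private per-user signals, the same set of results comes out in a different order for every user, and no outsider can reconstruct what any particular person was shown.

```python
# Hypothetical illustration only: invented signals and weights,
# not any company's real ranking algorithm.

# The platform's undisclosed scoring weights -- the "black box."
SECRET_WEIGHTS = {"relevance": 1.0, "click_history": 2.5, "location": 1.5}

def personalized_rank(candidates, user):
    """Order the same candidate results differently for each user,
    based on private signals (past clicks, location) outsiders never see."""
    def score(item):
        return (
            SECRET_WEIGHTS["relevance"] * item["relevance"]
            + SECRET_WEIGHTS["click_history"] * user["past_clicks"].get(item["topic"], 0)
            + SECRET_WEIGHTS["location"] * (1 if item["region"] == user["location"] else 0)
        )
    return sorted(candidates, key=score, reverse=True)

# The same three results for one query...
candidates = [
    {"title": "Skeptical op-ed",  "topic": "opinion", "relevance": 0.6, "region": "US-TX"},
    {"title": "Research summary", "topic": "science", "relevance": 0.9, "region": "global"},
    {"title": "Local rally news", "topic": "local",   "relevance": 0.5, "region": "US-CA"},
]

# ...ranked for two users with different private histories and locations.
alice = {"past_clicks": {"science": 3}, "location": "US-CA"}
bob   = {"past_clicks": {"opinion": 5}, "location": "US-TX"}

print([r["title"] for r in personalized_rank(candidates, alice)])
print([r["title"] for r in personalized_rank(candidates, bob)])
```

Even this toy version suggests why transparency about the algorithm alone would not be enough: without each user’s private history as well, the output still can’t be reproduced from the outside.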
To get at what’s really distinctive about the shrinking soda straws through which we’re all seeing the world, we need to stop talking about filter bubbles and start talking about filter shrouds. The problem is not just self-reinforcing filtering but also the opacity of the experience that filtering produces. If we don’t know what others are experiencing, and they don’t know what we’re experiencing, then all of us have little hope of engaging with one another in a way that revives political and social dialogue in this country and, over time, moves us away from political gridlock and toward productive compromise.
Recognizing that it’s a filter shroud adds urgency to the call that I and others, such as former Federal Communications Commission Chairman Tom Wheeler, have made for radically increased transparency about how tech companies’ algorithms work. That’s an important start, as it can help us understand how alternative views might pierce the shroud and reinvigorate exposure to different perspectives.
But recognizing the nature of the filter shroud points to the need for something even today’s powerful tech companies can’t provide: actual engagement with our fellow Americans whose points of view we understand the least. Even if the companies’ algorithms were made public, we could never recreate for hundreds of millions of users the vast array of inputs that yield personalized online experiences for each of us. That means we still wouldn’t know exactly what each individual is seeing online and then, often without realizing it, choosing to see more of.
To understand that—the real nitty-gritty of each other’s belief systems, as reinforced by our online experiences but ultimately originating in all of our heads—we need to engage with the actual human beings whose preexisting beliefs most baffle us yet are being hardened by today’s Internet experience. And that means putting down our phones, looking up, and getting out; going to the bars, barbershops, and ballparks where there’s still hope of encountering people least like ourselves; and talking with them—or, more importantly, listening to them.