Is your filter bubble transparent?



"There is no standard Google anymore," says Eli Pariser in a recent TED talk. And he's right. Try it. Google the same thing as the person sitting next to you and compare the results. Chances are, they're different. According to Pariser, that's because Google uses as many as 57 different signals to determine the unique search results it serves you. It cross-references (among other things) your computer model, your choice of browser, your geographic location, your search history, the cookies it's stored on your hard drive, and data from your various social networks to determine what you'll most likely want to see when you're curious about something. As best it can, Google is personalizing the world for you.

And Google isn't the only one. Pariser explains that many popular destinations on the Web—like Amazon, Facebook, and Yahoo! News—are dabbling in personalization to ensure their users see only what's "most relevant" to them. (Pariser, President of the Board of MoveOn.org, first became aware of the power of personalization when Facebook suddenly stopped showing him updates from his more conservative friends; because he was less likely to follow the links they posted, Facebook determined those folks weren't really important to him and silently removed them from his news feed.) What's more, he adds, they're doing it without any indication, conducting "invisible algorithmic editing of the Web."
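None of these companies publish exactly how their personalization works, but the mechanism itself is easy to sketch. The short Python example below is purely illustrative (the signals, field names, and weights are hypothetical, not any company's actual algorithm); it shows how combining a few per-user signals can re-rank the very same result set differently for two people.

    # A toy illustration of signal-based personalization. The signals and weights
    # here are invented for demonstration; real systems are far more complex.

    def personalized_rank(results, user):
        """Re-rank a shared result set using signals from a hypothetical user profile."""
        def score(result):
            s = result["base_relevance"]
            if result["region"] == user["region"]:          # geographic signal
                s += 0.3
            s += 0.2 * user["click_history"].get(result["topic"], 0)  # click/search history
            if result["url"] in user["shared_by_friends"]:  # social-network signal
                s += 0.5
            return s
        return sorted(results, key=score, reverse=True)

    results = [
        {"url": "a.example", "topic": "politics", "region": "US", "base_relevance": 0.8},
        {"url": "b.example", "topic": "sports",   "region": "DE", "base_relevance": 0.7},
    ]
    alice = {"region": "US", "click_history": {"politics": 3}, "shared_by_friends": set()}
    bob   = {"region": "DE", "click_history": {"sports": 5},   "shared_by_friends": {"b.example"}}

    print([r["url"] for r in personalized_rank(results, alice)])  # ['a.example', 'b.example']
    print([r["url"] for r in personalized_rank(results, bob)])    # ['b.example', 'a.example']

Run the same "query" for alice and bob and the orderings diverge, and neither user is ever told that the other saw something different.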

Pariser worries that these algorithmic curators are having some damaging effects. An increasingly personalized Web is one that doesn't present its users with contrarian perspectives or novel viewpoints that might reorient their thinking. It also creates an environment less conducive to pleasant surprises, because it minimizes the chances of discovering something outside the filters' parameters. The aggregate effect of constant personalization creates what Pariser, in his new book, calls a filter bubble—a "personal, unique universe of information that you live in online," as he explains in his TED talk.

Of course, information filters are nothing new. News media, for instance, traditionally play a gate-keeping function, filtering the news of the day into digestible packages fit for consumption in various contexts and on different platforms. So it's tempting to accuse Pariser of simply resuscitating a classic debate: Should information filters curate content to provide citizens with information they want to have, or with information they need to have in order to become well-rounded citizens, regardless of how that information jibes with their sensibilities, comfort levels, political orientations, or personal preferences?

This question, however, presupposes a key human element that Pariser seizes on to show why filter bubbles aren't merely an extension of traditional editorial practices. Those practices are guided by codes of ethics hammered out through decades of debate in journalism. And the arguments distinguishing between what audiences want and what audiences need typically assume that audiences can choose from a variety of information sources, making rational decisions about the kinds of content they include in their media diets. Media have an ethical responsibility, the argument goes, to provide audiences with a balanced information diet, a diverse blend of content from which citizens can draw when developing the critical opinions essential to a healthy democratic society.

Algorithms policing the borders of our filter bubbles are certainly guided by a set of ethics—but not necessarily the same set guiding civic-minded news editors. And they're not working to enhance users' abilities to make careful decisions regarding information sources they find reputable, nourishing, or important. They're filtering content beneath the threshold of ordinary awareness and without direct human intervention, structuring the very field of options users have in the first place, narrowing that field before consumers can even begin to evaluate all the options they might have had.

It might seem that popping these filter bubbles is one way to resist the effects of excessive personalization. But here's another key nuance of Pariser's argument: he doesn't insist that we stop allowing algorithms to filter data for us. After all, the ability to quickly filter, organize, and prioritize a vast collection of heterogeneous bits and pieces is part of what makes the Internet so darned useful. Instead, he's asking companies that employ filters to be more transparent about precisely what those filters are doing—not only about what's being presented to users, but also, perhaps more importantly, about what's being edited out.
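What that transparency (and the user control he asks for next) might look like is easy to sketch. The Python snippet below is a hypothetical illustration, not any real site's code: instead of silently discarding items, the filter reports what it hid and why, and it exposes a switch that lets the user turn the filtering off.

    # A minimal sketch of a "transparent" filter: it returns both the items it kept
    # and the items it hid, with a reason for each cut, and lets the user override it.
    # All names and thresholds here are hypothetical.

    def transparent_filter(items, user, threshold=0.5, show_everything=False):
        """Split items into shown/hidden, keeping a human-readable reason for each omission."""
        shown, hidden = [], []
        for item in items:
            interest = user["interest"].get(item["topic"], 0.0)
            if show_everything or interest >= threshold:
                shown.append(item)
            else:
                hidden.append({"item": item,
                               "reason": f"predicted interest {interest:.2f} is below {threshold}"})
        return shown, hidden

    feed = [
        {"title": "Local election results", "topic": "politics"},
        {"title": "New kernel release",     "topic": "linux"},
    ]
    user = {"interest": {"linux": 0.9, "politics": 0.1}}

    shown, hidden = transparent_filter(feed, user)
    print("Shown: ", [i["title"] for i in shown])
    print("Hidden:", [(h["item"]["title"], h["reason"]) for h in hidden])

Whether or not the rule itself is a good one, the user can at least see it operating, and flip show_everything to True to pop the bubble.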

"We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters," he says directly to software architects and decision makers attending his TED talk. "And we need you to give us some control so that we can decide what gets through and what doesn't."

Filter bubbles may be unavoidable. But Pariser wants to make sure those bubbles have permeable walls, and making their filters transparent is one way to do just that.

Bryan Behrenshausen
Bryan formerly managed the Open Organization section of Opensource.com, which features stories about the ways open values and principles are changing how we think about organizational culture and design. He's worked on Opensource.com since 2011. Find him online as semioticrobotic.

6 Comments

Gabriel Weinberg, founder of DuckDuckGo (http://dukgo.com), seems to have similar opinions on these matters.

Those new to the concept of filter bubbles might appreciate DuckDuckGo's infographic at http://dontbubble.us/, in addition to reviewing Pariser's TED talk. Anyone concerned about filter bubbles should also read Weinberg's ideas on the matter: http://www.gabrielweinberg.com/blog/2011/06/the-real-filter-bubble-debate.html

Most importantly, anyone who wants unfiltered search results (not to mention less spam, more functionality, and no tracking of personal information; see https://duckduckgo.com/about.html, https://duckduckgo.com/goodies.html, and https://duckduckgo.com/privacy.html) should start using DuckDuckGo (http://ddg.gg) today.

Disclosure - I'm not affiliated with DDG, but I'm as much of a fanboy as they come.

This seems like a valuable service. Thanks for sharing it with us.

It's a little 'spammy' - but I wanted to provide quick access to DuckDuckGo's features and mantra. This way, anyone who is genuinely concerned by filter bubbles or information tracking has at least one avenue of escaping the bubble.

The Tor project (https://www.torproject.org/) is another option (and more far-reaching than just search results). There's a great article on browsing with Tor here on opensource.com via linux.com: https://opensource.com/life/11/5/how-browse-anonymously-tor

One of the main reasons for using a search engine - as opposed to following links - is to uncover items your preconceptions had blinded you to. This "personalization" keeps you bound to your preconceptions.

Yes. Another way to put this might be: Excessive personalization prevents you from knowing what you don't know.

I would like to see an unfiltered view option for Google alongside images/maps/shopping buttons.

Emailing Matt Cutts now to tell him my search results are full of spam - self-generated spam ;-)

This work is licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License.