The Filter Bubble and the News

The filter bubble is a technological phenomenon in which one's opinions are amplified by algorithms that recommend content one is likely to be interested in while filtering out all other content (Flaxman, Goel, and Rao 2016, 299). If, for example, Google's search algorithms have learned that you are a liberal person, the results of political search queries may be more likely to be liberal than conservative. And if you watch a lot of horror movies on Netflix, you are more likely to see suggestions for these types of movies in the future.

In his book “The Filter Bubble” (2011), Eli Pariser tells the story of how journalism has gone from the passive reception of information from a few publishers to an overwhelming wealth of articles produced by both professionals and amateurs. This creates the problem of how articles are presented to the reader. Since no one is capable of reading every article being produced, some filtration has to take place. The problem is that when this filtration is performed by algorithms selecting information based on what they think we like, people are less likely to be exposed to new ideas and challenging information.

I believe that the filter bubble is potentially a serious problem for democracy and public debate. I also believe, however, that it is necessary and a result of the natural development of the digital world. In his book “The Googlization of Everything” (2011), Siva Vaidhyanathan argues that the sheer amount of information available online leads to information overload. The very title of one of his chapters, “The Googlization of Memory”, hints at how our very human and biological processes – such as memory – are being digitally expanded. If one accepts the wealth and availability of information online as an extension of our memory, then there must also be an extension of our biological filtration processes and working memory, which – just like the algorithms of the filter bubble – filter information based on what is believed to be in our interest.

It is difficult to find a balance between the necessary algorithmic filtration systems and the democratic dangers of the filter bubble. For starters, I do miss the option to turn off filtration for a while – an exploration mode in which the information presented is not based on any guesses about what I might like. And hopefully, awareness of the filter bubble will help people become more critical of their news sources.


Sources

Flaxman, Seth, Sharad Goel, and Justin M. Rao. 2016. “Filter Bubbles, Echo Chambers, and Online News Consumption”. Public Opinion Quarterly 80, no. S1: 298–320.

Pariser, Eli. 2011. The Filter Bubble: What the Internet Is Hiding from You. London: Penguin Books.

Vaidhyanathan, Siva. 2011. The Googlization of Everything. Berkeley: University of California Press.
