Personalized web pages, search results, and news feeds fill our screens every time we go online. With them come promoted content we might enjoy and recommended products we are more likely to purchase. But what is being left out? If everything is personally targeted towards us, is there not some content that never reaches us? What is being filtered out?

As algorithms filter content to create these web pages, they curate what some call a filter bubble. In 2011, Eli Pariser explored this idea, discussing how the personalization of the web is shaping the content we see, and subsequently us (Pariser, 2011). His work echoes concerns that first arose in 1995, when Nicholas Negroponte discussed the notion of the Daily Me in news curation. Writing at the advent of the internet, Negroponte imagined individuals creating their own personal news catalogue that could be delivered to them digitally (Negroponte, 1995). Jumping forward to Pariser, we see how our agency has become intertwined with technology to create personal feeds that extend beyond news (Pariser, 2011). Negroponte's notion of users specifically choosing their content no longer fully holds; rather, algorithms select content based on our actions, preferences, likes, shares, and so on.

Now, before moving on to the concerns these filters raise, it is important to address the arguments and notions overall. First, the concept of algorithmic filtering, profiling, and sorting is somewhat ambiguous. Algorithms are pervasive, and how we understand them varies with context. Additionally, many algorithms are created by corporations that refuse to share their code, leaving them largely ‘blackboxed’. Recent conversations around AI have further complicated our understanding of algorithms. For this project (and blog post), however, I will maintain a focus on social media algorithms and the knowledge we currently have of them. From this perspective, we can understand algorithms as curated lines of code designed to gather, analyze, sort, profile, and suggest information related to users on a platform (Lyon, 2009; van Dijck, 2013; Gillespie, 2014).

When trying to understand these ominous filter bubbles, it is important to grasp the factors that create them. In case you have not heard it before: the product of social media sites is the users themselves. The sites keep entrance free for users by selling your data to advertisers (Srnicek, 2016). What gathers this data? The algorithms embedded in the platform. They track user actions on these sites, harvesting data from nearly everything users do: likes, scrolls, clicks, curated content, even scanning user messages for keywords. When these habits were first recognized, authors such as Andrejevic raised concerns about the algorithmic all-seeing eye (Andrejevic, 2002). Beyond the privacy and surveillance concerns (which are critically important), others highlight how this data gathering and subsequent profiling establish content channels that can manipulate users’ perceptions. José van Dijck (2013) discusses how the filtering of content can influence behaviour and thought.
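To make this mechanism concrete, here is a toy sketch of the kind of loop described above: engagement signals are tallied into a profile, and that profile is then used to rank (and filter) a feed. This is my own simplified illustration, not any platform’s actual code; the event types, weights, and topic tags are all hypothetical.

```python
# Toy illustration (not any real platform's algorithm): engagement
# signals build a profile, and the profile ranks the feed.
from collections import Counter

def build_profile(events):
    """Tally topic tags from hypothetical engagement events."""
    profile = Counter()
    for action, tags in events:
        # Hypothetical weights: positive actions boost a topic,
        # scrolling past it without engaging suppresses it.
        weight = {"like": 3, "share": 4, "click": 2, "scroll_past": -1}.get(action, 0)
        for tag in tags:
            profile[tag] += weight
    return profile

def rank_feed(posts, profile):
    """Order candidate posts by profile overlap; low scorers are filtered out."""
    scored = [(sum(profile.get(t, 0) for t in tags), post) for post, tags in posts]
    return [post for score, post in sorted(scored, reverse=True) if score > 0]

events = [("like", ["politics"]), ("share", ["politics"]), ("scroll_past", ["sports"])]
posts = [("Election update", ["politics"]), ("Match recap", ["sports"])]
print(rank_feed(posts, build_profile(events)))  # → ['Election update']
```

Even in this crude form, the key property is visible: the sports post is not merely ranked lower, it is dropped from the feed entirely, which is precisely the kind of silent exclusion the filter-bubble argument is about.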

While filter bubbles appear to be a quite alarming part of our digital culture, it is important to separate mounting concerns from growing discourse. Axel Bruns has just written a book (releasing September 16, 2019) that explores the notion of the filter bubble, providing both a working definition and a strong critique of these concerns. Bruns (2019) recognizes the dichotomous roles of user and system in their creation, understanding filter bubbles as communicative spaces where content is shared and excluded based on the users within the system. For example, if you share and like specific content while ignoring or actively removing other content from your feed, you will strengthen your filter bubble (Bruns, 2019). In this relationship, users and systems feed off each other to ‘entrench’ themselves in the bubble. But is the concern and hype potentially overrated?
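The ‘entrenchment’ dynamic described above can be sketched as a simple feedback loop. This is my illustration, not a model from Bruns: the system shows topics in proportion to the profile, the user engages with whatever is shown, and that engagement feeds back into the profile, so small early preferences compound round by round (a rich-get-richer process).

```python
# Toy feedback loop (my illustration, not Bruns's model): the filter
# learns from engagement, and engagement follows the filter.
import random

random.seed(1)  # fixed seed so the run is reproducible
TOPICS = ["politics", "sports", "tech", "arts"]
profile = {t: 1.0 for t in TOPICS}  # start with no preference

for step in range(20):
    # System: show a topic with probability proportional to its weight
    weights = [profile[t] for t in TOPICS]
    shown = random.choices(TOPICS, weights=weights)[0]
    # User: engages with what is shown, reinforcing that topic
    profile[shown] += 1.0

shares = {t: round(profile[t] / sum(profile.values()), 2) for t in TOPICS}
print(shares)  # some topics drift well above the even 0.25 baseline
```

The point of the sketch is the shape of the process, not the numbers: neither party alone creates the bubble, but their interaction reliably narrows the feed over time.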

Bruns (2019), alongside others (Dubois & Blank, 2018), argues that while these systems exist, users are still exposed to content outside of their bubble. For example, we have friends on social media who share content we disagree with, and we engage with news outlets that publish articles challenging our ideas. In much of the current dialogue, filter bubbles are invoked as a moral panic; they become scapegoats for other concerns. In actuality, Bruns (2019) argues that we need to think about what we do with the information we gather. To quote him,

The problem isn’t that there are hyperpartisan echo chambers or filter bubbles; it’s that there are hyperpartisan fringe groups that fundamentally reject, and actively fight, any mainstream societal and democratic consensus […] The filter is in our heads. (2019)

So what is it? Are there filter bubbles? Are they make-believe? What do we do next? Bringing this knowledge back to the active research aspect of my project, the question becomes what information is critical for the average user. Bruns’s critique does not change the game plan for addressing these issues. While slowly becoming a loaded term, education is centrally important for tackling these concerns: raising awareness of the potential of personalization and, more importantly, critically reflecting on the content we are shown in these spaces. Being able to challenge the systems we interact with and the information we see is important, but so is understanding what knowledge to trust. Engage in meaningful dialogue, consider multiple opinions, and reflect on where the information comes from. This leads to the larger question of my MA: how do we do that in meaningful ways for a variety of audiences?

Note: This post is 1 of 3 that will outline filter bubbles, echo chambers and their relation to this project.


Andrejevic, M. (2002). The work of being watched: Interactive media and the exploitation of self-disclosure. Critical Studies in Media Communication, 19(2), 230–248.

Bruns, A. (2019). Are Filter Bubbles Real? Polity Press.

Dijck, J. van. (2013). Engineering Sociality in a Culture of Connectivity. In The Culture of Connectivity: A Critical History of Social Media. Oxford University Press.

Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729–745.

Gillespie, T. (2014). The Relevance of Algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media Technologies: Essays on Communication, Materiality, and Society. MIT Press.

Lyon, D. (2009). Surveillance, power, and everyday life. The Oxford Handbook of Information and Communication Technologies.

Negroponte, N. (1995). Being digital. Vintage Books.

Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin Group.

Srnicek, N. (2016). Platform Capitalism. Polity Press.
