After dedicating the previous post to an overview of filter bubbles and concluding that they might not even exist, it’s only fair to spend some time discussing their commonly intertwined counterpart: echo chambers. Perhaps the more common term, echo chambers describe how spaces become polarized as individuals surround themselves with others who consistently reinforce their opinions (Sunstein, 2001). Focused on the individuals we connect with, echo chambers are curated as users and platforms ignore, remove or cut out the individuals who disagree with us (Bruns, 2019). However, echo chambers have come under equal scrutiny over whether they exist at all.
Before I go further into the work that discredits echo chambers, I want to lay out the specific differences between echo chambers and filter bubbles. As discussed in the last blog post, filter bubbles involve a collaboration between user and algorithm: we actively engage with content that reflects our own ideas, and exclude or disengage from content we disagree with (Bruns, 2019). The more active we are in that engagement and disengagement, the stronger our filter bubble becomes (Bruns, 2019). Echo chambers, by contrast, are focused on the individuals we connect with, and on the active ignoring of other individuals to build ourselves an ‘isolated’ network (Bruns, 2019). To summarize simply: as their name suggests, filter bubbles distill the content we see based on our actions and the platform’s study of those actions, while echo chambers are created by the acceptance of some ideas and the active exclusion of others. The two terms go hand in hand, both involving users and systems interacting to create them digitally.
Despite these relatively usable definitions, literature in the past few years has begun to question and discredit the existence of both systems. People are exposed to content outside of their beliefs and opinions on a regular basis, and many actively choose to research information they hear (Dubois & Blank, 2018). Dubois and Blank (2018) suggest that previous research on the curation of echo chambers focused on specific case studies and ignored the larger digital climate. As users, we engage with platforms beyond any one social media site, while also maintaining connections with individuals we potentially disagree with, such as extended family. At the conclusion of their work, Dubois and Blank (2018) point to the danger of opinion leaders as the key individuals curating ideological segregation and mistrust. This claim matches the concerns of Bruns (2019), who argues that the hyper-right has done an excellent job of discrediting the media and pushing its own voices to advance its agenda. While I think these are valuable claims, I am not as quick to dismiss the role that echo chambers and filter bubbles play in how people perceive their ideas within the public sphere.
Concerns over opinion leaders and the discrediting of media bring us to a conversation about trust. Trust appears fragmented today: the hyper-capitalism of the news media has cost it some credibility with the public. This is only furthered by claims of fake news, digital interference (such as Cambridge Analytica), and misunderstood conversations around free speech. While I am not getting into those issues specifically, trust within the digital space (and in our lives in general) is important and should be studied. Part of our critical analysis is asking ourselves why we trust a source, and doing the work required to trust the information we hear. Interestingly, in previous preliminary work I did on defining digital privacy, trust arose as a valuable keyword. In that work, trust played a dichotomous role: on the one hand, machines, computers and code were seen as unbiased and trustworthy, while the companies and corporations behind them were viewed skeptically. Despite extensive literature discussing how biased and untrustworthy code, AI and machine learning can be (Noble, 2018), our culture has curated a trust in machines to output values and products that are true.
I think that echo chambers and filter bubbles do exist, but not in the all-encompassing manner that early authors suggest (Pariser, Negroponte and Sunstein). I agree with the critiques of Bruns, Dubois and Blank; however, aspects of these systems still exist and can influence behaviour, decisions and opinions. While we are exposed to ideas and opinions outside of our personal beliefs, our current environments still support the majority of our perspectives, which paints outside material in a specific light. For example, I have friends within my social network who share content that I disagree with, but its limited appearance within my feeds, and its non-dominant rhetoric compared to other content, give it a very small impact. Or perhaps we watch a video that is “portraying the other side” but is extremely biased in its delivery, exposing us to content and ideology at the same time. Understanding this changes how we should view filter bubbles and echo chambers. They are more open and fluid systems, yet they still slightly skew our perceptions of content, complementing the objectives of opinion leaders and the narratives that discredit bodies such as the media. We need to talk about both issues: the systems and the content within them. Each carries a range of underlying problems, and each keeps raising the same continual questions.
In any case, the next question is always: so what do we do? Simple answers are to fact-check the claims and stories you read on social media, maintain a critical mindset toward the content you are fed, and seriously consider the arguments and opinions counter to your own. However, those suggestions put a lot of responsibility on you as a user, despite my stating earlier that users and algorithms both play a role. While digital literacy is an important skill, perhaps we need a larger exploration of these platforms and how they function. Part of what this blog will do is explore ways to highlight these issues in a game, and use that game to provide a counter-narrative promoting critical thought, the exchange of information and open dialogue between parties. I hope to further some of these questions, and to offer one potential option to help solve this growing digitally apocalyptic scenario (yes, that is a tad dramatic).
Note: This is the second of three blog posts around echo chambers, filter bubbles and my MA work.
Bruns, A. (2019). Are filter bubbles real? John Wiley & Sons.
Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729–745. https://doi.org/10.1080/1369118X.2018.1428656
Negroponte, N. (1995). Being digital. Vintage Books.
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin Press.
Rajan, A. (2019, March 4). Do digital echo chambers exist? BBC News. https://www.bbc.com/news/entertainment-arts-47447633
Sunstein, C. R. (2001). Echo chambers: Bush v. Gore, impeachment, and beyond. Princeton University Press.
Taylor, A., & Sadowski, J. (2015). Digital red lining. Nation, 300(24), 24–27.