Content Towards Creation

So I started this whole blog about how I was going to be making a game, and then spent my first two posts talking about digital filter bubbles and echo chambers. However, to properly document the research process for this project, it was pivotal that I first did some work on understanding the content I would put into the game. Those posts were informed by research and interviews with professors at a variety of institutions (whom I cannot thank enough for giving me time out of their lives to help). But now that we have a handle on what we mean by filter bubbles and echo chambers, we can talk about how they fit into a game.

Based on that research, the game has already changed a bit. Rather than explicitly focusing on these ubiquitous concepts, the game will focus on what we understand by digital personalization. Echo chambers and filter bubbles both fall under this larger term, and it avoids treating them as binary, all-or-nothing phenomena. Personalization simply refers to the individual-specific curation of content on a platform: your social media feed versus my social media feed, for example. However, this extends beyond social media. Search engines, online shopping sites, and news organizations can all personalize the content you see. While sometimes beneficial, this process can quickly create mini echo chambers or filter bubbles. One example could be the recent claims linking YouTube's recommendation system to pedophile rings (see Fisher & Taub, 2019). These systems are complex, hard to understand, and hard to even research. This brings us back to my MA.
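To make the idea of individual-specific curation concrete, here is a toy sketch of personalization: the same pool of posts, ranked differently for each user based on an interest profile. The post topics, profile names, and weights are all hypothetical illustrations, not any real platform's algorithm.

```python
# Toy illustration of personalization: one shared pool of content,
# two different feeds. The profiles and weights are invented for
# illustration only.

posts = [
    {"id": 1, "topic": "games"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "cooking"},
]

# Hypothetical interest profiles built from past behaviour (likes, clicks).
profiles = {
    "you": {"games": 0.9, "politics": 0.2, "cooking": 0.5},
    "me":  {"games": 0.1, "politics": 0.8, "cooking": 0.3},
}

def personalized_feed(user):
    """Rank every post by the user's interest in its topic."""
    weights = profiles[user]
    return sorted(posts, key=lambda p: weights[p["topic"]], reverse=True)

# Same content pool, two different orderings.
your_feed = [p["id"] for p in personalized_feed("you")]  # games first
my_feed = [p["id"] for p in personalized_feed("me")]     # politics first
```

The point of the sketch is simply that nothing is hidden outright at first; the ordering alone determines what each of us is likely to see and engage with.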

A central goal of the project was to present digital issues within an analog space. This goal raises its own set of questions, which are only compounded by the concept of personalization. How do you personalize a static space? How do you make a game feel unique to each player if you physically construct it? How do you represent an algorithm in a real-world setting? The list keeps going, and as I work on the blueprint for the game I still have questions about how this will all come together. I will write later about the brainstorming and design process, specifically addressing the methodology piece.

Creating an analog game that is personalized to users forces me to think carefully about red herrings, since a personalized experience cannot feel arbitrary. The escape room genre offers a strong setting, an investigative story, and mechanics focused on piecing clues together. In this manner, the space needs to feel personalized, yet the choices are somewhat prescribed. Rather than aiming for total player autonomy, I had to remember that the game can have explicit restrictions that provide some agency but ultimately get the players to do what I want. While I am still working on some specifics, the environment offers a distinct space in which users can see their experience reflected back to them.

Moving forward, the blog posts will discuss specific design choices, questions I came across, and my overall methodology. As I prepare to playtest next month, I have a lot to solve in a short time frame.

References:

Fisher, M., & Taub, A. (2019, June 3). On YouTube’s Digital Playground, an Open Gate for Pedophiles. The New York Times. Retrieved from https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html

ECHO, Echo, echo…

After dedicating the previous post to an overview of filter bubbles and concluding that they might not even exist, it's only fair to spend some time discussing the term commonly intertwined with them: echo chambers. Perhaps the more common of the two terms, echo chambers describe how spaces become polarized as we surround ourselves with others who consistently reinforce our opinions (Sunstein, 2001). Focused on the individuals we connect with, echo chambers are curated as users and platforms specifically ignore, remove, or cut out of these spaces the individuals who disagree with us (Bruns, 2019). However, echo chambers have equally come under scrutiny over whether they actually exist.

Before I go further into the work that discredits echo chambers, I want to lay out the specific differentiations between echo chambers and filter bubbles. As discussed in the last blog post, filter bubbles involve a collaboration between user and algorithm to actively engage with content that matches our own ideas and to exclude or disengage from content we disagree with (Bruns, 2019). The more active we are in our engagement and disengagement, the stronger our filter bubble becomes (Bruns, 2019). Echo chambers are focused on the individuals we connect with, and the active ignoring of other individuals to build ourselves an 'isolated' network (Bruns, 2019). To summarize simply: as their name suggests, filter bubbles distill the content we see based upon our actions and the platform's analysis of those actions, while echo chambers are created by accepting some voices and actively excluding others. The two terms go hand in hand, both involving users and systems interacting to create them digitally.

Despite establishing relatively usable definitions, literature in the past few years has begun to question and discredit the existence of both systems. People are exposed to content outside of their beliefs and opinions on a regular basis, and many actively choose to research information they hear (Dubois & Blank, 2018). Dubois and Blank (2018) suggest that previous research on the curation of echo chambers focused on specific case studies and ignored the larger digital climate. As users, we engage with platforms beyond a single social media site, while also maintaining connections with individuals we potentially disagree with, such as extended family. At the conclusion of their work, Dubois and Blank (2018) point to the dangers of opinion leaders as the key individuals in curating ideological segregation and mistrust. This claim matches the concerns of Bruns (2019), who argues that the hyperpartisan right has done an excellent job of discrediting the media and elevating its own voices to push its agenda. While I think these are valuable claims, I am not as quick to discredit the role that echo chambers and filter bubbles play in how people perceive their ideas within the public sphere.

Concerns over opinion leaders and the discrediting of media bring us to a conversation about trust. Trust appears fragmented today; the hyper-capitalism of the news media has cost it some credibility with the public. This is only furthered by claims of fake news, digital interference (such as Cambridge Analytica), and misunderstood conversations around free speech. While I am not getting into those issues specifically, trust within the digital space (and our lives in general) is important and should be studied. Part of our critical analysis is asking ourselves why we trust a source, and doing the work needed to be able to trust the information we hear. Interestingly, in previous preliminary work I did on defining digital privacy, trust arose as a valuable keyword. In that work, trust played a dichotomous role: on the one hand, machines, computers, and code were seen as unbiased and trustworthy, while the companies and corporations behind them were viewed skeptically. Despite extensive literature discussing how biased and untrustworthy code, AI, and machine learning can be, our culture has cultivated a trust in machines to output values and products that are true.

I think that echo chambers and filter bubbles do exist, but not in the all-encompassing manner that early authors suggest (Pariser, Negroponte, and Sunstein). I agree with the critiques of Bruns, Dubois, and Blank; however, aspects of these systems still exist and can influence behaviour, decisions, and opinions. While we are exposed to ideas and opinions outside of our personal beliefs, our current environments still support the majority of our perspectives, which paints outside material in a specific light. For example, I have friends within my social network who share content that I disagree with, but its limited appearance within my feeds, and its non-dominant rhetoric compared to other content, gives it a very small impact. Or perhaps we watch a video that is "portraying the other side" but is extremely biased in its delivery, exposing us to content and ideology at the same time. Understanding this changes how we should view filter bubbles and echo chambers. They are more open and fluid systems, but they still slightly skew our perceptions of content, complementing the objectives of opinion leaders and the narratives that discredit bodies such as the media. We need to talk about both issues: the systems and the content within them. Each carries a range of underlying issues, and each keeps raising new questions.

In any case, the next question is always: so what do we do? The simple answers are to fact-check claims and stories you read on social media, maintain a critical mindset toward the content you are fed, and seriously consider the arguments and opinions counter to your own. However, those suggestions put a lot of responsibility on you as a user, despite my stating earlier that both users and algorithms play a role. While digital literacy is an important skill, perhaps we also need a larger exploration of these platforms and how they function. Part of what this blog will do is explore ways to highlight these issues in a game, and to use the game to provide a counter-narrative promoting critical thought, the exchange of information, and open dialogue between parties. I hope to further some of these questions and provide one potential option to help solve this growing digitally apocalyptic scenario (yes, that is a tad dramatic).

Note: This is the second of three blog posts around echo chambers, filter bubbles, and my MA work.

References:

Bruns, A. (2019). Are Filter Bubbles Real? John Wiley & Sons.

Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729–745. https://doi.org/10.1080/1369118X.2018.1428656

Negroponte, N. (1995). Being digital. Vintage Books.

Sunstein, C. R. (2001). Echo Chambers: Bush V. Gore, Impeachment, and Beyond. Princeton University Press.

Additional Reading:

Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press.

Rajan, A. (2019, March 4). Do digital echo chambers exist? BBC News. Retrieved from https://www.bbc.com/news/entertainment-arts-47447633

Taylor, A., & Sadowski, J. (2015). Digital Red Lining. Nation, 300(24), 24–27.

Filter Bubbles – Are They Already Popped?

Personalized web pages, search results, and news feeds fill most of our computer screens every time we go online. With them come promoted content we might enjoy seeing and recommended products we are more likely to purchase. However, what is being left out? If everything is personally targeted towards us, is there not some content that isn't making it to us? What is being filtered out?

As algorithms filter content to create these webpages, they are curating what some call a filter bubble. In 2011, Eli Pariser explored this idea, discussing how the personalization of the web is shaping the content we see, and subsequently us (Pariser, 2011). His work echoes concerns that first arose in 1995, when Nicholas Negroponte discussed the notion of the Daily Me in news curation. Writing at the advent of the internet, Negroponte described how individuals could create their own personal news catalogue to be digitally delivered to them (Negroponte, 1995). Jumping forward to Pariser, we see how our agency has become intertwined with technology to create personal feeds that extend beyond news (Pariser, 2011). Negroponte's notion of users specifically choosing the content no longer fully holds; rather, algorithms play a role by selecting content based on our actions, preferences, likes, shares, and so on.

Now, before moving forward with the concerns and issues of these filters, it's important to address the arguments and notions overall. First off, the concept of algorithmic filtering, profiling, and sorting is somewhat ambiguous. Algorithms are pervasive, and how we understand them depends heavily on context. Additionally, many algorithms are created by corporations who refuse to share their code, leaving them mostly 'blackboxed'. It is also important to note that recent conversations around AI further complicate our understanding of algorithms. However, for this project (and blog post) I will maintain a focus on social media algorithms and the knowledge we currently have of them. From this perspective, we can understand algorithms as curated lines of code designed to gather, analyze, sort, profile, and suggest information related to users on a platform (Lyon, 2009; van Dijck, 2013; Gillespie, 2014).

When trying to understand these ominous filter bubbles, it is important to maintain a comprehension of the factors that create them. In case you have not heard it before, the product of social media sites is the users themselves. The sites remain free for users by selling user data to advertisers (Srnicek, 2016). What gathers this data? The algorithms embedded into the platform. They track user actions across these sites, harvesting data from likes, scrolls, clicks, curated content, and even user messages scanned for keywords. When these habits were first recognized, authors such as Andrejevic raised concerns about an algorithmic all-seeing eye (Andrejevic, 2002). Beyond the privacy and surveillance concerns (which are critically important), others highlight how this data gathering and subsequent profiling establish content channels that can manipulate users' perceptions. José van Dijck (2013) discusses how the filtering of content can influence behaviour and thought.
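The feedback loop described above, where engagement signals feed a profile that in turn shapes the feed, can be sketched in a few lines. This is a hypothetical illustration under invented names and weights, not any real platform's code.

```python
# Toy sketch of the engagement feedback loop: every "like" on a topic
# increases that topic's weight in the user's profile, which in turn
# pushes more of that topic to the top of the feed. Purely illustrative.

from collections import defaultdict

profile = defaultdict(lambda: 1.0)  # start with equal interest in everything

def record_like(topic):
    """Harvest one engagement signal and fold it into the profile."""
    profile[topic] += 1.0

def rank_feed(candidate_topics):
    """Order candidate content by the current profile weights."""
    return sorted(candidate_topics, key=lambda t: profile[t], reverse=True)

topics = ["games", "politics", "cooking"]

# Initially all topics are weighted equally. After a few likes on one
# topic, the feed tilts toward it, inviting still more likes.
for _ in range(3):
    record_like("games")

feed = rank_feed(topics)  # "games" now ranks first
```

Even this crude loop shows the self-reinforcing quality the literature worries about: the more one topic is engaged with, the more prominently it appears, and the less visible everything else becomes.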

While filter bubbles appear to be a quite alarming part of our digital culture, it is important to separate mounting concerns from growing discourse. Axel Bruns has just written a book (releasing September 16, 2019) that explores this notion of the filter bubble. Bruns provides both a working definition and a strong critique of these concerns. Bruns (2019) recognizes the dichotomous roles of user and system in their creation, understanding filter bubbles as communicative spaces where content is shared and excluded based on the users within the system. For example, if you share and like specific content while choosing to ignore or actively remove other content from your feed, you will strengthen your filter bubble (Bruns, 2019). In this relationship, users and systems feed off of each other to 'entrench' themselves in the bubble. But is the concern and hype potentially overblown?

Bruns (2019), alongside others (Dubois & Blank, 2018), argues that while these systems exist, users are still exposed to content outside of their bubble. For example, we have friends on social media who share content we disagree with, or we engage with news outlets that publish articles challenging our ideas. In the majority of current dialogue, filter bubbles are invoked in a moral panic; they are scapegoats for other concerns. In actuality, Bruns (2019) argues that we need to think about what we do with the information we are gathering. To quote him,

The problem isn’t that there are hyperpartisan echo chambers or filter bubbles; it’s that there are hyperpartisan fringe groups that fundamentally reject, and actively fight, any mainstream societal and democratic consensus […] The filter is in our heads. (2019)

So what is it? Are there filter bubbles? Is it all make-believe? What do we do next? Taking this knowledge back to the active research aspect of my project turns it into a question of what information is critical for the average user. Bruns's critique does not change the game plan of addressing these issues. While slowly becoming a loaded term, education is centrally important for attacking these concerns: raising awareness of the potential of personalization, but more importantly, critically reflecting on the content we are shown in these spaces. Being able to challenge the systems we interact with and the information we are seeing is important, but so is understanding which knowledge to trust. Engage in meaningful dialogue, consider multiple opinions, and reflect on where the information is coming from. This leads to the larger question of my MA: how do we do that in meaningful ways for a variety of audiences?

Note: This post is 1 of 3 that will outline filter bubbles, echo chambers and their relation to this project.

References:

Andrejevic, M. (2002). The work of being watched: Interactive media and the exploitation of self-disclosure. Critical Studies in Media Communication, 19(2), 230–248. https://doi.org/10.1080/07393180216561

Bruns, A. (2019). Are Filter Bubbles Real? [Video]. Retrieved from https://www.youtube.com/watch?v=ouzPhoSSGYw

van Dijck, J. (2013). Engineering Sociality in a Culture of Connectivity. Retrieved from https://www-oxfordscholarship-com.lib-ezproxy.concordia.ca/view/10.1093/acprof:oso/9780199970773.001.0001/acprof-9780199970773-chapter-1

Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729–745. https://doi.org/10.1080/1369118X.2018.1428656

Gillespie, T. (2014). The Relevance of Algorithms. Retrieved from https://mitpress.universitypressscholarship.com/view/10.7551/mitpress/9780262525374.001.0001/upso-9780262525374-chapter-9

Lyon, D. (2009). Surveillance, power, and everyday life. The Oxford Handbook of Information and Communication Technologies. https://doi.org/10.1093/oxfordhb/9780199548798.003.0019

Negroponte, N. (1995). Being digital. Vintage Books.

Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin Group.

Srnicek, N. (2016). Platform Capitalism. Cambridge, MA: Polity Press.

Getting Started Up in Here

“The eternal gulf between being and idea can only be bridged by the rainbow of imagination.” 

— Johan Huizinga, Homo Ludens

Thesis proposal defended, now it’s time to get this blog going. As you will find, this blog is a place for me to document and reflect on my research project. The project, to clarify, is going to be a game. An intergenerational game that presents the issue of filter bubbles and echo chambers. That might sound very vague and confusing to some, and currently it still is for me, but I’ll try to be more specific. I am interested in two things. First, how do we create spaces for people from any generation to play and learn together? Second, how can the issue of echo chambers and filter bubbles be presented and discussed by people of any age? These are the questions that I will try to answer by designing this game. These are the questions I will continue to reflect on in this blog.

Today marks the starting point of this project, the beginning of this MA adventure. I got to sit down with two incredible academics and discuss my project with them. The feedback was great and helps me think about the next steps I need to take. The first thing (other than getting ethics approval) is to expand my knowledge of game design and echo chambers/filter bubbles. Since I will be making an analog (not digital) game, similar to escape rooms, I will need to chat with experts on puzzle, room, and game design. However, since I am working on a serious game (a game meant to be somewhat educational), I will also need to talk to experts on digital issues to see how I can frame and present the issue within the game. Information is key: to make this game properly present issues and properly engage players, I need more information.

The start of any large project is always slightly overwhelming. My goal is to finish the project by April. That means building the game, testing it multiple times, and completing the final write-up. I know this is doable, I would not have proposed it otherwise, but it is alarming to look at all that I need to do. That is one of the reasons I am keeping this blog. It will keep me grounded in what I have done, what I am doing, and what I still need to think about. I know that I will learn a lot through this process, and this place will help me record that.

As you might notice from the quotes around this blog, I am interested in play. I believe that studying play is crucially important, as is maintaining play in our own lives. While this project is one way to do so, I hope this blog becomes a place for me to play with ideas and have fun with this project as I work on my MA. I'm not quite sure what the future holds for me or how this project will look once I am done, but I want to enjoy the experience. This space will be a mix of academic discourse and playful thoughts, moments, and ideas. While I am creating this blog for my own work, I do invite you to follow me on this journey.