The Impact of Echo Chambers on Learning
We all know the phrase (and song) "Be careful what you ask for…"
The news items we read daily and the social media feeds we browse serve a purpose beyond entertainment and information: they teach algorithms what kinds of content we find interesting. Algorithms use a variety of criteria, such as our prior preferences and behaviour, to filter and rank information. While they can be helpful in many ways, such as filtering out content we find boring, they can also contribute to a problem known as an echo chamber.
What is an Echo Chamber?
Echo chambers are created by the tendency to actively seek out and favour information that supports our preexisting beliefs while ignoring or minimising information that contradicts them. As a result, we form opinions and draw conclusions based on selective or incorrect information, often without being consciously aware of having done so.
What are some notable examples of echo chambers?
Algorithms prioritise content that is likely to go viral, regardless of its accuracy. The "Plandemic" documentary is a prime example of how viral content can thrive within echo chambers. Another, more recent example of a more targeted approach is Cambridge Analytica. The data analytics firm harvested Facebook user data to target voters with highly personalised political ads, exploiting algorithms to identify and influence individuals predisposed to specific confirmation biases.
How Do Algorithms Interact with Them?
The purpose of algorithms is to learn from human choices and actions. Algorithms record our online interactions, such as liking, sharing, or clicking on particular articles, and use this information to tailor our future feeds. To illustrate, TikTok's video-recommendation algorithm is influenced most heavily by the time a user spends on each video. Building on that signal, TikTok may consider other factors, such as user interactions (likes, comments, playthrough, and playtime) and creator quality (publish rate and creator monetisation).
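To make the idea of signal-weighted ranking concrete, here is a minimal sketch in Python. The signal names, weights, and helper functions are invented for illustration; they are not TikTok's actual algorithm, only a toy model of the kind of weighted scoring described above, with watch time given the largest weight.

```python
# Hypothetical sketch of engagement-weighted feed ranking.
# All signal names and weight values are illustrative assumptions,
# NOT a real platform's implementation.

def score_video(signals, weights=None):
    """Combine engagement signals into a single ranking score."""
    if weights is None:
        weights = {
            "watch_time_ratio": 0.5,   # fraction of the video watched; weighted most heavily
            "liked": 0.2,
            "commented": 0.15,
            "creator_quality": 0.15,   # e.g. publish rate, monetisation status
        }
    # Missing signals default to 0.0, so sparse data still scores cleanly.
    return sum(weights[k] * signals.get(k, 0.0) for k in weights)

def rank_feed(videos):
    """Order candidate videos by predicted engagement, highest first."""
    return sorted(videos, key=lambda v: score_video(v["signals"]), reverse=True)

feed = rank_feed([
    {"id": "a", "signals": {"watch_time_ratio": 0.9, "liked": 1}},
    {"id": "b", "signals": {"watch_time_ratio": 0.3, "commented": 1}},
])
```

Because every signal here is something the viewer did, the ranking inevitably mirrors the viewer's past behaviour, which is exactly the mechanism the next paragraph describes.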
Each video consumed gives TikTok a new insight into the user, which is why it so often seems like the app may know us better than we know ourselves. We are therefore more likely to see content that supports our preexisting opinions and passions, forming a kind of "filter bubble" that shields us from different points of view.
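The self-reinforcing nature of a filter bubble can be shown with a toy simulation. Everything here is a hypothetical model, not real platform behaviour: the feed shows topics in proportion to current interest, engagement boosts the shown topic's weight, and small initial preferences get amplified over repeated rounds.

```python
import random

# Toy filter-bubble feedback loop (illustrative assumption, not a real system):
# a slight initial lean toward one topic compounds as the algorithm keeps
# boosting whatever the user engages with.

def simulate_filter_bubble(interests, rounds=50, boost=1.2, seed=0):
    """Return interest weights after repeated rounds of recommend-and-engage."""
    rng = random.Random(seed)
    interests = dict(interests)
    for _ in range(rounds):
        topics = list(interests)
        total = sum(interests.values())
        # The feed surfaces a topic with probability proportional to interest...
        shown = rng.choices(topics, weights=[interests[t] / total for t in topics])[0]
        # ...and engaging with it further boosts that topic's weight.
        interests[shown] *= boost
    return interests

final = simulate_filter_bubble({"politics_A": 1.1, "politics_B": 1.0, "sports": 1.0})
```

After 50 rounds, at least one topic's weight has grown many times over while the others lag, which is the "rich get richer" dynamic behind a filter bubble.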
How To Avoid Being in One
One thing echo chambers struggle to withstand is curiosity. Notice when information fits your preexisting views like a glove, then read news stories from unfamiliar sources, or start following individuals on social media who hold opposing views. It doesn't sound fun, but neither does being unable to tell fact from fiction.
Remember that anyone can create a social media account or website and pay for it to be promoted. We shouldn't take content at face value. Spend some time evaluating fresh information that you come across, and verify that it's accurate. Take note of the information's source, the evidence backing it, and any possible biases held by the author or source.
And finally, "be careful what you ask for," because the content that algorithms provide, while initially convenient, can limit your exposure to diverse perspectives and lead to a less informed worldview.