Toxic Traits of Media Platforms: Unmasking the Shadows of Algorithmic Culture
By Modesty Chang
Diversity & Wellbeing Subcommittee Member
Ever noticed how those quick, catchy TikTok videos seem to hook you in, one after another? "Share controversial opinions," they say. It's a tactic that gets clicks and keeps people engaged, but at what cost? Lies, scapegoating, exaggeration, and unrealistic standards assault our mental health in 15-second bursts. And the worst part? Most of us don't even realise we're being pulled into this consumerist vortex.
This phenomenon is a direct result of algorithmic culture — where algorithms dictate what we see based on what gets the most likes, shares, and views. Social media managers have become masters at playing this game, crafting posts that maximize engagement. The higher the engagement, the more likely the content gets pushed to a wider audience.[1]
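To see the incentive in concrete terms, here is a deliberately simplified sketch in Python. It is not any platform's actual ranking system; the post fields, weights, and scoring formula are invented purely for illustration. A toy feed scores each post on likes, shares, and views, then shows the highest scorers first.

```python
# A toy illustration of engagement-based ranking -- not any real platform's algorithm.
# Posts are scored purely on likes, shares, and views; accuracy never enters the formula.

posts = [
    {"title": "Calm, well-sourced explainer",  "likes": 120, "shares": 15,  "views": 4_000},
    {"title": "Outrage-bait hot take",         "likes": 900, "shares": 450, "views": 60_000},
    {"title": "Nuanced two-sided discussion",  "likes": 80,  "shares": 10,  "views": 2_500},
]

def engagement_score(post, w_likes=1.0, w_shares=3.0, w_views=0.01):
    """Weight shares highest, since shares push content to new audiences."""
    return (w_likes * post["likes"]
            + w_shares * post["shares"]
            + w_views * post["views"])

# The feed simply surfaces the highest-scoring posts first.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f'{engagement_score(post):8.1f}  {post["title"]}')
```

Nothing in that score rewards accuracy or nuance; whatever provokes the strongest reaction floats to the top, which is precisely the dynamic described above.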
But here's the catch: this system creates echo chambers. You end up seeing only what aligns with your existing beliefs, reinforcing your views without exposing you to diverse perspectives.[2] It's like living in a bubble where the same ideas bounce around, making it hard to see beyond your own little world.
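The echo chamber can be sketched the same way. In the hypothetical snippet below, the feed boosts posts on topics the user has already engaged with and drops everything else, so challenging content quietly vanishes. Again, the interest profile and topic labels are made up for illustration; this is a toy model, not a description of any real recommender.

```python
# A toy personalisation filter -- illustrative only, not a real recommender system.
# Posts on topics the user already engages with are boosted; everything else
# scores zero and never reaches the feed, so the bubble reinforces itself.

user_interests = {"politics:left": 0.9, "fitness": 0.4}

posts = [
    {"title": "Story confirming your political views",  "topic": "politics:left"},
    {"title": "Story challenging your political views", "topic": "politics:right"},
    {"title": "New workout trend",                       "topic": "fitness"},
]

def personalised_score(post):
    # Unfamiliar topics default to 0.0 and are filtered out entirely.
    return user_interests.get(post["topic"], 0.0)

feed = [p for p in sorted(posts, key=personalised_score, reverse=True)
        if personalised_score(p) > 0]

for post in feed:
    print(post["title"])  # the challenging story never appears
```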
The Misinformation Minefield
One of the nastiest side effects of algorithmic culture is the spread of misinformation. Algorithms prioritise content that grabs attention, even if it's not true. Remember that viral video by a YouTuber showing how easily fake celebrity news can be created and spread?[3] (In case you're wondering, the main character was Harry Styles; go check the video out!) It went viral for all the wrong reasons, highlighting how even reputable news platforms can get duped. They, too, fall into the trap of prioritising clicks and views over accuracy.
This points to the profit-driven motives of tech companies. They control these algorithms to shape cultural narratives to their advantage, all while keeping their inner workings secret.[4] This lack of transparency makes it hard to hold them accountable, creating a commercial black box that shields them from scrutiny.[5]
Moving Forward: The Path to Digital Literacy and the Online Safety Act 2021
So, what can we do about it? Promoting digital literacy is a start. We need to think critically about the content we consume. Be aware of the biases and values these algorithms might be embedding in what you see and hear.[6]
Enter the Online Safety Act 2021,[7] a game-changer in addressing the toxic traits of media platforms. This legislation holds tech companies accountable for the content they host, focusing on user safety and reducing harm. It pushes for transparency in how algorithms work and requires better content moderation. With these rules, platforms must take responsibility for curbing misinformation and breaking the cycle of echo chambers. By enforcing these standards, the Act helps ensure that our digital world prioritises truth and user wellbeing over mere clicks and profits.
In the end, while algorithms can enhance our online experiences, we must be vigilant about their potential to distort reality and manipulate our perceptions. By fostering digital literacy and demanding greater transparency, we can strive for a digital environment that values truth and fairness over clicks and profits.
[1] Metzler, H., & Garcia, D. (2023). Social Drivers and Algorithmic Mechanisms on Digital Media. Perspectives on Psychological Science. https://doi.org/10.1177/17456916231185057
[2] Talamanca, G., & Arfini, S. (2022). Through the Newsfeed Glass: Rethinking Filter Bubbles and Echo Chambers. Philosophy & Technology.
[3] GeorgeMason TV. (2022). How I tricked the internet into thinking I was Harry Styles... https://www.youtube.com/watch?v=NhYCKep-yas&t=379s
[4] Kopelman, S., & Frosh, P. (2023). The “algorithmic as if”: Computational resurrection and the animation of the dead in Deep Nostalgia. New Media & Society. https://doi.org/10.1177/14614448231210268
[5] Hristova, S., et al. (Eds.). (2022). Algorithmic Culture: How Big Data and Artificial Intelligence are Transforming Everyday Life. Lanham: Lexington Books.
[6] Tsamados, A., Aggarwal, N., Cowls, J., et al. (2021). The ethics of algorithms: Key problems and solutions. AI & Society. https://doi.org/10.1007/s00146-021-01154-8
[7] Online Safety Act 2021 (Cth).