
Illustration by Isabel Ríos

ToxicTok: How TikTok Is Radicalizing Its Users

The combination of TikTok’s algorithm and content is becoming dangerous to its users, leading to their radicalization and contributing to the decline in their mental health and stability.

Every day for the past six months has felt concerningly similar — wake up, have a cup of coffee, switch between the same three apps and go to sleep. The number of movies and TV shows I watched during quarantine is embarrassing, but even more embarrassing is the number of hours I’ve spent on TikTok.
TikTok was the most downloaded app worldwide as of January 2020. Hosting short mobile videos since 2016, the platform describes its mission as to “inspire creativity and bring joy.” During the beginning of quarantine, TikTok’s downloads increased by 27 percent, and I was one of the people who came to the app during this wave. One study found that users spend around 52 minutes a day on the app.
At first glance, TikTok seems simple: you swipe up to watch a video, you’re entertained for a few seconds, you give the creator a heart and then swipe to the next video. And the cycle repeats itself. This might seem pretty boring, and at first it is, until the algorithm learns enough about you to curate your content. Every swipe, heart, comment and every other interaction you have on the platform is recorded by the algorithm. This, in turn, determines what the app shows you on your “For You” page, or FYP, ensuring that it is constantly tailored to your interests. Anyone who has been on TikTok can tell you that the algorithm becomes uncannily accurate after a couple of minutes, and within hours of use your FYP becomes completely personalized.
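For readers curious about the mechanics, here is a minimal sketch of how such an engagement-driven feed could work, assuming a simple topic-score model. Every weight, topic tag and function name below is an illustrative assumption; TikTok’s actual recommender is proprietary and far more sophisticated.

```python
from collections import defaultdict

# A toy engagement-driven feed. Weights, topics and function names are
# illustrative assumptions, not TikTok's actual (proprietary) system.

ENGAGEMENT_WEIGHTS = {
    "watched_full": 3.0,     # finishing a video is a strong positive signal
    "liked": 2.0,
    "commented": 2.5,
    "shared": 3.0,
    "skipped": -1.0,         # swiping away quickly counts against a topic
    "not_interested": -5.0,  # explicit negative feedback cuts hardest
}

def update_profile(profile, video_topics, event):
    """Nudge the user's interest score for every topic on the video."""
    weight = ENGAGEMENT_WEIGHTS.get(event, 0.0)
    for topic in video_topics:
        profile[topic] += weight

def rank_feed(profile, candidates):
    """Order candidate videos by how well their topics match the profile."""
    def score(video):
        return sum(profile[topic] for topic in video["topics"])
    return sorted(candidates, key=score, reverse=True)

profile = defaultdict(float)
update_profile(profile, ["cooking", "comedy"], "watched_full")
update_profile(profile, ["politics"], "not_interested")

candidates = [
    {"id": 1, "topics": ["cooking"]},
    {"id": 2, "topics": ["politics"]},
]
print([v["id"] for v in rank_feed(profile, candidates)])  # -> [1, 2]
```

Even this toy version shows the feedback loop: each interaction tilts the profile, and the tilted profile decides what you see next.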
The personalization of the FYP can easily trap someone inside their own belief system. Anyone can fall prey to having their reality distorted to epic proportions, especially when exposed to a constant stream of information determined directly by their beliefs and interests. And TikTok is a supercharged example of this. The loop is rather perfect: it relies on people’s confirmation bias and solidifies their belief system. While Facebook, YouTube and other platforms also create some degree of distortion, TikTok surpasses them by eliminating a key source of friction: decision fatigue. On other social media platforms, one has a choice, or at least the illusion of a choice, to select the media one wants to consume. TikTok, on the other hand, automatically selects the videos it thinks you might be interested in. Thus, you are not fatigued by constantly choosing your content, allowing you to spend more time scrolling. Aiding the platform in this regard is the fact that the algorithm also records when you are not interested in a certain topic, which enables the app to make your FYP even more specific.
A recent study conducted by neurologists found that binge-watching TikTok videos activates parts of the brain associated with dopamine release and motivation, an effect similar to that induced by illicit drugs. TikTok’s 15-second to one-minute videos are incredibly stimulating: they are packed with stimuli that shock, inspire and make you laugh or cry. This creates a vicious cycle in which the user keeps chasing the stimulation, leading to eventual addiction.
What’s so bad about this? So what if my brain is addicted to TikTok? The videos are entertaining, creative and, in many instances, useful for spreading otherwise neglected information. There are certainly more dangerous things to be addicted to, and TikTok has proven useful in connecting people to one another during the pandemic. It has also shed light on important social issues such as the Black Lives Matter movement and other injustices happening all over the world. This is all true, but it’s not all there is to TikTok.
TikTok’s content is not all sunshine, puppies and positivity. The app’s dark corners are filled with far-right extremists, white supremacists and pro-anorexia accounts waiting to be discovered by impressionable young people in search of entertainment, meaning and a sense of community. Hate speech on TikTok flew under the radar until Motherboard reported that messages such as “kill all Jews” and “kill all n******” were being streamed to TikTok’s user base, 41 percent of which is 16 to 24 years old. Although TikTok’s guidelines prohibit hate speech, hashtags such as “#whitegenocide,” a conspiracy theory popular among white supremacists, have received over 60,000 views across 15 videos, while “#groyper,” which refers to a network of white nationalists and far-right extremists, has 114,000 views across 34 videos.
Another problem TikTok faces is the acute romanticization of mental illness. The app has been known to fight the stigma around mental illness, with many creators being vocal about their struggles. However, extremely triggering accounts and videos are popular as well, ranging from teenage creators documenting 800-calorie-a-day diets to videos offering tips and tricks on how to hide self-harm from the people around you.
There is a difference between finding a support system for your mental health and consuming videos that are obviously triggering and do the opposite. Yet TikTok, like many other platforms, still blurs the line between support and harm. You might say: so what? The internet can be a dark place, and every media platform has these same problems. A quick Google search can yield countless forums and sites that propagate these exact same things. So what makes TikTok so different?
As stated above, TikTok’s algorithm is both powerful and addictive. It builds a profile of you based not only on your likes and dislikes but also on your biases and judgments. For example, it can infer that a 13-year-old leans toward right-wing content because they liked a TikTok of Ben Shapiro reading the lyrics to “WAP.” Moreover, if you link accounts such as Google, Facebook or Instagram to your TikTok account, TikTok may collect information from them to further curate your experience. For people struggling with certain insecurities and searching online about them, the algorithm might capitalize on that, too, and serve potentially harmful content. Like other social media platforms that follow the same revenue model, TikTok stores and uses this information to keep you consuming as much content as possible. The more you watch, the more money they make, the more addicted you become and the more toxic it can be.
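As a rough illustration of how off-platform signals could sharpen such a profile, consider the sketch below. The signal sources, weights and function names are hypothetical; TikTok’s real data pipeline is not public.

```python
# Hypothetical sketch of folding off-platform signals (linked accounts,
# inferred search interests) into the same interest profile used for
# ranking. The 0.5 discount and the topic names are assumptions made
# for illustration only.

OFF_PLATFORM_WEIGHT = 0.5  # indirect signals assumed to count half

def merge_external_signals(profile, external_topics):
    """Blend topics inferred from linked accounts or searches into the profile."""
    for topic, strength in external_topics.items():
        profile[topic] = profile.get(topic, 0.0) + OFF_PLATFORM_WEIGHT * strength
    return profile

profile = {"comedy": 4.0}
# e.g. repeated dieting-related searches inferred from a linked account
profile = merge_external_signals(profile, {"extreme_dieting": 6.0})
print(profile)  # {'comedy': 4.0, 'extreme_dieting': 3.0}
```

Once a sensitive topic enters the profile this way, the same ranking loop sketched earlier will start surfacing matching videos, whether or not the user ever asked for them.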
Once you get into this loop, it is difficult to get out. You keep watching because you’re interested and because it reinforces your belief system and your worldview. You could say this falls under personal responsibility; it’s your own fault that you developed an addiction and are now a raging Nazi, for example. But the situation is not that simple. Toxic TikToks might be easy to spot from the outside, but the people who consume that content daily become desensitized to it and are unlikely to report it. The people with the most access to these dark corners of TikTok are the same people who want to, or are interested in, consuming that content, so relying on users to report it would yield minimal results. Nor is this toxic content limited to them: anyone with a similar search history and interests can fall prey to the algorithm. This is the real danger behind TikTok.
The addiction that TikTok causes, the fact that the algorithm knows more about you than you know about yourself and the toxic content one might encounter on the platform culminate in the radicalization and destabilization of innumerable young people. In the age of Covid-19, a time when we depend on the internet to satisfy our social needs, this is something we need to pay attention to. The content we consume and what we are exposed to inevitably lead us to form our own conclusions about how the world works. If we consume only one kind of content, on repeat, and click “not interested” on things we don’t want to hear or see, this pattern of thinking will ultimately shape how we perceive real life.
TikTok has a responsibility to address this problem. Unfortunately, this is easier said than done, especially since the system as it stands is so profitable. Hence we, the users, need to recognize the power of the algorithm and how it can exploit our own gullibility. The content and media we consume are, at the end of the day, simply content and media made by people like us. Since our feed reflects just one perspective, our own, we shouldn’t base our entire sense of self and reality on it.
Andrijana Pejchinovska is an Opinion Editor. Email her at feedback@thegazelle.org.