However, the problems associated with TikTok’s powerful algorithm may be far more severe than those surrounding data security.
Study: Harmful Content Algorithmically Pushed
In the study (pdf), the CCDH established accounts posing as 13-year-old teenagers in the United States, United Kingdom, Canada, and Australia. One account in each nation was assigned a traditional female name. A second account in each country was given a username containing the characters “loseweight” in addition to the name. Researchers chose those characters after finding that people with issues such as body dysmorphia often signal their condition through their usernames.
The team then examined the first 30 minutes of content recommended by TikTok in these accounts’ “For You” feeds. When videos with potentially dangerous content about disordered eating, self-harm, or mental health issues appeared, the researchers paused and liked them, as a typical teenager might.
On average, the accounts were served videos related to body image and mental health every 39 seconds. Content referencing suicide was shown to one account within two and a half minutes. One account was shown eating disorder content within eight minutes.
Lingering Harms
After the researchers published their study, some videos they had flagged appeared to have been taken down from TikTok, but many of the accounts that posted the material remained and retained other similar content, according to the Wall Street Journal (WSJ). Users can filter out videos containing words or hashtags they don’t want to see, but such content can still slip through.
According to the WSJ, some users have developed creative ways to skirt TikTok’s content filters, such as using the sound-alike “sewerslide” when referencing suicide or simply writing “attempt,” leaving the rest to the viewer’s imagination.
Imran Ahmed, CEO of the CCDH, pointed out that TikTok was designed to influence young users into giving up their time and attention. The research showed that the app is “poisoning their minds” as well.
“It promotes to children hatred of their own bodies and extreme suggestions of self-harm and disordered, potentially deadly, attitudes to food,” he said, according to the report.
Abnormal Physical and Mental Changes
The ubiquity of TikTok has been accompanied by a host of deleterious effects, chief among them privacy and mental health concerns. During the COVID-19 pandemic, adolescents spent a tremendous amount of time at home on electronic devices. Subsequently, symptoms resembling tic disorders began to increase among teens.
The report also cited medical journal articles and statistics from doctors and specialists who observed surges of tic-like disorders linked to TikTok usage.
Those videos encouraged viewers to self-evaluate, and many recognized themselves in the disorders, became convinced they had them, and grew distressed and upset as a result.
The report said TikTok videos containing the hashtag #borderlinepersonalitydisorder had been viewed hundreds of millions of times.
Growing Adverse Effects on Youth
Social media is driving children and young adults to have a low sense of self-worth and to be dissatisfied with their appearance, according to a study published by London-based mental health charity stem4 on Jan. 3. The study, which surveyed 1,024 children and young adults aged 12 to 21 in the UK, found that 97 percent of them were on social media, spending an average of 3.65 hours a day on smartphone apps such as TikTok, Snapchat, Instagram, YouTube, and WhatsApp.
It found that 77 percent of the respondents were unhappy about how they looked, with some saying that they were “embarrassed” by their bodies.
Nearly half of those surveyed said they had received negative and hateful comments about their appearance. As a result, 24 percent became withdrawn, 22 percent began to exercise excessively, 18 percent stopped socializing, 18 percent drastically restricted their food intake, and 13 percent self-harmed.
The survey also found that 42 percent of the respondents—51 percent of females and 31 percent of males—said they were in mental health distress.
Japanese electronics engineer Lee Ji-Shin told The Epoch Times on May 17 that parents should keep their children away from TikTok and guide them to pursue something meaningful.
“TikTok’s algorithm is traffic-focused, its primary purpose is to make money, and it does not consider whether the content pushed will damage or affect the immature minds of teenagers, making them anxious or impulsive.”