TikTok pushes potentially harmful content to users as often as every 39 seconds, study says


TikTok recommends self-harm and eating disorder content to some users within minutes of joining the platform, according to a new report published Wednesday by the Center for Countering Digital Hate (CCDH).

For the study, researchers set up TikTok accounts posing as 13-year-old users interested in content about body image and mental health. It found that within as few as 2.6 minutes of joining the app, TikTok's algorithm recommended suicide-related content. Eating disorder content was recommended within as few as 8 minutes.

Over the course of this study, researchers found 56 TikTok hashtags hosting eating disorder videos with over 13.2 billion views.

"The new report by the Center for Countering Digital Hate underscores why it is way past time for TikTok to take steps to address the platform's dangerous algorithmic amplification," said James P. Steyer, Founder and CEO of Common Sense Media, which is unaffiliated with the study. "TikTok's algorithm is bombarding teens with harmful content that promote suicide, eating disorders, and body image issues that is fueling the teens' mental health crisis."

TikTok, launched globally by the Chinese company ByteDance in 2017, operates through algorithms informed by a user's personal data, including likes, follows, watch time, and interests. It has become the world's fastest-growing social media app, reaching one billion monthly active users by 2021.

The CCDH report details how TikTok's algorithms refine the videos shown to users as the app gathers more information about their preferences and interests. The algorithmic suggestions on the "For You" feed are designed, as the app puts it, to be "central to the TikTok experience." But new research shows that the video platform can push harmful content to vulnerable users as it seeks to keep them interested.

To test the algorithm, CCDH researchers registered as users in the United States, United Kingdom, Canada, and Australia, creating both "standard" and "vulnerable" accounts on TikTok. A total of eight accounts were created, and data was gathered from each account for the first 30 minutes of use. The CCDH says the short recording window was chosen to show how quickly the video platform can profile each user and push out potentially harmful content.

In the report, each researcher, posing as a 13-year-old (the minimum age TikTok allows to sign up for its service), made two accounts in their designated country. One account was given a female username. The other was given a username indicating a concern about body image; the name included the phrase "loseweight." Across all accounts, the researchers paused briefly on videos about body image and mental health and "liked" those videos, as teens interested in that content might.

When the "loseweight" account was compared with the standard, the researchers found that "loseweight" accounts were served three times more overall harmful content, and 12 times more self-harm and suicide specific videos than the standard accounts.

"TikTok is able to recognize user vulnerability and seeks to exploit it," said Imran Ahmed CEO of CCDH, who is in Washington D.C., advocating for the Kids Online Safety Act (KOSA), which would put guardrails in place to protect minors online. "It's part of what makes TikTok's algorithms so insidious; the app is constantly testing the psychology of our children and adapting to keep them online."

Content pushed to the vulnerable accounts included a video with the caption: "Making everyone think your [sic] fine so that you can attempt in private".

The video insinuating a suicide attempt amassed 386,900 likes. The report also featured a video of a teen girl crying, with words on the screen reading: "You're not thinking of taking your life right?" followed by a reference to Sarah Lynn, a character in the Netflix animated series "BoJack Horseman" who dies of an overdose. That video received 327,900 likes. Another video, with 17,300 likes, linked to PrettyScale.com, a website where users upload pictures of their body and face to have their attractiveness rated by a "mathematical formula."

Reached for comment, a TikTok spokesperson challenged the methodology of the study.

"We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need," the representative said.

The TikTok spokesperson went on to say the video platform was "mindful that triggering content is unique to each individual" and that the social platform "remain[s] focused on fostering a safe and comfortable space for everyone."

As 60 Minutes reported Sunday, this study comes as more than 1,200 families pursue lawsuits against social media companies including TikTok. The suits allege that content on social media platforms profoundly impacted the mental health of their children and, in some cases, contributed to their deaths. More than 150 of the lawsuits are expected to move forward next year.

If you or someone you know is in emotional distress or suicidal crisis, call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255).

For more information about mental health care resources and support, The National Alliance on Mental Illness (NAMI) HelpLine can be reached Monday through Friday, 10 a.m.–6 p.m. ET, at 1-800-950-NAMI (6264) or email info@nami.org.

