The ‘deeply disturbing’ TikTok videos Australian teens see every 39 seconds

WARNING: disturbing content

A new study has found that Australian teens using TikTok are shown suicide-related content within minutes of logging into the app.

The study, conducted by the US-headquartered Center for Countering Digital Hate (CCDH), involved researchers creating two new TikTok accounts for 13-year-old users in each of the US, UK, Australia and Canada.


One account was “standard” and the other “vulnerable”, meaning its username contained the phrase “lose weight”.

This was to test the theory that users who search for content related to eating disorders often have usernames that contain related words.

For each account, researchers recorded the first 30 minutes of use, liking any videos related to body image, mental health or eating disorders.

CCDH Executive Director Imran Ahmed called the results of the study “deeply disturbing”.

This TikTok video used the hashtag #imnothungry and promoted chewing gum as a weight loss aid. Credit: CCDH

New accounts were recommended videos about self-harm and eating disorders within minutes of opening the app and scrolling through the For You feed.

Suicide-related content was shown within 2.6 minutes, and eating disorder content within eight minutes.

Overall, content related to mental health and body image was recommended every 39 seconds.

The four vulnerable accounts created were shown videos about eating disorders, self-harm and suicide three times more often than standard accounts.

The study mentions the death of 14-year-old Molly Russell, who took her own life in 2017. This year, a coroner in the UK ruled that social media platforms contributed to her death.

Before her death, Molly created a Twitter account with the username “Idfc_nomore”, which was believed to mean “I don’t care anymore”.

In the six months leading up to her death, she liked, shared or saved over 2,000 posts related to suicide or depression.

“The study… showed that Instagram users with an eating disorder would choose usernames with related words such as ‘anorexia’,” the report says.

“These users are more vulnerable to eating disorder content on social media because they are vulnerable to eating disorders and are actively looking for harmful content.”

The study found that content shown to vulnerable accounts was also more “extreme”.

Four vulnerable accounts created by CCDH were shown videos related to eating disorders, self-harm and suicide at three times the rate of standard accounts. Credit: CCDH

The videos featured other users talking about their plans for self-harm, as well as self-harm techniques.

“The results are every parent’s nightmare,” Ahmed said.

“Young people’s feeds are bombarded with harmful, heartbreaking content that can have a significant cumulative impact on their understanding of the world around them, as well as on their physical and mental health.”

As a result of the study, CCDH has put forward several recommendations for large technology companies.

The report says social media companies like TikTok lack transparency about an algorithm that “delivers toxic and harmful content to young and vulnerable users.”

CCDH recommended that TikTok take a proactive approach, implementing “built-in security features” such as responsive reporting systems.

“Introducing a requirement to implement a security framework will require companies like TikTok to conduct risk assessments of their products, policies and processes,” the study says.

“Our approach will mean that any changes to a product or service are carefully monitored and risks are appropriately managed, putting public safety first, not profit.”

With two-thirds of American teenagers using TikTok, Ahmed said, the app has revealed “a generational gap in usage and understanding.”

“Our results show not entertainment and safety, but a toxic environment for TikTok’s youngest users, which is exacerbated for the most vulnerable,” he said.

“Without oversight, TikTok’s opaque algorithm will continue to profit by serving its users (children as young as 13, remember) increasingly intense and distressing content without checks, resources or support.”

If you need help in a crisis, call Lifeline on 13 11 14. For more information about depression, contact BeyondBlue on 1300 22 4636 or speak with your GP, local health care professional, or someone you trust.
