Toronto, Ontario (CTV Network) — Social media platform TikTok has both positive and negative impacts on users’ mental health, new research has shown.
A study by the University of Minnesota found that the platform provided a sense of community and self-discovery but also “repeatedly” exposed users to harmful content.
The team of researchers set out to understand why mental health content thrives on TikTok and how its content can impact people struggling with their mental health.
The first-of-its-kind study determined that TikTok, with its unique algorithm, served as both a “haven” and a “hindrance” for users.
Researchers conducted 16 interviews with people aged 16 to 54 who engaged with mental health content on the platform. Interviews lasted between 60 and 90 minutes, and participants were recruited through social media.
“They think of it as the dance platform or the place where everybody gets an ADHD diagnosis,” Stevie Chancellor, senior author of the paper and an assistant professor at the University of Minnesota’s Department of Computer Science and Engineering, said in a press release. “But, people should also be mindful of its algorithm, how it works, and when the system is providing them things that are harmful to their wellbeing.”
TikTok is driven by a recommender system that populates a “For You” page rather than showing users posts from accounts they follow, a press release from the researchers says.
While this personalized approach helps some users avoid feelings of loneliness, it can also lead others down a “rabbit hole,” researchers said.
“TikTok is a huge platform for mental health content,” Ashlee Milton, first author of the paper and a University of Minnesota computer science and engineering Ph.D. student, said in a press release. “A lot of our participants talked about how helpful this mental health information was. But, at some point, because of the way the feed works, it’s just going to keep giving you more and more of the same content. And that’s when it can go from being helpful to being distressing and triggering.”
Despite TikTok’s “not interested” button, participants said their feeds still recommended negative mental health content, leading some to take breaks from the platform or quit it altogether because of the distressing videos they were being served.
Some participants said they had difficulty gauging TikTok creators’ intent, unsure whether a creator was posting to raise mental health awareness or to chase followers and likes.
“One of our participants jokingly referred to the For You page as ‘a dopamine slot machine,’” Milton said. “They talked about how they would keep scrolling just so that they could get to a good post because they didn’t want to end on a bad post.”
Researchers said that although negative experiences do occur on the platform, it is useful for people to understand why particular videos are recommended to them.
Although TikTok has never publicly revealed the inner workings of its algorithm, experts say content that gets the most engagement, and that users have previously shown interest in, is likely to appear in a person’s feed.
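To make that description concrete, here is a minimal, purely illustrative sketch of an engagement-plus-interest ranker. It is not TikTok’s actual algorithm, which has never been disclosed; the field names, the 50/50 weighting, and the sample data are all assumptions invented for illustration. The point is only to show how repeatedly engaging with one topic can cause a feed to serve ever more of it, the “rabbit hole” the researchers describe.

```python
# Illustrative sketch only -- NOT TikTok's real recommender.
# Ranks candidate videos by blending overall engagement with a
# user's past affinity for each video's topic.

def score(video, user_interests):
    """Combine a video's engagement rate with the user's topic affinity."""
    engagement = (video["likes"] + video["shares"]) / video["views"]
    affinity = user_interests.get(video["topic"], 0.0)  # 0.0 = no history
    return 0.5 * engagement + 0.5 * affinity  # hypothetical equal weighting

def build_feed(videos, user_interests, n=3):
    """Return the top-n videos for this user, highest score first."""
    return sorted(videos, key=lambda v: score(v, user_interests),
                  reverse=True)[:n]

videos = [
    {"topic": "dance", "likes": 900, "shares": 50, "views": 10_000},
    {"topic": "mental_health", "likes": 400, "shares": 80, "views": 5_000},
    {"topic": "cooking", "likes": 100, "shares": 5, "views": 8_000},
]

# A user who has engaged heavily with mental health content keeps
# getting more of it, even when raw engagement on other videos is similar.
user_interests = {"mental_health": 0.9, "dance": 0.1}
feed = build_feed(videos, user_interests)
```

In this toy example, the mental health video outranks the dance video despite comparable raw engagement, because the user’s accumulated interest dominates the score, which is the feedback loop participants found both helpful and, eventually, distressing.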
“Ashlee and I are interested in how platforms may promote harmful behaviours to a person so that eventually, we can design strategies to mitigate those bad outcomes,” Chancellor said.