TikTok launches mental health guide for community's well-being


TikTok, the popular video-sharing social networking service, announced a handful of new features on Tuesday designed to support users’ mental well-being, including guides on how to engage with people who may be struggling and updated warning labels for sensitive content.

The changes come as Facebook’s research into its photo-sharing app Instagram, which last year launched TikTok competitor Reels, has reportedly raised concerns about Instagram’s impact on the mental health of teens.

“While we don’t allow content that promotes, glorifies or normalizes suicide, self-harm or eating disorders, we do support people who choose to share their experiences to raise awareness, help others who might be struggling and find support among our community,” TikTok said in a blog post.

To more safely support these conversations and connections, TikTok is rolling out new well-being guides to help people sharing their personal experiences on the video app.

The guides were developed along with the International Association for Suicide Prevention, Crisis Text Line, Live for Tomorrow, Samaritans of Singapore and Samaritans (UK), and they’re available on TikTok’s Safety Center.


The social video app is also sharing a new Safety Center guide for teens, educators and caregivers about eating disorders. The guide was developed along with experts like the National Eating Disorders Association, National Eating Disorder Information Centre, Butterfly Foundation and Bodywhys, and offers information, support and advice. Earlier this year, TikTok added a feature that directs users searching for terms related to eating disorders to appropriate resources.

In addition, when someone searches for words or phrases like #suicide, they’re directed to local support resources, such as the Crisis Text Line helpline, where they can find information on treatment options and support.

TikTok also said it’s updating its warning label for sensitive content, so that when a user searches for terms that could surface distressing content, such as “scary makeup,” the search results page will show an opt-in viewing screen. Users can tap “Show results” to view the content.

The site is also showcasing content from creators sharing their personal experiences with mental well-being, information on where to get help and advice on how to talk to loved ones.

“These videos will appear in search results for certain terms related to suicide or self-harm, with our community able to opt in to view should they wish to,” TikTok said.