Thursday, 19 September 2024

Social media giants unite to combat harmful suicide and self-harm content

Meta, Snap, and TikTok unite in the Thrive initiative to stop the spread of self-harm and suicide content through shared alerts.


In a significant effort to tackle the spread of harmful online content, Meta, Snap, and TikTok have joined forces to launch a new initiative called Thrive. The goal of this program is to help prevent the circulation of graphic material that promotes or encourages self-harm and suicide. Thrive will enable these major platforms to work together by sharing “signals” that alert each other when such content appears, ensuring a united front against its spread.

What is Thrive?

Thrive is the result of a partnership with the Mental Health Coalition, an organisation committed to removing the stigma surrounding mental health discussions. Meta plays a leading role in providing the technical backbone of Thrive, allowing these signals to be shared securely between the participating companies.

This new system is built on the same technology that powers Meta's Lantern program, which is designed to combat child abuse online by allowing platforms to share cross-platform signals securely. Using hashed data, a unique code created from the violating content, these platforms can flag inappropriate material and warn others to take action, creating a streamlined response to prevent harmful content from spreading across multiple platforms.
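The hashing approach described above can be illustrated with a minimal sketch. This is not Thrive's actual implementation (production systems typically use perceptual hashes such as PDQ rather than plain cryptographic hashes, so that slightly altered copies still match); the function names and the shared signal set here are hypothetical, shown only to convey how a fingerprint can be exchanged without sharing the content itself.

```python
import hashlib

# Hypothetical cross-platform signal set, populated when one platform
# flags a piece of violating content.
shared_signals: set[str] = set()

def content_hash(data: bytes) -> str:
    # A fixed-length fingerprint of the content. The original bytes
    # cannot be recovered from the hash, so the fingerprint can be
    # shared between companies without sharing the media itself.
    return hashlib.sha256(data).hexdigest()

def flag_content(data: bytes) -> None:
    # Platform A detects violating content and publishes its signal.
    shared_signals.add(content_hash(data))

def is_flagged(data: bytes) -> bool:
    # Platform B checks an incoming upload against the shared signals.
    return content_hash(data) in shared_signals

flag_content(b"example violating media bytes")
print(is_flagged(b"example violating media bytes"))  # True: exact copy matches
print(is_flagged(b"unrelated content"))              # False: no signal on file
```

Note that an exact-match hash like SHA-256 fails if even one byte changes, which is why real content-moderation pipelines favour perceptual hashing for images and video.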

A step toward safer online spaces

Meta has already taken significant steps to make content related to suicide and self-harm harder to find on its platform. However, the company is careful to maintain a space where users can share their personal stories about mental health, suicide, and self-harm as long as these stories don't cross the line into promotion or provide graphic descriptions. This balance allows for open mental health discussion without encouraging harmful behaviour.

The Thrive initiative aims to strengthen these efforts further by ensuring that when harmful content appears on one platform, the others can be alerted immediately, making it harder for the material to reach a wider audience.

The numbers speak volumes

Meta's data reveal the sheer volume of content that needs to be moderated. Each quarter, the platform takes action on millions of pieces of content related to suicide and self-harm. In the last quarter alone, around 25,000 posts were restored after users appealed Meta's decision to remove them. This shows the complexity of managing this type of content, as not all posts violate policies, and some can be important in mental health discussions.

As social media plays a significant role in people's lives, particularly for younger audiences, ensuring these platforms remain safe is crucial. Thrive represents a new level of cooperation between companies, highlighting the importance of a collective approach to tackling such serious issues. By acting quickly and sharing important signals, these companies reduce the chances of vulnerable users encountering harmful content.





Emma Job
Emma is a news editor at Tech Edition. With a decade's experience in content writing, she revels in both crafting and immersing herself in narratives. From tracking down viral trends to delving into the most recent news stories, her goal is to deliver insightful and timely content to her readers.
