In a significant effort to tackle the spread of harmful online content, Meta, Snap, and TikTok have joined forces to launch a new initiative called Thrive. The program aims to help prevent the circulation of graphic material that promotes or encourages self-harm and suicide. Thrive will enable these major social media platforms to work together by sharing “signals” that alert one another when such content appears, presenting a united front against its spread.
What is Thrive?
Thrive is the result of a partnership with the Mental Health Coalition, an organisation committed to removing the stigma around mental health discussions. Meta provides the technical backbone of Thrive, allowing signals to be shared securely between the participating companies.
The system is built on the same technology Meta provides for the Tech Coalition’s Lantern program, which combats online child exploitation by letting platforms share cross-platform signals securely. Each signal is a hash, a unique code generated from the violating content, so the platforms can flag inappropriate material and warn the others to take action without passing the content itself around, creating a streamlined response that prevents harmful material from spreading across multiple platforms.
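To make the hash-sharing idea concrete, below is a minimal Python sketch of how such a signal exchange could work. Everything here is an assumption for illustration: the function names, the use of SHA-256, and the in-memory signal store are stand-ins, since Thrive’s actual implementation is not public.

```python
import hashlib

# In-memory stand-in for the shared signal database. In the real
# program this would be secure cross-company infrastructure; this
# set is purely illustrative.
shared_signals: set[str] = set()


def content_signal(media_bytes: bytes) -> str:
    """Derive a hash "signal" from a piece of content.

    SHA-256 is an illustrative choice; the hashing scheme Thrive
    actually uses is not publicly documented.
    """
    return hashlib.sha256(media_bytes).hexdigest()


def flag_content(media_bytes: bytes) -> None:
    """A platform that finds violating content publishes its hash."""
    shared_signals.add(content_signal(media_bytes))


def is_known_violation(media_bytes: bytes) -> bool:
    """Another platform checks an upload against the shared hashes.

    Only hashes are compared, so the flagged content itself never
    has to be transmitted between platforms.
    """
    return content_signal(media_bytes) in shared_signals


# One platform flags an item; another recognises an identical re-upload.
flag_content(b"example of violating media")
print(is_known_violation(b"example of violating media"))  # True
print(is_known_violation(b"an unrelated post"))           # False
```

Note that an exact cryptographic hash such as SHA-256 only matches byte-identical copies; industry systems of this kind typically rely on perceptual hashing instead, so that resized or lightly edited versions of an image still produce a match.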
A step toward safer online spaces
Meta has already taken significant steps to make content related to suicide and self-harm harder to find on its platforms. However, the company is careful to maintain a space where users can share their personal stories about mental health, suicide, and self-harm, as long as those stories don’t cross the line into promotion or graphic description. This balance allows for open mental health discussion without encouraging harmful behaviour.
The Thrive initiative aims to strengthen these efforts further by ensuring that when harmful content appears on one platform, the others can be alerted immediately, making it harder for the material to reach a wider audience.
The numbers speak volumes
Meta’s data reveal the sheer volume of content that needs moderating: each quarter, the company takes action on millions of pieces of content related to suicide and self-harm. In the last quarter alone, around 25,000 posts were restored after users appealed Meta’s decision to remove them. These figures show how complex this content is to manage: not every removed post actually violates policy, and some play an important part in mental health discussions.
As social media plays a significant role in people’s lives, particularly for younger audiences, ensuring these platforms remain safe is crucial. Thrive represents a new level of cooperation between companies, highlighting the importance of a collective approach to tackling such serious issues. By acting quickly and sharing important signals, these companies reduce the chances of vulnerable users encountering harmful content.