Meta’s Head of Instagram, Adam Mosseri, has recently acknowledged several mistakes in the moderation of Threads, the platform linked to Instagram. Users, myself included, have faced unexpected account deletions and post removals without explanation. In my case, Meta mistakenly deleted my account this week after wrongly concluding I was underage, while a colleague’s account was locked after she jokingly mentioned dying in a heatwave. Many others have reported posts disappearing for no clear reason, causing frustration and confusion.
Over the past week, “Threads Moderation Failures” has been trending online, reflecting mounting pressure on Meta to address these ongoing issues with its moderation processes. In response, Mosseri admitted that problems exist and shared an update on Threads acknowledging the situation.
Tool failure behind moderation issues
According to Mosseri, a malfunctioning internal “tool” contributed to the moderation failures. This tool, used by Meta’s human reviewers, reportedly failed to provide enough context for them to assess flagged content properly. While many assume Meta relies heavily on artificial intelligence to handle moderation automatically, Mosseri clarified that human reviewers are still responsible for these decisions: the AI flags potential violations for further review, but the ultimate decision-making power lies with human moderators.
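To make that flag-then-review flow concrete, here is a minimal, hypothetical Python sketch: an automated classifier queues content for a human reviewer, and the fairness of the final call depends on the context the tooling surfaces. All names (`FlaggedItem`, `build_review_case`, `human_review`) are illustrative assumptions, not Meta’s actual system.

```python
# Hypothetical sketch of an AI-flags / human-decides moderation pipeline.
# Not Meta's real system; names and logic are illustrative only.

from dataclasses import dataclass, field


@dataclass
class FlaggedItem:
    """Content the automated classifier flagged for human review."""
    content_id: str
    text: str
    flag_reason: str  # e.g. "possible self-harm", "possible underage user"
    context: list[str] = field(default_factory=list)  # surrounding posts, account signals


def build_review_case(item: FlaggedItem) -> dict:
    """Assemble what the reviewer sees. The reported bug amounted to a step
    like this surfacing too little context for a fair decision."""
    return {
        "content": item.text,
        "reason": item.flag_reason,
        # Without context, a joke like "dying in this heatwave"
        # can look like a genuine policy violation.
        "context": item.context or ["<no context available>"],
    }


def human_review(case: dict) -> str:
    """Stand-in for the human decision; the final call rests with the
    reviewer, not the classifier."""
    if case["context"] == ["<no context available>"]:
        return "escalate"  # not enough information to decide fairly
    return "keep" if "joke" in " ".join(case["context"]).lower() else "remove"


if __name__ == "__main__":
    flagged = FlaggedItem(
        content_id="t3-123",
        text="This heatwave is killing me",
        flag_reason="possible self-harm",
        context=["Earlier post: 'loving the beach today'", "Tone: joke"],
    )
    print(human_review(build_review_case(flagged)))  # -> "keep"
```

The point of the sketch is that the classifier only queues work; if `build_review_case` drops the context, even a careful human reviewer is set up to make the wrong call, which matches Mosseri’s description of the failure.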
Mosseri stressed the importance of ensuring reviewers have sufficient information to make fair decisions and said Meta is working to resolve these issues. Fixes are already being rolled out to give moderators the context they need to make more informed calls. “We need to do better,” Mosseri admitted, committing Meta to refining its moderation tools and processes.
Restoring accounts but leaving questions unanswered
Meta has yet to provide a clear explanation for why posts and accounts are being deleted or locked. For many affected users, the appeal process has proven to be mentally draining and unnecessarily complex.
Many users continue to call for greater transparency and accountability from Meta in how it handles content moderation, particularly on platforms like Threads, where they expect a more seamless experience.
While Meta’s steps toward fixing the broken tool are promising, many hope this is the beginning of a larger conversation around improving moderation and user experience. Only time will tell if these efforts will restore confidence in the platform’s moderation practices.