TikTok and Meta team up on program to discourage harmful content
By Meg Dowell
Content that promotes and encourages self-harm has become increasingly prevalent on social media -- and companies like Meta, TikTok, and Snap are taking steps to stop it.
Thrive, a program created in partnership with the Mental Health Coalition, allows companies to share data about content that violates their policies against material that could endanger viewers. The ultimate goal is to better identify and suppress content that might encourage viewers to hurt themselves. Anything that could lead someone to engage in such behaviors should not be widely accessible, especially to younger viewers. It's not a perfect solution, but with so much content uploaded to these platforms every day, it's an important starting point.
None of this means users on these platforms can't talk about their mental health or share stories about their experiences. In fact, the Mental Health Coalition was founded on the premise of removing the stigma from conversations about mental health. That kind of content is absolutely allowed -- as long as it isn't graphic and doesn't encourage viewers to engage in harmful behaviors.
There is only so much these tech companies can do to protect their users, but they genuinely seem to be trying to discourage content that could lead someone to harm themselves. Algorithms are tricky, and content with no ill intent is often flagged as inappropriate by mistake. Still, this is a step in the right direction, and it will hopefully protect at least some people from danger.
If you or someone you know is considering suicide or is anxious, depressed, upset, or needs to talk, contact the Crisis Text Line by texting HOME to 741741 from anywhere in the US.