Social media platforms struggle to effectively detect and remove content related to suicide and self-harm, according to the Molly Rose Foundation.

The Molly Rose Foundation reports that major social media platforms, including Instagram, Facebook, Pinterest, Snapchat, TikTok, and X (formerly Twitter), are failing to effectively detect and remove content related to suicide and self-harm, responding to such content in ways it describes as inconsistent and inadequate. The foundation, established after its founder's daughter took her own life following exposure to harmful online content, has called on the UK government to strengthen the Online Safety Act and urged tech companies to do more to protect children from harmful content. Meta, Snapchat, and TikTok have responded by stating that they have measures in place to combat harmful content, but they did not directly address the accusation that they are failing to detect and remove it.

August 14, 2024