
Nearly 19% of U.S. teens aged 13–15 saw unwanted sexual content on Instagram, a 2025 Meta survey cited in a federal lawsuit reveals.

A 2025 court filing reveals that nearly 19% of Instagram users aged 13 to 15 in the U.S. reported seeing unwanted nudity or sexual images on the platform, according to a Meta survey cited in a federal lawsuit. The data, based on self-reported experiences, also found that 8% of these teens saw self-harm content or threats of self-harm. The head of Instagram acknowledged the limitations of such surveys and noted that most explicit content is shared via private messages, where privacy protections complicate detection. Meta says it removes nudity and explicit content, including AI-generated material, except for medical and educational uses. The findings fuel ongoing legal and public scrutiny of Meta's role in youth mental health and platform safety.
