


Whistleblowers say TikTok and Meta boosted engagement over safety in 2026, spreading harmful content despite evidence of user harm.

Whistleblowers allege TikTok and Meta prioritized user engagement over safety in 2026, allowing harmful content like hate speech, violence, and extremism to spread, especially on Instagram Reels and TikTok's algorithm-driven feeds. Internal reports show leadership pressured teams to relax content restrictions to compete, despite evidence of increased harm. Users, including teens, reported being radicalized by algorithmic recommendations, while safety teams were under-resourced and ignored. Both companies deny wrongdoing, citing new safety tools, but insiders say financial incentives still outweigh user wellbeing.
