
Whistleblowers allege that TikTok and Meta prioritized user engagement over safety in 2026, allowing harmful content such as hate speech, violence, and extremism to spread, especially on Instagram Reels and TikTok's algorithm-driven feeds.
Internal reports show leadership pressured teams to relax content restrictions to compete, despite evidence of increased harm.
Users, including teenagers, reported being radicalized by algorithmic recommendations, while safety teams were under-resourced and their warnings ignored.
Both companies deny wrongdoing, pointing to new safety tools, but insiders say financial incentives still outweigh user wellbeing.