
Microsoft blocks certain terms on Copilot AI tool after concerns about violent and sexual image generation

Microsoft has blocked certain terms, including "pro-choice", "pro-life", and "four twenty", from its Copilot AI tool after an engineer raised concerns that the system was generating violent and sexual images. The company has also added a warning that multiple policy violations may lead to suspension. Microsoft is continuously adjusting its safety filters and controls to mitigate misuse of the system and strengthen its guardrails.

