Microsoft blocks certain terms on Copilot AI tool after concerns over violent and sexual image generation.
Microsoft has blocked certain terms, including "pro-choice", "pro-life", and "four twenty", from its Copilot AI tool after an engineer raised concerns over the system generating violent and sexual images. The company has also added a warning that multiple policy violations may lead to suspension. Microsoft says it is continuously adjusting its safety filters and controls to curb misuse of the system and strengthen its guardrails.