
Microsoft fixes vulnerability: its AI can no longer generate fake nude photos of celebrities


On Monday U.S. time, Microsoft added more protections to Designer, its artificial intelligence text-to-image generation tool, prohibiting people from using it to create non-consensual fake photos of celebrities. The AI-generated nude photos of Taylor Swift that went viral on X last week reportedly originated in 4chan and Telegram channels, prompting Microsoft to make the change.

A Microsoft spokesperson confirmed the incident and said, "Any violation of our policies, especially repeated attempts to produce this type of content, will result in users not being able to continue using our services. We have a large team working to develop various security systems based on our responsible AI principles, including content filtering, operational monitoring and abuse detection, to reduce the potential for abuse of our systems and create a safer environment for our users."


On Friday, Microsoft CEO Satya Nadella said in an interview, "We have a responsibility to add more guardrails to AI tools to prevent them from producing harmful content." Over the weekend, X began blocking all searches for Taylor Swift.
