Microsoft Tweaks AI Tool After Explicit Taylor Swift Fakes

SWIFT CRACKDOWN

The tool has been used by online groups to generate nude photos of celebrities.

Microsoft has strengthened its safeguards against nonconsensual explicit images generated by its text-to-image tool Designer, after AI-generated nude photos of Taylor Swift went viral last week. 404 Media first reported that users on 4chan and a specific Telegram channel were using Designer to create nonconsensual NSFW images of celebrities and sharing them. According to the report, users were able to skirt some of Designer’s protections against generating pornographic content by slightly misspelling celebrity names or by using suggestive phrasing that stopped short of being explicitly sexual. Those tactics no longer work. In an email to 404 Media, Microsoft committed to fixing its AI tools to prevent harmful use, though it said it could not yet confirm whether the Taylor Swift images were made with Designer.

Read it at 404 Media