Microsoft has added more protections to Designer, its AI text-to-image generation tool, which users had been exploiting to create nonconsensual sexual images of celebrities.

The changes come after AI-generated nude images of American singer-songwriter Taylor Swift went viral on X last week. The images were traced back to 4chan and a Telegram channel where people were using Designer to create AI-generated images of celebrities, reports 404 Media.

“We are investigating these reports and are taking appropriate action to address them,” a Microsoft spokesperson was quoted as saying.

“Our Code of Conduct prohibits the use of our tools for the creation of adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service. We have large teams working on the development of guardrails and other safety systems in line with our responsible AI principles,” it added.

Microsoft stated that its ongoing investigation could not confirm whether the images of Swift on X were created using Designer. However, the company is continuing to strengthen its text prompt filtering and to address misuse of its services, the report mentioned.
