Elon Musk’s X to block Grok from undressing images of real people
Elon Musk’s AI model Grok will no longer be able to edit photos of real people to show them in revealing clothing, after widespread concern over sexualised AI deepfakes in countries including the UK and US.
“We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis.
“This restriction applies to all users, including paid subscribers,” reads an announcement on X, which operates the Grok AI tool.
The change was announced hours after California's top prosecutor said the state was investigating the spread of sexualised AI deepfakes, including images of children, generated by the AI model.
In Wednesday’s statement, California Attorney General Rob Bonta said: “This material, which depicts women and children in nude and sexually explicit situations, has been used to harass people across the internet.”
Malaysia and Indonesia have blocked access to the chatbot over the images, and UK Prime Minister Sir Keir Starmer has warned that X could lose the “right to self regulate” amid the outrage over the AI-generated images.