It was only a matter of time before Grok Imagine slipped. Yesterday, we told you about Grok Imagine, the new creation from xAI, capable of producing short erotic and sexual videos from a simple imported image. Running against the grain of the industry's giants, which keep adding safeguards to prevent inappropriate use of their tools, Elon Musk's tool openly claims to let its users' "imagination" express itself without limits.
Despite the glaring controversy, the tool, reserved for premium users, did come with some safeguards: it was not possible to generate explicit videos from a photo of a celebrity, the company assured. What was bound to happen happened, and just a few hours after Grok Imagine's release, the first nude deepfakes depicting well-known people began to surface.
Taylor Swift, (once again) a collateral victim
Within seconds, reports The Verge, the AI was able to create images of singer Taylor Swift, even though no sexually explicit request had been made. A simple image prompt, "Taylor Swift celebrating Coachella with friends", was enough to launch the animation process in Spicy mode and see the damage. The resulting videos show the singer's avatar dancing topless in front of an AI-generated audience.
The situation is all the more worrying given that Taylor Swift is (unfortunately) no stranger to this kind of abuse. Last year, the proliferation of pornographic deepfakes of the artist pushed the White House to thoroughly review legislation around this type of AI content. This new example deals a further blow to the integrity of celebrities online, and as always, it is women who bear the brunt.
Largely symbolic safeguards
While the Spicy option does not necessarily produce total nudity – sometimes the AI settles for suggestive gestures or partial undressing – it frequently leads to nude scenes, even without an explicit request from the user. Age verification, the only restriction in place, amounts to declaring a year of birth, with no proof required. Despite the recent adoption of laws meant to strengthen the fight against revenge porn and non-consensual deepfakes, the company seems to ignore its own terms of use, which nevertheless prohibit pornographic depictions of identifiable people.
The only (meager) consolation: beyond celebrities, the tool generates photorealistic images of children but, for the moment, refuses to animate them in a sexualized way, even when Spicy mode is selected.
In an increasingly tense climate around AI and its uses, the urgency is all the greater as the service's virality explodes: according to Elon Musk, more than 34 million images have already been generated in just a few days. Easy access and how quickly the tool can be misused form a toxic cocktail for the people targeted. Though she is the most high-profile, Taylor Swift will probably not be the only victim of this tool.