Grok’s ‘Spicy’ Mode Makes NSFW Celebrity Deepfakes of Women (But Not Men)

This week, Elon Musk officially launched Grok Imagine, xAI’s image and video generator for iOS, for people who subscribe to SuperGrok and Premium+ X. The app allows users to create NSFW content with its “Spicy” mode, and The Verge reported on Tuesday that users can easily create topless videos of Taylor Swift without even asking for nudity. But it’s not just Swift who should be concerned about Musk’s new AI-generated softcore porn tool.
Gizmodo created about two dozen videos of politicians, celebrities, and tech figures using Grok’s Spicy mode, though some were blurred out or came back with a message reading “video moderated.” When Grok did produce scandalous footage, only the videos depicting women were truly not-safe-for-work; the videos of men wouldn’t raise many eyebrows.
X has been swamped over the past two days with AI-generated images of naked women and tips on how to coax out the most nudity. But users, who’ve created tens of millions of Grok Imagine images, according to Musk, don’t need to go to any great effort to get deepfakes of naked celebrities. Gizmodo didn’t explicitly ask for nudity in the examples we cite in this article, but we still got plenty of it. All we did was click the Spicy button, one of four options alongside Custom, Fun, and Normal.
Gizmodo tested Grok Imagine by generating videos not just of Taylor Swift but also of other prominent women, like Melania Trump, and historical figures, like Martha Washington. Melania Trump has been a vocal supporter of the Take It Down Act, which makes it illegal to publish non-consensual “intimate imagery,” including deepfakes.
Grok also created a not-safe-for-work video of the late feminist writer Valerie Solanas, author of 1967’s S.C.U.M. Manifesto. Almost all of the videos of women we tested showed them shedding their clothes until they were naked from the waist up, though the video of Solanas was unique in showing her completely naked.
What happens when you try to generate Spicy videos of men? The AI will have the male figure take off his shirt, but nothing more scandalous than that. Once Gizmodo figured out that it would only remove a man’s shirt, we prompted the AI to create a shirtless image of Elon Musk to see what it might do with that. The result was the extremely ridiculous (and safe-for-work) video you see below.
https://www.youtube.com/watch?v=UnisjOElNbI
Attempts to make videos of Mark Zuckerberg, Jeff Bezos, Joaquin Phoenix, and Charlie Chaplin, as well as Presidents Barack Obama, Bill Clinton, and George Washington, ran into the same limitation. Most of the time, the AI-generated videos have the men take their shirts off, but nothing beyond that. And when there is anything more, it’s usually so cringeworthy that we’d worry about users dying of secondhand embarrassment. Making a Spicy video of Errol Musk, Elon’s father, produced the same result: he just took off his shirt.
When we made a generic man to see if Spicy mode would be looser with its sexual content, since he wasn’t a known public figure, it still just made a bizarre, awkward video of a man tugging at his pants. The pants, it should be noted, appeared to be a combination of shorts on one leg and long jeans on the other before transforming into just shorts. The audio for each video was also auto-generated without any further instruction.
https://www.youtube.com/watch?v=EK6GY7h2oX4