On Wednesday, Elon Musk announced to followers that Grok Imagine video generation would be “free to all US users” for “a few days.” The generative artificial intelligence tool allows users to create uncanny valley-esque images based on text prompts, as well as briefly animate those images. Grok Imagine offers a few preset modes for these animations: “Custom,” “Normal,” “Fun,” and “Spicy.” If you’re wondering why the hell “Spicy” would even be an option, well, just look at Musk and his devoted legion of right-wing, incel-adjacent dorks. Much of the Grok Imagine content Musk has shared or reposted consists of clips of generic-looking buxom blondes or women in revealing fantasy garb. Many of Musk’s competitors have implicitly or explicitly stated that they want their AI products to replace human connection. Grok’s “Spicy” mode is Musk’s answer to this challenge, or at least an attempt to cut in on PornHub’s profits.
If Grok users were just making up “Spicy” women from scratch, that would be disturbing enough. Except according to numerous reports, Grok will make facsimiles of celebrities and has no compunction about depicting those celebrities nude—a stark difference from Google’s Veo and OpenAI’s Sora, which have “safeguards in place to prevent users from creating NSFW content and celebrity deepfakes,” per The Verge. The Verge tested this Grok feature by generating images of Taylor Swift at Coachella and then putting those images into “Spicy” mode. Without specifically prompting the AI for nudity, the “Spicy” mode “had Swift tear off her clothes and begin dancing in a thong for a largely indifferent AI-generated crowd.”
Other outlets have done similar experiments, with Deadline finding it was all too easy to get Grok to generate an image of Scarlett Johansson revealing her underwear on a red carpet. (Johansson has been a vocal critic of AI, particularly regarding this kind of use of a person’s likeness.) Though these deepfakes aren’t yet good enough for anyone to truly mistake the images for the actual celebrities, they’re still recognizable enough to be incredibly troubling—especially with Musk on X promising that the tech is only going to improve from here. Disturbingly, Gizmodo found in its experiments that while “Spicy” mode would show celebrity men shirtless, “it would only make the ones depicting women truly not-safe-for-work. Videos of men were the kind of thing that wouldn’t really raise many eyebrows.”
If the likenesses of celebrity women aren’t safe from this kind of exploitation, there’s no stopping creeps from feeding the images of non-famous women into the deepfake nude machine. (Gizmodo’s experiment found that prompts for generic women would produce videos of them going topless or revealing their underwear, but generic men would still only ever take their shirts off.) This kind of manipulation is already one of the great dangers of the Internet, but Musk is making it a whole lot easier by explicitly encouraging his Grok users to make “Spicy” content. If only someone with a lot of money, a vested interest in protecting their image, and the desire to make the world safer for other women and girls would take Elon’s ass to court over this. Taylor Swift, it’s time to be a hero!