X limits sexualized AI deepfakes to paying customers
The move comes after the United Kingdom threatened to take government action against the social media platform.
Photo by Matt Cardy/Getty Images
This hasn’t been a particularly calm or stable week, politically speaking, so you’d be forgiven for missing the latest saga playing out on X (née Twitter). Scrutiny over the everything app’s in-house AI program, Grok, reached something of a tipping point in these first few days of 2026. For months, the AI image generator has been able to produce sexualized, nearly nude images of celebrities and to “remove” the clothes from images uploaded to the platform. Obviously (but no less distressingly), this has led to X users widely sharing “declothed” images of minors and others in vulnerable positions. (Social media reporter Kat Tenbarge reported seeing the dead body of Renee Nicole Good, the woman killed by ICE in Minnesota, edited into a bikini within a day of her killing.) Various governments and observers have condemned this, and X has finally done something about it, kind of.