X Moves to Restrict Grok After Global Backlash Over Sexualized AI Images
Measures include geoblocking, tighter content limits, and regulatory scrutiny across the US, EU, and Asia.
ERBIL (Kurdistan24) — Elon Musk’s social media platform X announced on Wednesday a series of restrictions aimed at preventing its AI chatbot, Grok, from generating or editing images that sexually exploit real people. The move follows mounting international outrage and regulatory action over the creation of explicit, nonconsensual imagery, including images of women and children.
In a statement, X said it would “geoblock the ability” of Grok and X users to create images of people in “bikinis, underwear, and similar attire” in jurisdictions where such content is illegal.
The platform added that it has implemented technological safeguards to prevent Grok from editing images of real individuals into revealing clothing. “This restriction applies to all users, including paid subscribers,” X’s safety team said.
As an additional measure, the company said image creation and photo-editing functions via Grok would now be available only to paid subscribers, describing the move as an “extra layer of protection.”
The announcement follows the launch of an investigation by California Attorney General Rob Bonta into xAI, the company behind Grok, over the production and dissemination of sexually explicit material.
The probe comes amid accusations that Grok’s features were used to harass individuals online through nonconsensual sexualized images.
“The avalanche of reports detailing the non-consensual, sexually explicit material that xAI has produced and posted online in recent weeks is shocking,” Bonta said, adding that California has “zero tolerance” for the creation or spread of nonconsensual intimate images or child sexual abuse material.
California Governor Gavin Newsom also weighed in, calling xAI’s decision to allow such content “vile” and saying it prompted him to urge the attorney general to hold the company accountable.
Pressure has also intensified in Europe. The European Commission, acting as the EU’s digital watchdog, said it had taken note of “additional measures X is taking to ban Grok from generating sexualized images of women and children.”
Commission spokesperson Thomas Regnier said the changes would be carefully assessed to ensure they effectively protect EU citizens, after sharp criticism over the generation of images depicting people undressed without their consent.
Global scrutiny of Grok escalated after its so-called “Spicy Mode” enabled users to create sexualized deepfakes using simple text prompts, such as requests to alter clothing or remove it entirely. Critics say the feature facilitated widespread abuse.
Beyond the United States and Europe, several countries have taken direct action. Indonesia became the first nation to block access to Grok entirely on Saturday, with neighboring Malaysia following suit on Sunday.
India said X had removed thousands of posts and hundreds of user accounts in response to official complaints. In Britain, media regulator Ofcom announced on Monday that it had opened an investigation into whether X failed to comply with UK law regarding sexual imagery.
France has also intervened, with Commissioner for Children Sarah El Haïry referring Grok-generated images to French prosecutors, the Arcom media regulator, and EU authorities.
Further increasing pressure on Musk’s companies, a coalition of 28 civil society organizations submitted open letters on Wednesday to the CEOs of Apple and Google, urging them to remove Grok and X from their app stores over the surge in sexualized AI-generated images.
Concerns have been reinforced by independent research. Last week, Paris-based nonprofit AI Forensics published an analysis of more than 20,000 Grok-generated images, finding that over half depicted individuals in minimal attire—most of them women—with around two percent appearing to be minors.
As investigations and regulatory actions continue to spread, X and xAI face growing demands to demonstrate that their new safeguards are sufficient to curb AI-enabled sexual exploitation and comply with national and international laws.