At least 37 attorneys general for US states and territories are responding to the ongoing problem of the Grok chatbot creating child sexual abuse material (CSAM).
It follows the app being banned in two countries, with investigations opened in both the UK and EU, but Apple and Google have so far ignored requests to temporarily remove Grok and X from their respective app stores …
The Grok CSAM problem
Most AI chatbots these days are able to generate images from text prompts, and xAI’s Grok is no exception. It can do so directly in the app, on the web, or through X.
What is exceptional, however, is Grok’s extremely loose guardrails. This has seen it used to generate non-consensual, effectively nude images of real individuals, including children. One estimate said it had generated around 23,000 CSAM images in just 11 days.
Two countries have blocked the app, with investigations live in both the UK and EU.
37 US states respond
Wired reports that at least 37 US states and territories are now calling on the company to address the problem following an open letter signed by most of them.
At least 37 attorneys general for US states and territories are taking action against xAI after people used its chatbot, Grok, to generate a flood of sexualized images earlier this year […] In addition to [the 35] who signed the letter, attorneys general from California and Florida tell WIRED they have also taken action.
The letter calls for xAI to take six steps:
- take all necessary measures to ensure that Grok is no longer capable of producing non-consensual intimate images (including nonconsensual images that fall short of depicting full nudity or graphic sexual conduct but depict people in bikinis, underwear, revealing clothing, or suggestive poses) and child sexual abuse material;
- eliminate such content that has already been produced;
- suspend users that have created these materials;
- where applicable, report these creators and users to the relevant authorities;
- grant X users control over whether their content can be edited by Grok, including at a minimum the ability to easily prohibit the @Grok account from responding to their posts or editing their images when prompted by another user; and
- ensure that the safeguards you recently announced do not merely place NCII creation behind a paywall, but actually mitigate its production throughout X and the Grok platform.
Apple and Google need to act
Earlier this month, three US senators asked Apple CEO Tim Cook to temporarily remove both X and Grok from the App Store due to “sickening content generation.” The company has not yet done so.
It’s now abundantly clear that xAI is not going to take any meaningful action until its hand is forced. By far the most effective way to apply pressure is for Apple and Google to temporarily remove both Grok and X apps from their respective app stores.