Two countries block Grok app over AI-generated CSAM as we await Apple’s response

Two countries have blocked the Grok app after it was widely used to generate non-consensual near-nude deepfakes of women and children. A third country is currently carrying out an investigation.

Three US senators have asked Apple to temporarily remove both X and Grok from the US App Store due to “sickening content generation,” and we are still awaiting the company’s response …

AI-generated CSAM by Grok

The Grok AI tool is available both as a standalone app and through the X app. It is also available through the Grok tab on the X website.

There has been abundant evidence of Grok generating non-consensual, near-nude deepfakes of real individuals, taking a clothed photo and digitally stripping the subject down to a bikini or other revealing attire. Even more worryingly, some of these deepfakes were of children.

While nude imagery is nominally blocked by Grok, some users have found prompt language that works around the restriction.

On Friday, three U.S. senators asked Apple to temporarily remove both apps from the App Store, noting that the non-consensual imagery included child sexual abuse materials (CSAM).

Senators Ron Wyden, Ed Markey, and Ben Ray Luján penned an open letter to the CEOs of Apple and Google, asking both companies to pull X and Grok apps “pending a full investigation” of “mass generation of nonconsensual sexualized images of women and children.”

The letter notes that X owner Elon Musk has failed to act, and contrasts the inaction of Apple and Google with their rapid removal of the ICEBlock app at the request of the White House. Musk’s only response has been to limit image generation on X to paid subscribers, a move that reads as cynical rather than protective, particularly since the same feature remains accessible to anyone through the Grok tab on both the X website and app.

Two countries block the Grok app

The Associated Press reports that Malaysia and Indonesia have blocked the app.

Regulators in the two Southeast Asian nations said existing controls were not preventing the creation and spread of fake pornographic content, particularly involving women and minors. Indonesia’s government temporarily blocked access to Grok on Saturday, followed by Malaysia on Sunday.

Britain’s media regulator Ofcom has also opened a formal investigation.

There have been deeply concerning reports of the Grok AI chatbot account on X being used to create and share undressed images of people – which may amount to intimate image abuse or pornography – and sexualised images of children that may amount to child sexual abuse material […] Ofcom has decided to open a formal investigation to establish whether X has failed to comply with its legal obligations under the Online Safety Act.

Apple and Google have not yet responded

As of the time of writing, there has been no public response from either Apple or Google, and the apps remain available on their respective US app stores. We’ve reached out to Apple for comment and will update with any response.

Photo: Melanie Wasser/Unsplash

You’re reading 9to5Mac — experts who break news about Apple and its surrounding ecosystem, day after day. Be sure to check out our homepage for all the latest news, and follow 9to5Mac on Twitter, Facebook, and LinkedIn to stay in the loop. Don’t know where to start? Check out our exclusive stories, reviews, how-tos, and subscribe to our YouTube channel

Author

Ben Lovejoy

Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!