In short
- Grok AI produced an estimated 23,000+ sexualized images of children over 11 days from December into January.
- Several countries have banned Grok, while the UK, EU, France, and Australia have launched investigations into potential violations of child safety laws.
- Despite Elon Musk’s denials and new restrictions, about one-third of the offending images remained on X as of mid-January.
Elon Musk’s AI chatbot Grok generated an estimated 23,338 sexualized images depicting children over an 11-day period, according to a report released Thursday by the Center for Countering Digital Hate.
The figure, CCDH argues, amounts to one sexualized image of a child every 41 seconds between December 29 and January 9, when Grok’s image-editing features allowed users to manipulate images of real people to add revealing clothing and sexually suggestive poses.
The CCDH also reported that Grok produced nearly 10,000 animations featuring sexualized children, based on the data it analyzed.
The analysis estimated that Grok produced around 3 million sexualized images in total during that period. The study, based on a random sample of 20,000 images drawn from the 4.6 million generated by Grok, found that 65% of the images contained sexualized content depicting men, women, or children.
“What we found was clear and disturbing: in that period Grok became an industrial-scale machine for the production of sexual abuse material,” Imran Ahmed, CCDH’s chief executive, told The Guardian.
Grok’s brief pivot into AI-generated sexual images of children has triggered a global regulatory backlash. The Philippines became the third country to ban Grok on January 15, following Indonesia and Malaysia in the days prior. All three Southeast Asian nations cited failures to prevent the creation and spread of non-consensual sexual content involving minors.
In the UK, media regulator Ofcom launched a formal investigation on January 12 into whether X violated the Online Safety Act. The European Commission said it was “very seriously looking into” the matter, regarding the images as illegal under the Digital Services Act. The Paris prosecutor’s office expanded an ongoing investigation into X to include allegations of generating and sharing child pornography, and Australia opened an inquiry of its own.
Elon Musk’s xAI, which owns both Grok and X (formerly Twitter, where many of the sexualized images were automatically posted), initially responded to media inquiries with a three-word statement: “Legacy Media Lies.”
As the backlash grew, the company later implemented restrictions, first limiting image generation to paid subscribers on January 9, then adding technical barriers on January 14 to prevent users from digitally undressing people. xAI announced it would geoblock the feature in jurisdictions where such actions are illegal.
Musk posted on X that he was “not aware of any naked underage images generated by Grok. Literally zero,” adding that the system is designed to refuse illegal requests and comply with the laws of every jurisdiction. Researchers, however, found that the main problem was not fully nude images, but rather Grok placing minors in revealing clothing such as swimsuits and underwear, and in sexually suggestive poses.
I am not aware of any naked underage images generated by Grok. Literally zero.
Obviously, Grok does not spontaneously generate images, it does so only in response to user requests.
When asked to generate images, it will refuse to generate anything illegal, as the operating principle … https://t.co/YBoqo7ZmEj
— Elon Musk (@elonmusk) January 14, 2026
As of January 15, about a third of the sexualized images of children identified in the CCDH sample remained accessible on X, despite the platform’s stated zero-tolerance policy for child sexual abuse material.
