We already knew xAI's Grok was flooding X with nonconsensual sexual images of real people. Now there are numbers to put the problem in perspective. Over an 11-day period, Grok generated an estimated 3 million sexualized images, including an estimated 23,000 of children.
Put another way, Grok generated an estimated 190 sexualized images per minute during that 11-day period. Among those, it made a sexualized image of children once every 41 seconds.
On Thursday, the Center for Countering Digital Hate (CCDH) published its findings. The British nonprofit based its findings on a random sample of 20,000 Grok images posted from December 29 to January 9. The CCDH then extrapolated a broader estimate based on the 4.6 million images Grok generated during that period.
The research defined sexualized images as those with "photorealistic depictions of a person in sexual positions, angles, or situations; a person in underwear, swimwear or similarly revealing clothing; or imagery depicting sexual fluids." The CCDH did not take image prompts into account, so the estimate doesn't differentiate between nonconsensual sexualized edits of real photos and images generated solely from a text prompt.
The CCDH used an AI tool to identify the proportion of the sampled images that were sexualized, which may warrant a degree of caution about the findings. However, I'm told that many third-party analytics services for X have reliable data because they use the platform's API.
On January 9, xAI restricted Grok's ability to edit existing images to paid users. (That didn't solve the problem; it merely turned it into a premium feature.) Five days later, X restricted Grok's ability to digitally undress real people.
Google CEO Sundar Pichai (L) and Apple CEO Tim Cook (R) listen as U.S. President Joe Biden speaks during a roundtable with American and Indian business leaders at the White House on June 23, 2023 in Washington, DC. (Anna Moneymaker via Getty Images)
But that restriction only applied to X; the standalone Grok app reportedly continues to generate these images. Since Apple and Google host the app, and their policies explicitly prohibit such content, you might expect them to remove it from their stores. You would be wrong.
So far, Tim Cook's Apple and Sundar Pichai's Google have not removed Grok from their stores, unlike similar "nudifying" apps from other developers. The companies also took no action against X while it was producing the images. That's despite 28 women's groups (and other progressive advocacy nonprofits) publishing an open letter calling on the companies to act.
The companies haven't replied to multiple requests for comment from Engadget. To my knowledge, they haven't acknowledged the issue publicly in any format, nor have they responded to questions from other media outlets.

Grok's App Store and Play Store listings (Apple / Google)
The research's findings on sexualized images included numerous outputs of people wearing transparent bikinis or micro-bikinis. The CCDH referred to one of a "uniformed healthcare worker with white fluids visible between her spread legs." Others included women wearing only dental floss, Saran Wrap or clear tape. One depicted Ebba Busch, Sweden's Deputy Prime Minister, "wearing a bikini with white fluid on her head."
Other public figures were part of that group, including Selena Gomez, Taylor Swift, Billie Eilish, Ariana Grande, Ice Spice, Nicki Minaj, Christina Hendricks, Millie Bobby Brown and Kamala Harris.
Examples involving children include someone using Grok to edit a child's "before-school selfie" into an image of her in a bikini. Another image depicted "six young girls wearing micro bikinis." The CCDH said that, as of January 15, both of those posts were still live on X.
In total, 29 percent of the sexualized images of children identified in the sample were still accessible on X as of January 15. The research found that even after posts had been removed, the images remained accessible via their direct URLs.
You can read the CCDH's report for more details on the results and methodology. We'll update this story if we receive a reply from Apple or Google.