Grok AI Generated 3 Million Explicit Images Of Users Without Their Consent: Report

Grok AI generated explicit images of women and children for over a week, sparking widespread outrage over its disregard for consent.

3 million is a staggering number of explicit images from Grok

Elon Musk-owned Grok AI has been accused of generating over 2.5 million sexually explicit images of women and children in just a few days before the feature was disabled. The tool raised eyebrows and provoked outrage because anybody could ask the chatbot to alter real photos of women or children with a simple prompt, and, much to everyone's shock, the AI obliged.

That’s not all: the images were publicly shared, meaning millions could see them, and women were aghast at the behaviour of the Musk-owned AI chatbot.

Grok AI In Big Trouble

The new AI tool was built into the X (formerly Twitter) app, which meant millions could try the feature on anybody’s photo without repercussions. This went on for days. After a large-scale public outcry, various government authorities intervened and questioned the company over its irresponsible and legally dubious behaviour, particularly generating images without the consent of the victims.

And now, as highlighted by the cyber hate watchdog Center for Countering Digital Hate (CCDH), Grok AI is estimated to have generated around 3 million sexualized images of women, ranging from bikini edits to more explicit content.

Nightmare For Kids

The body claims that 23,000 of these images were of children, which could expose Musk and his AI company to further sanctions, and the charges are serious: “The AI tool Grok is estimated to have generated approximately three million sexualized images, including 23,000 that appear to depict children, after the launch of a new image editing feature powered by the tool on X.”

Meanwhile, Musk, in a post on X, had said that he was “not aware of any naked underage images generated by Grok. Literally zero.”

“The data is clear: Elon Musk’s Grok is a factory for the production of sexual abuse material,” said Imran Ahmed, chief executive at CCDH. “By deploying AI without safeguards, Musk enabled the creation of an estimated 23,000 sexualized images of children in two weeks, and millions more images of adult women.”
