Elon Musk's Grok creates child sexual abuse images. Why won't the Government log off for good? writes Natasha Devon

By Natasha Devon MBE

It has come to light that users of Elon Musk’s X are using the platform’s AI tool Grok to create sexual abuse images of women and children.

The (inexplicable) defenders of this practice claim it is harmless because users are 'just putting women in bikinis'. Aside from the question of whether we should find this acceptable when the woman in question hasn't consented (and how dangerous it could be for women from countries and cultures where partially nude images online carry serious consequences), this is absolutely not the case.

The Internet Watch Foundation has confirmed it has found sexualised pictures of children which appear to have been created using Grok. Men have also used Grok to create images of women in degrading scenarios, including instructing it to make them 'look scared' as they are tied up in the boots of cars.

The Times reported that a picture was generated of Bella Wallersteiner, a descendant of Holocaust survivors, wearing a bikini outside Auschwitz. Grok-produced images of other Jewish women depict them in bikinis decorated with swastikas…
