The Government staying on X is pathetic
Sometimes politics is a simple business. Sometimes you can put aside all the complexity and accept a rudimentary conclusion. Here’s one: you shouldn’t use websites that create child abuse images. This seems a very simple and defensible argument, and yet to some it is apparently insufficiently compelling.
The issue concerns X.com, the site formerly known as Twitter, which Elon Musk has turned into an exploratory dig into the moral abyss. His latest experiment involves Grok, his little AI toy. For some time now, users of X have been able to ask Grok to create images of real people stripped of their clothing. Over the new year, this function exploded in popularity.
Needless to say, given the kind of audience Musk has encouraged on X, it was deployed as abusively as possible. Female users of the site were sent non-consensual images of themselves in their underwear. When they protested, there was a fresh wave of abuse and more images, in ever more degrading positions, with requests running at 6,000 an hour. Soon, users began requesting images of the women tied up and gagged, or bruised, or covered in blood. AI-generated images of children also appeared.
This is all uncomplicatedly illegal. Amendments to the Sexual Offences Act criminalise the sharing of non-consensual intimate images or child sexual abuse material. Ofcom has