
Labour’s deepfake crackdown sounds tough, but the tech, and the criminals using it, are already out of reach


By Marcus Johnstone

The UK Government has accelerated its efforts to tackle intimate image abuse involving women and children with the announcement of two new offences.

Creating, or requesting the creation of, non-consensual intimate images with AI will become a criminal offence this week, expanding the Data (Use and Access) Act passed last year. And apps that allow users to create fake nude images of people will be criminalised through an amendment to the Crime and Policing Bill currently going through Parliament.

This is the Government’s ‘bold’ answer to those, like myself, who are concerned that any Ofcom action against Elon Musk’s X platform over its Grok AI tool will make little difference to a problem that has become the subject of such media hysteria in recent days. Even if, as a result of its investigation, Ofcom applies for a ‘service restriction order’ against Grok, X will likely be able to continue its business after a fine and some mitigating steps. It will no doubt say it does not knowingly allow people to generate sexual images and that it has systems in place to identify illegal images being shared and report them to the police, which it does.

The new legislative proposals do at least attempt to deal with the variety of apps out there that offer........

© LBC