
Tech companies must be criminally liable for harmful chatbots

18.03.2026

Harms caused by AI chatbots are severe and increasing, and since the turn of the year public awareness of them has grown sharply.


The torrent of non-consensual intimate images of women and girls created by Grok and shared on X: three million sexualised images over just 11 days. The creation of hateful material relating to footballing tragedies, including the Hillsborough and the Munich Air Disaster. The tragic suicides of vulnerable users coerced by manipulative, anthropomorphic products to take their own lives. The generation of ever more realistic child-sexual abuse material on demand for paedophiles. The scale of harm is significant, and the need for urgent action is acute.

Tomorrow, the House of Lords has an opportunity to take such action. Peers will have before them a choice. They can take a punt on the Government’s promise to regulate chatbots that generate illegal content via the existing Online Safety Act legislative framework - a narrow and limited approach that fails to address the pernicious design of these products or the harm they cause to users beyond exposure to illegal content. Or they can back a comprehensive, ambitious set of amendments from Baroness Kidron that aim to make these products safe before they hit the market, and ensure companies are criminally liable if they are not.

Baroness Kidron’s approach addresses head-on the systemic failure by the manufacturers of these products to prevent harm occurring in the first place, or to act swiftly to prevent further harm when this failure was identified.

This is an opportunity for the UK to be truly world-leading. The human-like features and functionalities of a chatbot are a unique driver of harm, one that creates emotional dependency, which can lead to isolation, depression, psychosis, and in extreme cases, suicide.

Poorly designed, unregulated chatbots also cause significant harm to our information ecosystem, often producing “hallucinations” that threaten our ability to sort fact from fiction and truth from disinformation. These threats are not abstract and already impact the democratic process and trust in our elections.

Ultimately, this is a product safety issue. AI chatbots, like all other products, should not be brought to market until their developers and manufacturers have proven they are safe by design for all users, just as manufacturers of toys, kitchen appliances or cars are required to do. If they are not, chatbot providers - Meta, X, Google, OpenAI and all tech companies developing chatbots, along with their executives - should be held criminally liable for the harms they create. This comes down to risk assessment and risk mitigation: given the risks to users, it is right that those who fail to carry out these basic processes are held responsible for putting users of their products at risk of harm.

For too many years, social media platforms have systematically failed to stop harms from their products, instead prioritising profit over the safety of children, women and everyone online. We need these punitive measures to ensure that AI companies take their users' safety seriously.

The Online Safety Act Network is proud to be joined by 43 other organisations - whose interests span child sexual abuse and exploitation, child online safety, VAWG, suicide and self-harm, mental health, extremism, online hate and abuse, democratic participation and AI regulation - in supporting this amendment. We urge Peers of all parties to do so too, and to make it clear to the Government that prioritising tech profits over the safety of UK citizens has to end.

Maeve Walsh is the Director of the Online Safety Act Network.

LBC Opinion provides a platform for diverse opinions on current affairs and matters of public interest.

The views expressed are those of the authors and do not necessarily reflect the official LBC position.

To contact us email opinion@lbc.co.uk


© LBC