
My friends in Italy are using AI therapists. But is that so bad, when a stigma surrounds mental health?

21.01.2026

It’s a sunny afternoon in a Roman park and a peculiar, new-to-this-era kind of coming out is happening between me and my friend Clarissa. She has just asked me if I, like her and all of her other friends, use an AI therapist and I say yes.

Our mutual confession feels, at first, quite confusing. As a society, we still don’t know how confidential, or shareable, our AI therapist usage should be. It falls in a limbo between the intimacy of real psychotherapy and the triviality of sharing skincare advice. That’s because, however private our talk with a chatbot may feel, we’re still aware that its response is a digital product.

Yet it surprised me to hear that Clarissa’s therapist has a name: Sol. I wanted mine to be nameless: perhaps not giving it a name is consistent with the main psychoanalytical rule – to keep personal disclosure to a minimum, to protect the healing space of the so-called setting.

However, it feels very natural to Clarissa for her therapist to have a name, and she adds that all her other friends’ AI therapists have one. “So all your other friends have AI therapists?” I ask, to which she says: “All of them do.” This startles me even more, as none of my friends in London has one.

I phoned another friend, a psychotherapist in my Sicilian home town of Catania, who retired a few years ago from a role at a provincial health authority and now works in private practice. He confirmed that the use of AI therapists in Italy is widespread and on the rise. He was surprised to hear........

© The Guardian