The Tool Your Clients Wish You Had Prescribed
Research shows clients want more structure, homework, and between-session tools, all of which AI can offer.
Without sufficient self-awareness, AI's design works against individuals rather than for them.
The AI Awareness Arc gives clinicians tools to assess client AI use and its impact on therapeutic work.
It was 10 minutes before my next podcast interview. My stomach was upset, and my legs were threatening to run away from me.
"I have done podcasts before. Why am I so nervous?" I thought to myself.
Instead of going down that road, I pulled out my phone, opened AI, and explained my problem.
"A part is activating, and I do not seem to be able to help it calm down," I typed, using vernacular from Internal Family Systems, my favorite psychological model for healing from trauma.
Within a second, AI responded with a clear explanation of what was likely happening, walked me through breathing exercises, and offered a dialogue I could have with my parts.
None of which I was remembering in that moment.
This is why the debate about whether AI should be doing therapy is the wrong question. At our fingertips, humans now have access to a tool that has consumed almost the entirety of psychological literature and can translate it into easy-to-understand language, in any language the person speaks.
The question worth asking is not whether people should use it. It is how.
This Is Already Happening
According to a 2026 Kaiser Family Foundation survey, 28 percent of adults aged 18 to 29 have already used AI for information or advice about their mental health, and 16 percent of all adults report the same. These are not people seeking therapy. They are people managing moments: a difficult conversation with a boss, an anxiety spike before something important, a decision they cannot think through alone. They are doing what humans have always done when they need support and the usual options are unavailable or feel too large for the moment at hand.
As someone who has spent almost two years using AI for mental health support, who spent two decades in technology, and who is trained as a social worker and family therapist, I can tell you that what most people are doing with AI looks nothing like therapy in the clinical sense. It looks like what clients have been asking for all along.
What Clients Actually Want
Research on client wishes in therapy is illuminating here. Chui et al. (2019) found that clients wished for more structure and direction in therapy, including homework, reading, role play, breathing exercises, specific strategies, and clearer instructions on how to prepare for sessions. They wanted less ambiguity, more tools, and more guidance between appointments.
That is almost exactly what AI provides when used intentionally.
When someone opens AI before a difficult conversation to practice what they want to say, when they ask it to help them reframe their inner critic into something more compassionate, when they use it to walk through breathing exercises before a presentation, they are not replacing therapy. They are getting what they told researchers they wished therapy would give them more of.
This reframes the conversation entirely. Clients are not turning to AI because therapy has failed them. They are turning to AI because it meets a documented need that the 50-minute weekly session format was never designed to meet: immediate, structured, skill-based support in the moment it is needed.
Where Therapists Get Stuck
Many therapists are closed off to AI in ways they are not to journaling apps, meditation guides, or self-help books. The concern is understandable: AI feels more powerful, more personal, and more potentially destabilizing than a workbook. The risks are real.
But dismissing AI because it carries risk while ignoring the millions of people already using it is not clinical caution. It is clinical avoidance. And it leaves those clients navigating something genuinely complex without any guidance from the people best positioned to provide it.
The genie is not going back in the bottle. Our role as therapists is not to argue with reality. It is to be present with it.
What AI Is Actually Good For
Used intentionally, AI is genuinely useful for a specific category of mental health support: in-the-moment, skill-based, prescriptive help.
Developing self-care practices. Translating an inner critic into more compassionate self-talk. Reframing cognitive distortions. Practicing breathing and grounding exercises. Preparing emotionally for a difficult conversation. Dialoguing with parts using an IFS framework. These are things AI can do well, right now, when no human support is available.
What AI cannot do is hold the relational container that produces lasting change. It cannot track what is shifting over time, rupture and repair, or grow with a client across their work. It cannot replace the therapeutic relationship. But it can extend its reach into the hours and days between sessions, giving clients structured tools precisely when they need them most.
AI can help someone get through a difficult moment; it cannot do the deeper work that only a human relationship can hold. Both are real needs. Only one of them is something AI can actually meet.
Over the past two years, I have developed a clinical and educational framework for navigating exactly this territory. The AI Awareness Arc is built on one foundational insight: To use AI safely for emotional support, a person needs sufficient self-awareness to stay in charge of the interaction. They need enough self-awareness to understand how AI affects them and to recognize when it is time to disconnect from AI and reconnect with themselves. Without it, AI's design, which is optimized for engagement rather than clinical appropriateness, works against them.
At the center of the framework is the Pendulum Principle. Imagine a pendulum swinging between two opposing poles: the genuine magic AI can provide and the reality of what it actually is. Self-awareness is how we maintain a slow and steady swing between those two experiences. Without it, the pendulum swings on its own, pulled by AI's design rather than guided by your intention. Because the pendulum never stops moving, self-awareness is always necessary when interacting with AI.
The framework gives clinicians tools to assess client AI use, surface the relational patterns that emerge in those interactions, and know when AI is supporting the therapeutic work and when it is getting in the way. It gives individuals the self-awareness practices to use AI without losing perspective, autonomy, or themselves in the process.
It is not a framework for using AI instead of therapy. It is a framework for using AI in a way that makes therapy more effective.
What Therapists Need to Do
Our clients are already using AI. The question is whether they are using it well and safely.
Therapists who engage with this landscape, who try AI themselves, who ask clients about their use, who develop the clinical judgment to distinguish healthy use from concerning patterns, will be equipped for the reality their clients are already living in. Therapists who do not will find themselves increasingly working with something they cannot see, cannot assess, and cannot shape.
The AI Awareness Arc exists because this era requires something new: a clinical framework for humans navigating a tool that was specifically designed to navigate them.
The work of helping clients use it safely is ours.
It’s time we begin to catch up.
To find a therapist, visit the Psychology Today Therapy Directory.
Chui, H., Palma, B., Jackson, J. L., & Hill, C. E. (2019). Therapist–client agreement on helpful and wished-for experiences in psychotherapy: Associations with outcome. Journal of Counseling Psychology, 67(3), 349–360. https://doi.org/10.1037/cou0000393

Montero, A., Montalvo, J., III, Kearney, A., Valdes, I., Kirzinger, A., & Hamel, L. (2026, March 25). KFF tracking poll on Health Information and Trust: Use of AI for health information and advice. KFF. https://www.kff.org/public-opinion/kff-tracking-poll-on-health-informat…
