Several recent anecdotes describe therapists using artificial intelligence, most often ChatGPT, during or after sessions without informing clients. One patient, Declan, discovered his therapist was feeding their live session into ChatGPT when a technical mishap exposed the therapist's screen. Others received polished messages that turned out to contain leftover AI prompts, leaving them surprised and distrustful. Confronting their therapists produced apologies in some cases, but also emotional fallout and, for some, the end of the therapy relationship.
The article cites research that complicates the picture. A 2025 study in PLOS Mental Health found that ChatGPT's responses to therapy vignettes could conform more closely to therapeutic best practice than human ones and were often indistinguishable from human replies, yet participants who suspected AI authorship rated those responses lower. A 2023 Cornell study similarly found that AI-generated messages can increase closeness, but only when recipients are unaware of the tool's role. Clinicians and researchers, including Adrian Aguilera of the University of California, Berkeley, argue that transparency and prior consent are essential if therapists intend to use AI to draft communications or generate ideas.
Beyond trust, privacy and safety are central concerns. Experts note that general-purpose chatbots like ChatGPT are not HIPAA compliant and can expose sensitive information. Pardis Emami-Naeini of Duke University warns that seemingly innocuous details can allow reidentification, and that protecting patient data demands time and expertise that may cancel out the tools' convenience. The article also references specialized vendors such as Heidi Health, Upheal, Lyssn, and Blueprint that claim HIPAA compliance, while cautioning that any recording or storage of sessions carries leakage risk. Past incidents, including a 2020 hack of a mental health provider in Finland, are cited as warnings about the consequences of data breaches.

The piece concludes that although AI can offer efficiency and communication benefits for busy or burnt-out therapists, undisclosed use risks damaging the therapeutic relationship and may produce clinical errors if therapists rely on AI for judgment rather than using it transparently and sparingly.