Researchers from Cedars-Sinai developed a virtual reality (VR) AI mental health support tool named the eXtended-reality Artificial Intelligence Assistant (XAIA). 

The study, led by Brennan M. R. Spiegel and published in Nature’s npj Digital Medicine, used AI, spatial computing, and VR to immerse users in calming, nature-inspired settings where they engage in therapeutic conversations with an AI avatar.

The system used GPT-4 to deliver immersive therapy sessions to 14 individuals experiencing mild to moderate anxiety or depression. XAIA can be accessed on the Apple Vision Pro VR headset.

Lead researcher Brennan Spiegel, MD, MSHS, wrote in a Cedars-Sinai blog: “Apple Vision Pro offers a gateway into Xaia’s world of immersive, interactive behavioral health support—making strides that I can only describe as a quantum leap beyond previous technologies.”

He continued, “With Xaia and the stunning display in Apple Vision Pro, we’re in a position to leverage every pixel of that remarkable resolution and the complete spectrum of vivid colors to craft a type of immersive therapy that’s engaging and deeply personal.”

To train the AI, Spiegel and his team incorporated transcripts from cognitive behavioral therapy (CBT) sessions conducted by experienced therapists, focusing on empathy, validation, and effective communication.

The AI’s responses were further refined through iterative testing, in which therapists role-played various clinical scenarios, driving continuous improvement in the system’s psychotherapeutic communication.
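The paper doesn’t include the system’s code, but the general pattern it describes (a large language model steered by a therapist-informed system prompt, with the running conversation replayed each turn) is easy to sketch. Below is a minimal, hypothetical illustration using the OpenAI Python client; the system prompt is invented for this example and is not the study’s actual prompt.

```python
# Minimal sketch of an LLM-backed therapeutic chat loop. The system
# prompt is invented for illustration; the study's actual prompt was
# built from CBT transcripts and iterative therapist feedback.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a supportive mental health assistant informed by cognitive "
    "behavioral therapy. Respond with empathy and validation, ask "
    "open-ended questions sparingly, and do not give medical advice."
)

def chat() -> None:
    # Keep the full conversation history so each turn has context.
    history = [{"role": "system", "content": SYSTEM_PROMPT}]
    while True:
        user_msg = input("You: ").strip()
        if not user_msg:
            break
        history.append({"role": "user", "content": user_msg})
        reply = client.chat.completions.create(model="gpt-4", messages=history)
        answer = reply.choices[0].message.content
        history.append({"role": "assistant", "content": answer})
        print("Assistant:", answer)

if __name__ == "__main__":
    chat()
```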

Researchers used spatial computing, VR, and AI (GPT-4) to construct an immersive therapy chatbot. Source: Nature.

Participants discussed various topics with the AI, allowing researchers to document the AI’s application of psychotherapeutic techniques. By and large, XAIA was noted for its ability to express empathy, sympathy, and validation, enhancing the therapeutic experience.

For example, XAIA’s empathetic response to a participant’s experience of feeling unnoticed was, “I’m sorry to hear that you felt rejected in such a definitive way, especially when you were pursuing what’s important to you. It must have been a difficult experience.”

Researchers performed a qualitative thematic analysis of participant feedback, which suggested a general appreciation for the AI’s nonjudgmental manner and the quality of the VR environments.

Some said XAIA could offer a valuable alternative to traditional therapy, especially for those seeking anonymity or who are reluctant to engage in face-to-face sessions.

Others highlighted the importance of human interaction and the unique advantages of connecting with a human therapist.

The study also identified areas for improvement, such as the AI’s tendency to over-question participants or to inadequately explore emotional responses to significant life events.

Spiegel elaborated on the tool’s mission: “While this technology is not intended to replace psychologists — but rather to augment them — we created XAIA with access in mind, ensuring the technology can provide meaningful mental health support across communities.”

It seems like an interesting starting point for deeper exploration of immersive therapy environments, which could genuinely benefit people who are unable to access in-person therapy or who wish to keep their discussions private and anonymous.

AI for analyzing therapy conversations

In addition to acting as the therapist, AI has been used to analyze the dynamics of real therapy conversations.

In a 2023 study, researchers used AI to peel back the layers of psychotherapy sessions, revealing how certain speech patterns might be the key to understanding the bond between therapists and their patients.

The drive behind this research stems from a longstanding dilemma in psychotherapy: how can we accurately gauge and improve the therapeutic alliance? 

Published in the journal iScience, the study showed how personal pronouns and speech hesitations signal the depth of the therapist-patient connection.

This term refers to the essential relationship between therapists and their patients, a foundation critical for effective therapy.

Traditionally, understanding this relationship has been a subjective affair, relying on personal accounts and third-party observations, which, although valuable, might miss the fluid dynamics of actual therapy sessions.

Researchers from the Icahn School of Medicine at Mount Sinai saw an opportunity to employ machine learning to shed light on what makes therapeutic communication work.

The study took place at clinics in New York City, involving 28 patients and 18 therapists engaging in a wide range of therapy sessions. Before sessions began, patients reflected on their past therapeutic relationships and attachment styles through online surveys.

Researchers used machine learning to analyze session transcripts with natural language processing (NLP), focusing on the use of pronouns like “I” and “we” and non-fluency markers like “um” and “like.”
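The paper’s exact feature pipeline isn’t reproduced in the article, but the core measurement (per-speaker rates of first-person pronouns and non-fluency markers) is simple to sketch. The word lists below are illustrative stand-ins, not the study’s actual feature definitions.

```python
# Rough sketch: per-speaker rates of first-person pronouns and
# non-fluency markers in one session transcript. The marker lists
# are illustrative, not the study's actual feature definitions.
import re
from collections import Counter

FIRST_PERSON = {"i", "we", "i'm", "i've"}
NON_FLUENCY = {"um", "umm", "uh", "like"}

def marker_rates(utterances):
    """utterances: list of (speaker, text) pairs for one session."""
    totals, pronouns, fillers = Counter(), Counter(), Counter()
    for speaker, text in utterances:
        words = re.findall(r"[a-z']+", text.lower())
        totals[speaker] += len(words)
        pronouns[speaker] += sum(w in FIRST_PERSON for w in words)
        fillers[speaker] += sum(w in NON_FLUENCY for w in words)
    return {
        speaker: {
            "first_person_rate": pronouns[speaker] / totals[speaker],
            "non_fluency_rate": fillers[speaker] / totals[speaker],
        }
        for speaker in totals
    }

# Tiny worked example with invented utterances.
session = [
    ("therapist", "How have you been feeling since we last spoke?"),
    ("patient", "Um, I think I've been, like, more anxious than usual."),
]
print(marker_rates(session))
```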

The way therapists and patients wielded personal pronouns appeared to affect the alliance. 

For instance, the study found that when therapists frequently used “we,” it didn’t always enhance the alliance as one might expect, especially in cases involving personality disorders. This counters the usual assumption that inclusive language automatically strengthens connections.

Moreover, either party’s overreliance on “I” was linked to lower alliance ratings, hinting at the potential pitfalls of too much self-focus in therapy sessions.

The authors wrote, “Our primary finding was that more frequent first-person pronoun usage in both therapists and patients (“we,” “i do,” “i feel”, “when i”) characterized sessions with lower alliance ratings.”

An unexpected finding was that hesitations, often viewed as a negative marker of conversation, were associated with higher alliance ratings, suggesting that pauses could foster authenticity and engagement.

Previous research has found that pauses are a key part of genuinely thoughtful conversation.

In the researchers’ words: “We found that higher non-fluency in patients (e.g., “is like,” “umm”), but not in therapists, characterized sessions with higher alliance ratings by patients.”
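Because the study reports session-level correlations between marker rates and alliance ratings, the shape of that analysis is also easy to illustrate. All numbers below are invented for demonstration; the study’s actual data and statistics are not reproduced here.

```python
# Illustrative only: correlating per-session marker rates with alliance
# ratings. All numbers here are invented; this is not the study's data.
from scipy.stats import pearsonr

# Hypothetical patient non-fluency rates and alliance ratings, one pair
# per session.
non_fluency_rate = [0.01, 0.03, 0.02, 0.05, 0.04, 0.06]
alliance_rating = [3.1, 3.8, 3.4, 4.5, 4.1, 4.6]

r, p = pearsonr(non_fluency_rate, alliance_rating)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```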

The researchers also cautioned that the study’s limited scope and observational nature mean these correlations should be interpreted with care.

AI has been used for speech analysis in other medical settings, such as when UCL and University of Oxford researchers developed a model to detect potential schizophrenia from speech patterns.

This article was originally published at dailyai.com