Two cameras record a psychotherapy session. The AI analyzes the video sequences.

Image source: University of Basel; adapted from: Zimmermann et al., Frontiers in Psychology 2021 (CC BY 4.0)

News • Algorithm detects emotional state

AI face detection as support for psychotherapy

Artificial intelligence (AI) can reliably detect emotions based on facial expressions in psychotherapeutic situations.

These are the findings of a feasibility study by researchers from the Faculty of Psychology and the University Psychiatric Clinics (UPK) at the University of Basel. The AI system is also able to reliably predict therapeutic success in patients with borderline personality pathology. The results were published in the journal Psychopathology.

The face is a mirror for a person’s emotional state. The interpretation of facial expressions as part of psychotherapy or psychotherapeutic research, for example, is a very effective way of characterizing how a person is feeling in that particular moment. Back in the 1970s, psychologist Paul Ekman developed a standardized coding system to assign basic emotions such as happiness, disgust or sadness to a facial expression in an image or video sequence. “Ekman’s system is very widespread, and represents a standard in psychological emotion research,” says Dr. Martin Steppan, psychologist at the Faculty of Psychology at the University of Basel.


But the process of analyzing and interpreting recorded facial expressions as part of research projects or psychotherapy is extremely time-consuming. This is why psychiatry specialists often resort to less reliable, indirect methods such as skin conductance measurements, which can also serve as a measure of emotional arousal. “We wanted to find out whether AI systems can reliably determine the emotional states of patients in video recordings,” says Martin Steppan, who developed the study together with emeritus Professor Klaus Schmeck, Dr. Ronan Zimmermann and Dr. Lukas Fürer from the UPK.

The researchers used freely available artificial neural networks that had been trained to detect six basic emotions (happiness, surprise, anger, disgust, sadness and fear) using over 30,000 facial photos. This AI system then analyzed video data from therapy sessions with a total of 23 patients with borderline personality pathology. The analysis was carried out at the Center for Scientific Computing at the University of Basel, whose high-performance computer had to process over 950 hours of video recordings for this study.

Design for image processing. 1 = to reduce error, a region of interest (ROI) can be defined where a face is detected with high likelihood, which is expanded if face detection fails; 2 = faces are detected and transformed into 48 × 48 pixel grayscale pictures; 3 = the convolutional neural net had been trained using the FER-2013 dataset.

Image source: Steppan et al., Psychopathology 2023 (CC BY-NC 4.0)
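
To make the processing design described in the caption more concrete, the following is a minimal sketch, not the authors’ published code, of such a frame-by-frame pipeline: detect a face in each video frame, crop it, convert it to a 48 × 48 grayscale image, and classify the expression with a convolutional network trained on FER-2013. The model file, the video file name and the label order are assumptions made purely for illustration.

```python
# Minimal sketch (not the study's code) of the frame-by-frame pipeline described in the figure
# caption: detect a face, crop it to 48 x 48 grayscale, classify with a FER-2013-trained CNN.
import cv2
import numpy as np
import tensorflow as tf

# Assumed FER-2013 class order; check the model's own label mapping in practice.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = tf.keras.models.load_model("emotion_cnn.h5")  # hypothetical pre-trained CNN file

cap = cv2.VideoCapture("session.mp4")  # hypothetical therapy-session recording
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # step 2 of the caption
        probs = model.predict(roi[np.newaxis, :, :, np.newaxis] / 255.0, verbose=0)[0]
        print(EMOTIONS[int(np.argmax(probs))])  # per-frame emotion label
cap.release()
```

The study additionally restricted detection to a region of interest and ran the analysis on a high-performance computer to keep over 950 hours of footage tractable; the sketch omits those optimizations.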

The results were astonishing: statistical comparisons between the analyses of three trained therapists and the AI system showed a remarkable level of agreement. The AI system assessed the facial expressions as reliably as a human, but was also able to detect even the most fleeting of emotions within the millisecond range, such as a brief smile or expression of disgust. Such micro-expressions can be missed by therapists, or may only be perceived subconsciously. The AI system can therefore measure fleeting emotions with greater sensitivity than trained therapists.
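
The article does not specify which agreement statistic was used; a common way to quantify agreement between human raters and a model on categorical emotion labels is Cohen’s kappa, sketched here with invented placeholder labels.

```python
# Illustrative only: Cohen's kappa for rater-vs-model agreement on emotion categories.
# The labels below are made-up placeholders, not data from the study.
from sklearn.metrics import cohen_kappa_score

therapist_labels = ["happy", "sad", "neutral", "happy", "anger", "neutral"]
ai_labels        = ["happy", "sad", "neutral", "happy", "neutral", "neutral"]

# 1.0 = perfect agreement, 0 = agreement expected by chance.
print(cohen_kappa_score(therapist_labels, ai_labels))
```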

The AI analysis also uncovered something rather unexpected. Patients who demonstrated emotional involvement and smiled at the start of a therapy session went on to cancel their psychotherapy less often than people who seemed emotionally uninvolved with their therapist. This “social” smiling could therefore be a good predictor of therapy success in a person with symptoms of borderline personality pathology.
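
As a hypothetical illustration of how such a predictor could be examined, the share of smiling frames early in a first session could be related to later dropout with a simple logistic regression; the numbers below are invented and are not results from the study.

```python
# Hypothetical sketch: relate early-session smiling to later therapy dropout.
# All values are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

smile_fraction = np.array([[0.02], [0.15], [0.01], [0.20], [0.08], [0.30]])  # share of smiling frames
dropped_out    = np.array([1, 0, 1, 0, 1, 0])  # 1 = cancelled psychotherapy later

clf = LogisticRegression().fit(smile_fraction, dropped_out)
print(clf.predict_proba([[0.12]])[0, 1])  # estimated dropout probability for a new patient
```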

“We were really surprised to find that relatively simple AI systems can allocate facial expressions to their emotional states so reliably,” says Martin Steppan. AI could therefore become an important tool in therapy and research. AI systems could be used in the analysis of existing video recordings from research studies in order to detect emotionally relevant moments in a conversation more easily and more directly. This ability could also help support the supervision of psychotherapists. “Nevertheless, therapeutic work is still primarily about human relationships, and remains a human domain,” says Steppan. “At least for the time being.” 


Source: University of Basel

03.01.2024

