Two cameras record a psychotherapy session. The AI analyses the video sequences.

Image source: University of Basel; adapted from: Zimmermann et al., Frontiers in Psychology 2021 (CC BY 4.0)

News • Algorithm detects emotional state

AI face detection as support for psychotherapy

Artificial intelligence (AI) can reliably detect emotions based on facial expressions in psychotherapeutic situations.

These are the findings of a feasibility study by researchers from the Faculty of Psychology and the University Psychiatric Clinics (UPK) at the University of Basel. The AI system is also able to reliably predict therapeutic success in patients with borderline personality pathology. The results were published in the journal Psychopathology.

The face is a mirror of a person’s emotional state. In psychotherapy and psychotherapeutic research, for example, interpreting facial expressions is a very effective way of characterizing how a person is feeling at a particular moment. Back in the 1970s, psychologist Paul Ekman developed a standardized coding system to assign basic emotions such as happiness, disgust or sadness to a facial expression in an image or video sequence. “Ekman’s system is very widespread, and represents a standard in psychological emotion research,” says Dr. Martin Steppan, psychologist at the Faculty of Psychology at the University of Basel.

We were really surprised to find that relatively simple AI systems can allocate facial expressions to their emotional states so reliably. Nevertheless, therapeutic work is still primarily about human relationships, and remains a human domain

Martin Steppan

But the process of analyzing and interpreting recorded facial expressions as part of research projects or psychotherapy is extremely time-consuming, which is why psychiatry specialists often resort to less reliable, indirect methods such as skin conductance measurements, an indirect indicator of emotional arousal. “We wanted to find out whether AI systems can reliably determine the emotional states of patients in video recordings,” says Martin Steppan, who developed the study together with emeritus Professor Klaus Schmeck, Dr. Ronan Zimmermann and Dr. Lukas Fürer from the UPK.

The researchers used freely available artificial neural networks that had been trained to detect six basic emotions (happiness, surprise, anger, disgust, sadness and fear) on over 30,000 facial photos. This AI system then analyzed video data from therapy sessions with a total of 23 patients with borderline personality pathology; the analysis ran at the Center for Scientific Computing at the University of Basel, where a high-performance computer had to process over 950 hours of video recordings for the study.

Design for image processing. 1 = to reduce error, a region of interest (ROI) can be defined where a face is detected with high likelihood; the ROI is expanded if face detection fails. 2 = faces are detected and transformed into 48 × 48 pixel grayscale images. 3 = the convolutional neural net had been trained on the FER-2013 dataset.

Image source: Steppan et al., Psychopathology 2023 (CC BY-NC 4.0)
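To make the three steps in the figure concrete, the sketch below strings them together with off-the-shelf tools: an OpenCV face detector, an optional region of interest, rescaling to 48 × 48 grayscale, and a CNN classifier. The model file name, the ROI handling (a simple fall-back to the full frame rather than the ROI expansion described in the caption) and the FER-2013 label order are illustrative assumptions, not details taken from the study.

```python
# Minimal sketch of the image-processing design in the figure, assuming an
# OpenCV Haar-cascade face detector and a Keras CNN trained on FER-2013
# (48 x 48 grayscale input). File names and ROI margins are illustrative.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# FER-2013 label order (six basic emotions plus neutral); assumed model convention.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
emotion_cnn = load_model("fer2013_cnn.h5")  # hypothetical pretrained model file

def classify_frame(frame, roi=None):
    """Detect a face (inside an optional ROI), crop it, rescale to 48x48
    grayscale and return per-emotion probabilities; None if no face is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Step 1: restrict the search to a region of interest if one is given.
    # (The study expands the ROI on failure; here we simply fall back to the full frame.)
    search = gray if roi is None else gray[roi[1]:roi[3], roi[0]:roi[2]]
    faces = face_detector.detectMultiScale(search, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0 and roi is not None:
        search = gray
        faces = face_detector.detectMultiScale(search, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    # Step 2: crop the first detected face and transform it to 48 x 48 grayscale.
    x, y, w, h = faces[0]
    face = cv2.resize(search[y:y+h, x:x+w], (48, 48)).astype("float32") / 255.0
    # Step 3: the FER-2013-trained CNN returns a probability per emotion category.
    probs = emotion_cnn.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
    return dict(zip(EMOTIONS, probs))

# Process a therapy-session recording frame by frame (illustrative usage).
cap = cv2.VideoCapture("session.mp4")  # hypothetical file name
while True:
    ok, frame = cap.read()
    if not ok:
        break
    result = classify_frame(frame)
cap.release()
```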

The results were astonishing: statistical comparisons between the analyses of three trained therapists and the AI system showed a remarkable level of agreement. The AI system assessed the facial expressions as reliably as a human, but it also detected even the most fleeting emotions in the millisecond range, such as a brief smile or expression of disgust. Such microexpressions can be missed by therapists or perceived only subconsciously. The AI system therefore measures fleeting emotions with greater sensitivity than trained therapists.
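The article does not name the agreement statistic used. As an illustration only, agreement between a human coder’s emotion labels and the AI’s labels for the same video segments is often quantified with a chance-corrected measure such as Cohen’s kappa; the labels below are made up:

```python
# Illustrative only: compute chance-corrected agreement between a therapist's
# emotion codes and the AI's codes for the same segments (invented data).
from sklearn.metrics import cohen_kappa_score

therapist_codes = ["happy", "neutral", "sad", "sad", "anger", "happy", "neutral"]
ai_codes        = ["happy", "neutral", "sad", "fear", "anger", "happy", "neutral"]

kappa = cohen_kappa_score(therapist_codes, ai_codes)
print(f"Cohen's kappa (chance-corrected agreement): {kappa:.2f}")
```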

The AI analysis also uncovered something rather unexpected: patients who showed emotional involvement and smiled at the start of a therapy session went on to discontinue their psychotherapy less often than those who seemed emotionally uninvolved with their therapist. This “social” smiling could therefore be a good predictor of therapy success in people with symptoms of borderline personality pathology.
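As an illustration of how such a predictor might be examined (the study’s actual statistical analysis is not described in the article), the share of early-session smiling could be related to therapy completion with a simple logistic regression on made-up data:

```python
# Illustrative sketch only: relate early-session smiling to therapy completion.
# The data and the choice of logistic regression are assumptions for this example.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Share of early-session frames classified as "happy" per patient (invented values).
smile_share = np.array([[0.02], [0.05], [0.18], [0.21], [0.09], [0.30], [0.01], [0.25]])
# 1 = completed therapy, 0 = discontinued (invented values).
completed = np.array([0, 0, 1, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(smile_share, completed)
print("Predicted completion probability at 20% smiling:",
      model.predict_proba([[0.20]])[0, 1].round(2))
```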

“We were really surprised to find that relatively simple AI systems can allocate facial expressions to their emotional states so reliably,” says Martin Steppan. AI could therefore become an important tool in therapy and research: AI systems could be used to analyze existing video recordings from research studies and to detect emotionally relevant moments in a conversation more easily and directly. This could also support the supervision of psychotherapists. “Nevertheless, therapeutic work is still primarily about human relationships, and remains a human domain,” says Steppan. “At least for the time being.”


Source: University of Basel

03.01.2024
