Can Software Detect Empathy?

May 6, 2016

Over ten percent of Americans seek psychotherapy — a general term referring to the treatment of mental health issues by talking with a psychiatrist, psychologist or other mental health provider — but how can you tell if you have a “good” therapist?

Dr. David Atkins, a research professor in the Department of Psychiatry and Behavioral Sciences at the University of Washington, believes that determining how effective or “good” a therapist is can be more complicated than simple referrals or how a patient feels after each session.

“With psychotherapy and other talk-based therapies, the active ingredients are in the words,” explains Atkins. “It’s what is said, how it’s said, and the action between two people. That has a pretty complicated set of data to understand,” he says.

Atkins says that, traditionally, therapists are rated through a process of clinical evaluation conducted by professional mentors who listen to sessions and make assessments based on a battery of metrics. “It’s a very labor-intensive process,” says Atkins. 

He believes this process can be streamlined using technology. Together with a team of engineers from the University of California, Irvine, Atkins has developed software that records the audio from a therapy session, then analyzes that audio, measuring specific quality metrics such as empathy.

How does a software program detect an essentially human capacity such as empathy?

Atkins says the program he’s developed analyzes not only the words that are being said, but also the way in which the words are spoken. He says the tone and inflection in someone’s voice can be just as important as the words themselves for understanding the interpersonal dynamic at work between the counselor and his or her patient.
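To make the idea concrete, here is a toy sketch of what blending “what is said” with “how it is said” might look like. This is purely illustrative and not the research team’s actual method: the phrase list, the pitch-variation proxy for tone, and the equal weighting are all assumptions for demonstration.

```python
# Toy sketch only: combining a lexical cue (what is said) with a
# prosodic cue (how it is said) into a single score. All names,
# features, and weights here are hypothetical, not the real system's.

REFLECTIVE_PHRASES = ("it sounds like", "what i hear", "you feel")

def lexical_score(utterance: str) -> float:
    """1.0 if the utterance contains a reflective phrase, else 0.0."""
    text = utterance.lower()
    return 1.0 if any(p in text for p in REFLECTIVE_PHRASES) else 0.0

def prosodic_score(pitch_variation: float) -> float:
    """Treat pitch variation (clamped to 0..1) as a stand-in for warmth;
    a flat monotone scores low."""
    return min(max(pitch_variation, 0.0), 1.0)

def empathy_score(utterance: str, pitch_variation: float) -> float:
    """Blend the two cues with equal weights (an arbitrary choice here)."""
    return 0.5 * lexical_score(utterance) + 0.5 * prosodic_score(pitch_variation)
```

A real system would of course learn such features and weights from annotated sessions rather than hand-code them; the point is only that both channels feed one measurement.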

Atkins says this dynamic can help measure how empathetic a therapist is, which in turn helps rate the quality of care therapists deliver to their clients.

Atkins has recently started the first trial run of the software, which entails collecting data from simulated therapy sessions involving real therapists and mock clients, then providing the data collected from the session to the therapist for their feedback.

Grin Lord is a licensed clinical psychologist who is participating in the trial. She worked with Dr. Atkins as a research scientist in 2011 and was happy to help, although she does admit she went into the trial with some reservations.  

“When I first heard about the software, I think my reaction was not that positive. Seeing something on a computer screen that says ‘The computer says you’re not a good therapist’ — that was kind of the nightmare vision that I had that this could go towards,” says Lord. 

Atkins welcomes feedback like Lord’s. He says the software won’t be effective unless therapists feel they are rated accurately.   

Less than an hour after Lord’s therapy session with a mock client, Atkins has the report in hand and goes over the results with her. He shows Lord that she received a high empathy rating of 10 out of 12 overall points. Atkins explains to Lord the data indicates that she did an excellent job of listening to her client, along with reflecting — or paraphrasing — what her client said.

A key feature of the report is a detailed visual timeline that maps the transcript of the entire therapy session, categorizing it into questions and corresponding reflections, as well as a representation of talk duration for the client vs. that of the therapist. Lord was told she talked 42% of the time during the session — a percentage that was a little high — but the report also indicated that she spent a lot of that time reflecting back what the client was saying, which is a characteristic of empathy. Lord walked away from the session impressed.
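The two numbers in Lord’s report, the therapist’s share of talk time and how much of that time went to reflection, are straightforward to compute once each utterance has been labeled and timed. The sketch below shows that arithmetic; the `Utterance` structure and labels are hypothetical stand-ins for whatever the research software actually produces.

```python
# Illustrative only: computing the report metrics described above
# (therapist talk share, and reflection's share of therapist talk)
# from a hypothetical list of labeled, timed utterances.

from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str    # "therapist" or "client"
    kind: str       # e.g. "question", "reflection", "other"
    seconds: float  # duration of the utterance

def talk_share(utterances, speaker="therapist"):
    """Fraction of total session time the given speaker was talking."""
    total = sum(u.seconds for u in utterances)
    spoke = sum(u.seconds for u in utterances if u.speaker == speaker)
    return spoke / total if total else 0.0

def reflection_share(utterances):
    """Fraction of the therapist's talk time spent reflecting."""
    therapist = [u for u in utterances if u.speaker == "therapist"]
    total = sum(u.seconds for u in therapist)
    refl = sum(u.seconds for u in therapist if u.kind == "reflection")
    return refl / total if total else 0.0
```

On a session where the therapist spoke 42 seconds out of 100, `talk_share` would return 0.42, matching the kind of figure Lord saw in her report.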

“How fascinating to be able to get a measurement that says you and this client were in tune actively,” she reflects. “My initial fear when I heard about [the research]: I really don’t see that happening. If anything, it’s going to eliminate long coding times and be able to have outreach to rural communities that couldn’t have expert evaluators there,” she says.

Atkins readily admits that not all therapist reactions to his research are as positive as Lord’s. 

“Some of the initial feedback that therapists are giving us is highlighting places where honestly we’re making a mistake,” says Atkins. “The system’s not perfect, so I’ve had therapists say, ‘Hey, look here, your system said I was giving information where really I was asking a question’,” he says.

Once Atkins finishes debugging the program and completes the trial phase, he believes it can complement — possibly even replace — human evaluators, and ultimately help improve the field of psychotherapy for patients.

“When therapists receive this kind of objective, rapid feedback, does it actually lead to better patient outcomes?” says Atkins.  “At the end of the day, that’s why we’re developing this.”

Stacey Jenkins

Stacey Jenkins is the managing producer of Spark Public. She is an Emmy-award winning producer who is passionate about pushing the boundaries of digital media and training the next generation of multimedia journalists. Stacey has been a Digital Content Producer at KCTS 9 for the past four years; her stories have been showcased locally on IN Close as well as nationally on SciTech Now and the PBS NewsHour's Art Beat. Stacey’s experience also includes working as a senior producer for KPTS, as an assistant media instructor and producer for Portland Community College and a TV news reporter for the CBC in Canada.

Fun Fact: Stacey’s guilty pleasures include over-the-top Halloween decor, eating sweetened condensed milk straight from the can and Maroon 5’s “Sugar” video.