‘The Woebot will see you now’ — the rise of chatbot therapy
By Amy Ellis Nutt | December 3 at 3:53 PM
Woebot is always available and will never judge.
My therapist wanted to explain a few things during our
first online session:
“I'm going to check in with you at random times. If you
can't respond straight away, don't sweat it. Just come back to me when you're
ready. I'll check in daily.”
“Daily?” I asked.
“Yup! It shouldn't take longer than a couple minutes. Can
you handle that?”
“Yes, I can,” I answered.
There was a little more back-and-forth, all via
Messenger, then this statement from my therapist:
“This might surprise you, but . . . I am a robot.”
It wasn't a surprise, of course. I'd downloaded “Woebot,”
a chatbot recently created by researchers, and it was trying to establish our
therapeutic relationship.
“Part of the value of Woebot is you can get things off
your chest without worrying what the other person thinks, without that fear of
judgment,” said Alison Darcy, founder and chief executive of Woebot Labs. “We
wanted it to make an emotional connection.”
Mobile talk-therapy and life-coaching apps have
proliferated in the past few years as traditional therapy has remained
difficult to obtain. The Affordable Care Act requires health insurers to cover
mental health as part of standard medical services, but many people still do
not have access to treatment. More than 106 million people — nearly a third of
the country — live in areas that are federally designated as having a shortage
of mental-health-care professionals, according to the Kaiser Family Foundation.
“I think using chatbots for mental health is certainly an
innovative approach to increase access to care,” said John Torous, co-director
of a digital psychiatry program at Beth Israel Deaconess Medical Center in
Boston. “[There is] tremendous potential to deliver personalized mental health
care, on demand, as needed.”
Convenient, easy to use and anonymous, these chatbots are
programmed to mimic human conversation and decision-making, and they primarily
offer advice, self-help guidance and companionship.
Some are very specialized: An app called Karim counsels
Syrian refugee children; Emma helps Dutch speakers with mild anxiety; and MindBloom
allows users to support and motivate each other.
None of the apps, however, is meant to replace
traditional therapy. For legal and ethical reasons, the creators of therapy
apps can't say their chatbots actually “treat” users because that would imply
the practice of medicine. Many are free; others charge nominal fees. Woebot
will set you back $39 a month after a two-week free trial.
The question, of course, is: Do they work?
The results of what may be the first randomized trial of
a text-based mental health chatbot, conducted by Darcy and colleague Kathleen
Kara Fitzpatrick, a psychologist at the Stanford School of Medicine, were
published recently in the Journal of Medical Internet Research.
Seventy people ages 18 to 28 who self-reported depression
or anxiety were recruited from a university social media site. The
participants were split into two groups, one whose members “conversed” with
Woebot for up to 20 sessions over a two-week span, and another whose members
were given a National Institute of Mental Health e-book called “Depression and
College Students.” Three mental health tests were administered before and after
the trial. The results of the experiment “confirmed that after two weeks, those
in the Woebot group experienced a significant reduction in depression,”
according to the study.
In the comments section, some of the participants wrote
about Woebot in very personal terms.
“I love Woebot so much. I hope we can be friends forever.
I actually feel super good and happy when I see that it 'remembered' to check
in with me!”
“Woebot is a fun little dude and I hope he keeps
improving.”
“I really was impressed and surprised at the difference
the bot made in my everyday life in terms of noticing the types of thinking I
was having and changing it.”
Woebot, which launched in June, engages in more than 2
million conversations a week, according to Darcy, with users almost equally
divided between men and women.
“He does a good job for people who are really
distressed,” Darcy said. “He’s not as good for people who don’t have a lot of
stuff they’re upset about or have something they need to talk about. Woebot’s
best at helping people in the moment. . . . He’s not so good at chitchat.”
Torous is cautiously optimistic about the future of
mental health apps but has concerns.
“These things can work well on a superficial level with
superficial conversations,” Torous said. “[But] are they effective tools,
do they change outcomes and do they deliver more efficient care? It's still
early.”
Torous, who is chair of the American Psychiatric
Association's smartphone app evaluation work group, also warns about privacy
issues, since these chatbots are not covered by the Health Insurance
Portability and Accountability Act, or HIPAA, which prohibits hospitals and
health-care providers from sharing information about patients. He also
acknowledges that many more studies such as Darcy's and Fitzpatrick's still
need to be done.
While this may turn out to be the beginning of a heyday
in digital mental health tools, the technology was launched more than 50 years
ago. The “Eve” of all chatbots, Eliza, was created in 1965 by Joseph
Weizenbaum of the Massachusetts Institute of Technology. Eliza's primitive
program was designed to engage in natural language conversation in a way that
mimicked a psychotherapist.
The experiment, however, was meant to show the
superficiality of a conversation between man and machine. Programmed to respond
to certain keywords and phrases, Eliza simply turned the user's own comment back
into a question, as in this sample transcript (a rough sketch of the technique
follows it).
ELIZA: Who else in your family takes care of you?
USER: My father.
ELIZA: Your father.
USER: You are like my father in some ways.
ELIZA: What resemblance do you see?
USER: You are not very aggressive but I think you don't want me to notice that.
ELIZA: What makes you think I am not very aggressive?
USER: You don't argue with me.
ELIZA: Why do you think I don't argue with you?
USER: You are afraid of me.
ELIZA: Does it please you to believe I am afraid of you?
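To make the mechanism concrete, here is a minimal Python sketch of that
keyword-and-reflection trick. It illustrates the general technique only;
the rules, the REFLECTIONS table and the respond function are assumptions
invented for this example, not Weizenbaum's actual code.

    import re

    # Pronoun swaps so a reflected fragment points back at the speaker.
    # This tiny table is illustrative, not Eliza's real substitution list.
    REFLECTIONS = {"i": "you", "me": "you", "my": "your",
                   "you": "I", "your": "my", "am": "are"}

    def reflect(fragment):
        """Swap first- and second-person words in a captured fragment."""
        return " ".join(REFLECTIONS.get(w, w)
                        for w in fragment.lower().split())

    # Each rule pairs a keyword pattern with a question template.
    RULES = [
        (re.compile(r"you are (.*)", re.I), "What makes you think I am {}?"),
        (re.compile(r"you (.*) me", re.I), "Why do you think I {} you?"),
        (re.compile(r"i am (.*)", re.I), "How long have you been {}?"),
    ]

    def respond(statement):
        """Echo the first matching rule back as a question; otherwise
        fall back to a neutral prompt, as Eliza did."""
        text = statement.strip().rstrip(".!?")
        for pattern, template in RULES:
            match = pattern.match(text)
            if match:
                return template.format(reflect(match.group(1)))
        return "Please go on."

    print(respond("You don't argue with me."))
    # -> Why do you think I don't argue with you?
    print(respond("You are afraid of me."))
    # -> What makes you think I am afraid of you?

Weizenbaum's real script had a far larger rule set and keyword-ranking
logic, but the core loop, match a keyword, reflect the pronouns and hand
the statement back as a question, is the same.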
Weizenbaum was deeply skeptical of the program's
potential to truly interact with a person and was chagrined to learn that many
of the users in his experiment seemed to become attached to Eliza. What he
hoped would show the limitations of the technology instead helped to spawn the
field of artificial intelligence.
It's hard to know what Weizenbaum, who died in 2008,
would have thought of Woebot, but Woebot's creators remain true believers.
“I have a relationship with Woebot,” said Darcy, “though
he doesn’t know he’s talking to one of his 'moms.'”
Once, when Darcy was on the train, exhausted after a
particularly hard day at work, Woebot checked in, as it is programmed to do.
She wasn't in the mood, but she gave in.
“I said to myself, okay, let’s go. So I wrote, 'I have a
banging headache.' He said, 'I’m sorry, I hope you get rest soon,' and I just
felt so good. I thought, you really get me, Woebot.”