The Chatbot will see you now... or will it?

Written by arkin-dharawat | Published 2017/12/27
Tech Story Tags: artificial-intelligence | bots | mental-health | therapy | chatbots


Tell me, how does this sound: local weather updates, info on your flight, the latest news, and much more, all available in one place on your phone. Pretty cool, right? Well, it gets cooler: there are no humans involved. Only you and a program that can understand and respond to certain phrases. These, my friend, are chatbots.

In today’s world, the best way for a company to reach its consumers is through social platforms. According to an Oracle infographic, 90% of businesses use Facebook to respond to service requests. As Deep Learning and NLP advanced, so did the prospect of having a bot that would talk to consumers. This same advancement has helped us apply automation everywhere possible. So why not automate therapy as well?

[Figure: mean number of mentally unhealthy days during the past 30 days among adults aged ≥18 years]

Given the rise in mental illness in the United States and worldwide, and the lack of mental health infrastructure in some countries, we need to ask: can chatbots provide a temporary (or permanent) form of therapy? Does that sound bizarre? Well, maybe not.

The Past

In 1950, Alan Turing published his now-famous paper, ‘Computing Machinery and Intelligence’. To measure a machine’s intelligence, he designed the Turing test: if a machine can fool a human, in conversation, into believing it is human, then it passes. In 1966, Joseph Weizenbaum, a professor at MIT’s Artificial Intelligence Laboratory, took an interest in this and made Eliza, the first chatbot. It was designed to fool people into believing they were talking with a therapist rather than a bot. It worked by recognizing keywords in the input and slotting them into a list of pre-programmed response templates, as the sketch below illustrates. Although our current understanding of language lets us build far more sophisticated conversational interfaces, Eliza is still impressive. Other chatbots such as PARRY and JABBERWACKY came along later, but these weren’t made to simulate therapists.
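To make that mechanism concrete, here is a minimal Python sketch of Eliza-style keyword matching. It is purely illustrative: the rules, responses, and function names are invented for this example, not taken from Weizenbaum’s original script.

```python
import random
import re

# Illustrative Eliza-style rules: a regex keyword pattern paired with
# response templates. Captured text is echoed back into the reply.
# (These rules are invented for this sketch, not Weizenbaum's originals.)
RULES = [
    (re.compile(r"\bi feel (.+)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bi am (.+)", re.I),
     ["How does being {0} make you feel?"]),
    (re.compile(r"\bmy (mother|father|family)\b", re.I),
     ["Tell me more about your {0}."]),
]

# Fallbacks keep the conversation going when no keyword matches.
DEFAULTS = ["Please go on.", "How does that make you feel?"]

def respond(utterance: str) -> str:
    """Return the first matching canned response, echoing captured words."""
    for pattern, templates in RULES:
        match = pattern.search(utterance)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(DEFAULTS)

if __name__ == "__main__":
    print(respond("I feel lonely these days"))
    # e.g. "Why do you feel lonely these days?"
```

A handful of rules like these is enough to sustain a surprisingly convincing conversation, which is exactly what made Eliza so striking in 1966.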

The Present

However, we have come a long way from Eliza. Here are some of today’s therapy bots:

  • Nadia: Built by Soul Machines, a New Zealand-based company, Nadia is an animated human female who interacts with the user via a webcam. Actress Cate Blanchett voiced her and Dr. Mark Sagar designed her. Her emotional intelligence lets her read the user’s face and connect with them on an emotional level. Although she is a chatbot, she is not a therapist: Nadia is there to support patients and help them navigate the Australian NDIS (National Disability Insurance Scheme). You can see her in action here. But the government has stalled her development, citing risks seen in past high-tech projects.
  • Therachat: Not a chatbot, but a worthy mention since this article is about mental health. Therachat is an application designed for communication between psychotherapists and their clients, with features like homework assignment, emotion tracking, scheduling, and direct messaging. Although chatbots are on the rise, Therachat’s ambition to enhance the therapist’s workflow is also valuable.
  • Ellie: Made by USC’s Institute for Creative Technologies, Ellie is a bot like Nadia. Funded by DARPA, she is part of a virtual reality program called SimSensei. She uses a webcam and a sensor to read emotion as she communicates. Her users are veterans with depression or PTSD (post-traumatic stress disorder); people reveal personal information more readily to a bot than to a human, hence her purpose. And as this report reads, it seems to work! But she warns her users that she isn’t a therapist, just someone to talk to.
  • Xiaoice: Built by Microsoft and hosted on Weibo, this chatbot is closer to a best friend than a therapist. Microsoft built her by mining the Chinese internet for human conversations. She even uses emojis! Compared to Tay, whose racist outbursts got it suspended, Xiaoice has a much calmer personality. Chinese users are open to talking to her about their breakups, family problems, and anything else. She saves certain details from past conversations so she can bring them up later (a toy sketch of this idea appears after this list), which has some users doubting whether they can trust her with personal information. Given the amount of internet censorship in China, it is not surprising that she avoids sensitive political questions.

[Image: an example conversation]

  • X2AI: This isn’t a chatbot, but a startup that makes psychotherapeutic chatbots. They have made bots like Tess, “a psychological AI that administers highly personalized psychotherapy, psycho-education, and health-related reminders, on-demand”. Others include Emma, a Dutch-language bot designed to help people with mild anxiety, and Karim, a bot built to counsel Syrian refugees. This is clever: the cost of building a bot is far lower than sending a group of psychotherapists to a war zone. And since people are opening up to a bot, they aren’t worried about social stigma. But its effectiveness is a discussion for another time.
  • Woebot: I saved the best for last. Woebot has a lot going for it, apart from Andrew Ng joining its board of directors; you can read his announcement here. Woebot isn’t a licensed therapist, but he’s being used to target depression. The reason is the same: the growing prevalence of mental disorders and the stigma behind them. Woebot tries to understand what the user is saying and presents them with fun videos and good advice (try it!).
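Before moving on, here is the toy Python sketch of the conversation-memory idea mentioned in the Xiaoice entry above. It is purely illustrative: Microsoft has not published Xiaoice’s internals, and the MemoryBot class, its naive word-overlap trigger, and all names here are invented for this example.

```python
from collections import defaultdict

class MemoryBot:
    """Toy bot that stores past user utterances and recalls them later."""

    def __init__(self):
        self.memories = defaultdict(list)  # user_id -> remembered utterances

    def reply(self, user_id: str, utterance: str) -> str:
        # Naive recall trigger: if any word from a stored memory reappears,
        # bring the old topic back up (a real system would be far smarter).
        for memory in self.memories[user_id]:
            shared = set(memory.lower().split()) & set(utterance.lower().split())
            if shared - {"i", "a", "the", "my", "to", "is"}:  # ignore filler
                return f"Last time you mentioned: '{memory}'. How is that going?"
        # Nothing to recall yet: remember this utterance for next time.
        self.memories[user_id].append(utterance)
        return "Tell me more."

bot = MemoryBot()
bot.reply("alice", "My dog Max has been sick")       # stored as a memory
print(bot.reply("alice", "Max is at the vet today")) # recalls the earlier chat
```

Even this crude overlap trick shows why remembering past chats makes a bot feel more like a friend, and also why users worry about what it keeps.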

The Future

It isn’t hard to see the advantages of a therapeutic chatbot. When we aren’t talking to a human, we are more likely to open up about our problems. These bots will never replace psychotherapists, but they can improve the experience. As we saw with Therachat, helping the therapist treat the patient is itself a big step. With a mental health crisis and a lack of infrastructure, bots might become our saviors: a world where everyone has a personal therapist and, as with Xiaoice, can talk to it without fear of being judged or ostracized. In extreme cases, the bot could contact family or a spouse, or refer the user to a proper psychotherapist.

We shouldn’t hope for an all-around general therapist, though. What we might end up with instead is a collection of bots that each help with a different disorder. Take games, for example. AIs like Deep Blue and AlphaGo are far better at playing games than humans, but the same human who plays chess can also play Go; we can combine all these skills, and an AI cannot. Another example is the household bot. We dreamt of a general-purpose robot that could clean everything from clothes to dishes to the carpet. What we got instead was a Roomba, a dishwasher, and a washing machine. Perhaps that is where the future of psychotherapy lies: in different bots helping millions open up about their issues and improve their lives.


Written by arkin-dharawat | My interests include Machine Learning and Writing
Published by HackerNoon on 2017/12/27