
Your AI Shrink Will See You Now

Summary:  Where do we look to find the most advanced chatbots and the most complete application of AI?  In chatbots designed as ‘artificially intelligent psychological counseling chatbots’, or ‘therapeutic assistants’ for short.

Perhaps a better title for this would be “how far have we come?”  About two months ago, when we wrote the three-part series on chatbots, the most striking thing was just how fast and explosive the category’s growth has been.  And that’s saying a lot in an industry characterized by exactly that: fast, explosive growth.

The facts that originally caught my eye were these.  According to a survey of over 300 companies, ranging from small to large, conducted at the beginning of 2017 by Mindbowser and Chatbots Journal:

  • 25% of businesses first heard of chatbots in 2015.
  • 60% of businesses first heard of chatbots in 2016.
  • 54% of developers first worked on chatbots in 2016.
  • 75% intended to build a chatbot in 2017.

If you read that earlier series or have studied chatbots independently, you know that while they all use NLU (natural language understanding) to parse input and frame output, the real frontier is in how they formulate their responses.  Are they scripted (as almost all are), or are the responses generated on the fly by advanced deep learning algorithms like RNN/LSTMs?

So my real motive was to identify and understand what I expected to be the most advanced chatbots out there.  And I expected to find them in the category that’s come to be known as ‘psychological autonomous intelligent assistants’, or more popularly ‘artificially intelligent psychological counseling chatbots’, or just plain ‘therapeutic assistants’.

After all, what could offer greater complexity than understanding the human emotions that cause us to be anxious, depressed, or any of the myriad other states that keep us from being at the top of our game?  A chatbot that can understand the many ways we might express or unintentionally hide these feelings would have to be best-in-class at NLU.  And if it could actually create demonstrable improvement in our feelings and behaviors, it would have achieved something extraordinary in AI.

A point we also raised in that previous series is that short of AGI (artificial general intelligence), which is very far away, the most complex challenges are those imposed by long conversations in broad domains.  While psychological counseling isn’t a totally open knowledge domain, it is certainly one of the broadest and therefore most difficult.  And the conversations are by definition long.

Who’s in the Game

There are more available apps in this market than you might guess.  Here are those I was able to identify:

Ellie:  Ellie was never a commercial product, but dating to 2014, the DARPA-funded virtual therapist was the first of the modern versions.  A study of 239 participants showed that users were equally or even more willing to open up to a computer, dispelling the long-held notion that a human in the room was required.  Analysis of the text sessions compared to videoconferencing therapy sessions also showed that texting with a computer actually reduced interpersonal anxiety, meaning that patients were not impeded by guilt or embarrassment and were better able to frankly reveal and discuss their real issues.

Woebot:  Woebot receives by far the most press of any of the therapeutic assistants.  Developed by a Stanford team led by psychologist Alison Darcy, now CEO of Woebot Labs, Inc., it uses daily chat conversations and mood tracking to deal with anxiety, depression, and other common maladies.  Running on the Facebook Messenger platform, Woebot checks in on you once a day and is currently available at $39 per month.  Like most of its competitors, Woebot is based on cognitive behavioral therapy (CBT).  It is also the only platform to claim a peer-reviewed clinical study supporting its efficacy.

X2AI:  X2AI is the company behind a fairly long list of therapeutic assistants.  It also seems to be furthest along toward commercialization, having launched its first product in 2016 with a string of others following, culminating today in its most advanced platform, Tess.  Its original offerings were targeted at overseas populations like Syrian refugees, reflecting the background and personal concerns of its founders.  These apps include:

Karim:  This Arabic-speaking platform was deployed to help Syrian refugees after the founders read that nearly half of the Syrian refugees living in the Za’atari camp in Jordan reported feeling so hopeless that they couldn’t function.  Almost three-quarters of the Syrian refugees in the country disclosed one or more “problematic behaviors,” such as excessive nervousness, continuous crying, self-isolation, and trouble sleeping.  Karim’s NLP analyzes emotional state and returns appropriate comments, questions, and recommendations.

Emma:  Emma is a Dutch-language bot designed to help people with mild anxiety and fear.

Nema: Nema is an English-language bot that specializes in pediatric diabetes care.  Though not technically in the psychological therapy domain, it demonstrates how X2AI’s generalized platform can be adapted to other problem types.

Tess:  Tess is X2AI’s current and most advanced platform, due for commercial rollout momentarily.  It uses CBT, motivational interviewing, and other techniques, and is reported to be advanced in its ability to identify emotional cues and context.  Most significantly, X2AI’s business model has morphed to focus on the professional psychologists and healthcare providers who subscribe to the platform and use Tess to augment their relationship with the patient.  Tess also appears to be the first platform to be HIPAA compliant.

Wysa:  Wysa is the offering from startup Touchkin.  Also based on CBT approaches, it currently has its widest following in India.  The husband-and-wife founders, Ramakant Vempati and Jo Aggarwal, released Wysa in January, at a time when large-scale layoffs were occurring in India’s tech industry and large numbers of tech professionals needed, but were priced out of, traditional in-person counselling.  Job loss and work issues are reported to be the second most common problem presented by users.  Wysa’s NLP responds with solutions framed by therapists.  It is free to users; Touchkin intends to monetize its development by licensing to enterprise customers, global insurers, and healthcare providers.

Joy: Joy was one of the original releases in this category and is billed more as a digital friend than a counsellor.  Joy uses Facebook Messenger to track how you feel and what you did each day.  Unlike the other apps using CBT approaches, Joy is more directive: if it detects you are feeling anxious, for example, it provides specific tips for reducing anxiety.  Joy is reported to be good at asking questions but to come up short in its responses to many of the questions users ask.

Uprise:  Uprise is a newly funded offering from Australia not yet in commercial release.

Focusing on Cognitive Behavioral Therapy (CBT) is Intentional

There are two different decision models employed here.  The less sophisticated offers specific tools, tests, or advice based on its assessment of your situation.  The more sophisticated, including Woebot and the X2AI offerings like Tess, have intentionally built their logic around CBT, cognitive behavioral therapy, which is today’s most popular form of ‘talk therapy’.

In brief, patients are encouraged to talk about their emotional response to current life events.  The approach asks users to restate their negative thoughts in a more positive manner.  The app then leads them to identify the psychological traps that cause their stress, anxiety, and depression. “A good CBT therapist should facilitate someone else’s process, not become a part of it,” says Alison Darcy of Woebot.

The result is that you can tell the bot anything.  The CBT and AI in these platforms are not designed to tell you something about yourself you didn’t know.  The bot only knows as much as you reveal to it.

Neither is CBT directive.  It takes you through a process of self-discovery rather than directing your behavior in order to get well.

By selecting CBT as the foundational logic, these developers have selected an approach that is squarely in the sweet spot for chatbots.  They can hold a structured conversation.  They can help you recognize and deal with the barriers that confront you in today’s world.  They are not going to perform Freudian dream analysis or draw inferences from your childhood relationship with your mother.
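
None of these vendors publish their scripts, but the hand-coded chat tree described here is easy to sketch.  The Python below is a minimal illustration only: the node names, prompts, and keyword triggers are all invented, and a real platform would use therapist-authored scripts with proper NLU intent classification rather than naive keyword matching.

```python
# Minimal sketch of a hand-coded, CBT-style chat tree.  Every node name,
# prompt, and keyword trigger here is hypothetical.
CHAT_TREE = {
    "check_in": {
        "prompt": "How are you feeling today?",
        "branches": {"anxious": "explore_thought", "fine": "wrap_up"},
    },
    "explore_thought": {
        "prompt": "What thought is behind that feeling?",
        "branches": {"always": "reframe", "never": "reframe"},
    },
    "reframe": {
        "prompt": ("Words like 'always' and 'never' can be thinking traps. "
                   "Can you restate that thought in a more balanced way?"),
        "branches": {},
    },
    "wrap_up": {
        "prompt": "Glad to hear it. Shall we check in again tomorrow?",
        "branches": {},
    },
}

def classify(user_text, branches):
    """Stand-in for NLU intent detection: a naive keyword match."""
    for keyword, next_node in branches.items():
        if keyword in user_text.lower():
            return next_node
    return None  # no branch matched; end the session

def run_session(node="check_in"):
    while node:
        step = CHAT_TREE[node]
        reply = input(step["prompt"] + "\n> ")
        node = classify(reply, step["branches"])

if __name__ == "__main__":
    run_session()
```

However simple, this is the essential shape of the conversation logic in all of the platforms above; the sophistication lies in the therapist-authored content and in the NLU that selects the branch.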

Emotion Detection is a Critical Component

Woebot’s Alison Darcy says the system has been designed to detect emotions like anxiety, sadness, and anger in language, and then use clinical decision-making tools to provide tips on how to better understand or even reframe those sentiments so that they don’t trigger a psychological downward spiral.

Baseline NLU is pretty good these days at detecting emotion, but emotion detection has a number of levels.  Simple sentiment analysis based on word choice or typing speed alone would not be sufficiently sensitive for this application.

All the platforms claim some advanced level of emotion detection that enhances their ability to select a conversational path, and for all of them this is a work in progress.  X2AI claims the most advanced approach, using deep learning to detect patterns based on phrasing, diction, typing speed, sentence length, grammatical voice (active versus passive), and other parameters that correlate with different emotional states.

The claim is that this allows their system to identify latent emotions, just as human therapists do. “When you say something in a certain way, a good friend will know how you actually feel,” Bann of X2AI said. “It’s the same thing with our A.I.s.”  Reportedly their proprietary approach relies on both manual coding of emotions and self-directed learning.
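
X2AI has not published its model, but the general shape of such a feature-based approach is easy to illustrate.  In the sketch below a random forest stands in for the deep network the company describes, and the features and training rows are entirely invented; it shows only how behavioral signals like typing speed and sentence length could be mapped to an emotional state.

```python
# Illustrative only: a feature-based emotion classifier in the spirit of
# what X2AI describes, with a random forest standing in for their deep
# learning approach and made-up training data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row: [typing speed (chars/sec), mean sentence length (words),
#            passive-voice ratio, first-person-pronoun ratio]
X_train = np.array([
    [4.2, 14.0, 0.40, 0.25],   # hypothetical "depressed" sample
    [6.1,  7.0, 0.10, 0.10],   # hypothetical "neutral" sample
    [7.5,  5.0, 0.05, 0.30],   # hypothetical "anxious" sample
    # ...in practice, many thousands of labeled conversations
])
y_train = ["depressed", "neutral", "anxious"]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

def detect_emotion(features):
    """Map one message's behavioral features to a likely emotional state."""
    return clf.predict([features])[0]

print(detect_emotion([4.0, 13.0, 0.35, 0.22]))  # likely "depressed" on this toy data
```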

As with commercial chatbots, the ability to detect user frustration and suggest an exit, usually to a human, is a best practice.  When dealing with therapy at this level of intimacy, the need becomes significantly more urgent.

It’s one thing to send a message suggesting that a distraught user consult a human therapist, but if the user’s conversation suggests self-harm or other extremely negative possibilities, a message may not be sufficient.  In the X2AI model, where the platform is an adjunct to treatment provided by a human therapist, human counsellors can step in immediately and take control of a conversation.  If done properly, the user may not even recognize the transition.
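
None of these companies disclose their escalation logic.  The sketch below simply illustrates the tiered idea (continue, suggest a human, or hand off), using invented trigger phrases where a production system would use a trained risk classifier and a clinician-defined protocol.

```python
# Illustrative escalation tiers for a therapeutic chatbot.  The trigger
# phrases and rules are invented for this sketch; a real system would use
# a trained risk classifier plus a clinician-defined protocol.
CRISIS_PHRASES = ["hurt myself", "end it all", "no reason to live"]
FRUSTRATION_PHRASES = ["isn't helping", "not helping", "frustrated"]

def assess_risk(message):
    """Map a user message to one of three escalation tiers."""
    text = message.lower()
    if any(p in text for p in CRISIS_PHRASES):
        return "handoff"        # a human counselor takes over the session
    if any(p in text for p in FRUSTRATION_PHRASES):
        return "suggest_human"  # bot recommends talking to a therapist
    return "continue"           # stay on the scripted conversation path

for msg in ["I slept badly but I'm managing",
            "Honestly, this isn't helping",
            "Some days I feel there's no reason to live"]:
    print(msg, "->", assess_risk(msg))
```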

Where’s the AI?

All of these platforms currently use typed-message NLU as their input and output mechanism.  Given the wide variety of ways people might describe the problems they are experiencing, the ability of all these platforms to parse meaning from wide-ranging text is impressive.

In any counselling relationship it’s likely that the patient, especially early in the relationship, may dissemble or misstate either the problem or its cause out of shame, guilt, or concern over being judged.  Studies have verified that patients have an easier time being candid with a computer than with a human counsellor, but it is still impressive that these platforms can discern the correct cause and effect.

X2AI’s claimed use of deep learning to detect nuances in emotion and direct its responses accordingly was the most sophisticated approach I could find.

What I expected to find but did not was the use of deep learning RNNs to formulate the actual responses.  All these platforms use scripted decision trees to define the branching of the conversation, not the generative answers that represent the leading edge of chatbot dialogue.  All the specific applications of CBT have been designed by qualified therapists and hand-coded into chat trees.
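
For contrast, here is roughly what the generative approach would involve: a recurrent network trained to emit a reply token by token instead of walking a tree.  This is a minimal Keras sketch with toy dimensions and fake data, not any vendor’s code; a real dialogue system would be a full sequence-to-sequence model trained on large conversation corpora.

```python
# Sketch of generative response formulation with an LSTM: the model
# predicts the next token of a reply from the conversation so far.
# Toy vocabulary, dimensions, and random 'training' data throughout.
import numpy as np
import tensorflow as tf

VOCAB_SIZE, EMBED_DIM, HIDDEN = 5000, 64, 128

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    tf.keras.layers.LSTM(HIDDEN),  # encodes the conversation history
    tf.keras.layers.Dense(VOCAB_SIZE, activation="softmax"),  # next-token distribution
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")

# Training pairs: token ids of the dialogue history -> id of the next token.
history = np.random.randint(0, VOCAB_SIZE, size=(32, 20))  # stand-in data
next_token = np.random.randint(0, VOCAB_SIZE, size=(32,))
model.fit(history, next_token, epochs=1, verbose=0)

def generate(seed_ids, length=10):
    """Greedily extend a reply one token at a time."""
    ids = list(seed_ids)
    for _ in range(length):
        probs = model.predict(np.array([ids[-20:]]), verbose=0)[0]
        ids.append(int(np.argmax(probs)))
    return ids
```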

Once again, X2AI claims the most advanced ability to update or modify the trees, based on what it describes as a more sophisticated ability to learn about the patient.  This highly granular emotion ontology is said to allow Tess to maintain a stateful relationship with the patient, remembering and taking into consideration all previous conversations to determine what to say next.  According to X2AI, “over time, Tess learns what the patient loves, and why; what scares them, what calms them down, and even what to say to the patient that will best help them cope through a period of depression, forming an emotional bond.”

Still, the computational requirement of holding very long conversations has a practical limit, since lengthy conversations require the bot to hold more ‘thoughts’ in memory and correctly deal with the context and sequence of what has been said before (statefulness).  Woebot, for example, is reported to limit interchanges to about 10 minutes per session.  This does not seem to have affected user satisfaction.
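
One common way to bound that cost is a rolling context window plus a small persistent profile carried across sessions.  The structure below is illustrative only; no vendor has published its state model.

```python
# Illustrative conversation state: a capped short-term window of recent
# turns plus a persistent long-term profile.  The structure is invented.
from collections import deque

class SessionState:
    def __init__(self, max_turns=20):
        # Short-term memory: only the last `max_turns` exchanges are kept,
        # bounding the context the bot must track (and the compute cost).
        self.turns = deque(maxlen=max_turns)
        # Long-term memory: facts carried across sessions,
        # e.g. what calms this particular patient down.
        self.profile = {}

    def record(self, user_text, bot_text):
        self.turns.append((user_text, bot_text))

    def remember(self, key, value):
        self.profile[key] = value

state = SessionState()
state.record("I couldn't sleep again", "That sounds exhausting. What was on your mind?")
state.remember("soothing_activity", "walking the dog")
```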

Some Legal and Ethical Issues

The platforms are careful to label their applications as ‘assistants’, meaning specifically that they offer help or support rather than treatment.  If you claim to actually ‘treat’ people, you have crossed over into the heavily regulated territory of practicing medicine.

That doesn’t mean there is no risk of harm to the user, or of claims that could result in legal liability.  Interestingly, because ‘autonomous code’ cannot be found to have ‘moral agency’, it cannot be found criminally liable.  It could, however, be subject to civil product liability laws.  This seems to favor the business models that license the bot to health care providers, who can define treatment and step in to intervene under certain circumstances.

So far only Tess claims HIPAA compliance.  Some have observed that using Facebook Messenger may actually mean that Facebook ‘owns’ the contents of the conversation, the concern being that Facebook could use the data to target this vulnerable population.  All the platforms deny this possibility, as does Facebook, which is reported to say that it has no such intent.

So How Far Have We Come?

I was hoping to find that these apps used generative RNN/LSTM algorithms to dynamically formulate responses, which would represent the state of the art.  That wasn’t the case.

What I did find is that the most sophisticated of these are taking us into new areas of emotion detection and the correlation of those emotions to different courses of action.  At heart, though, they remain elaborate decision trees.

We’ll come back in a year and see how they’ve progressed.

Other articles in this series:

Beginners Guide to Chatbots

Under the Hood With Chatbots

Chatbot Best Practices – Making Sure Your Bot Plays Well With Users

About the author:  Bill Vorhies is Editorial Director for Data Science Central and has practiced as a data scientist since 2001.  He can be reached at:

[email protected]