
Mental health app Wysa raises $5.5M for ‘emotionally intelligent’ AI


It’s hard enough to talk about your feelings to a person; Jo Aggarwal, the founder and CEO of Wysa, is hoping you’ll find it easier to confide in a robot. Or, more specifically, in “emotionally intelligent” artificial intelligence.

Wysa is an AI-powered mental health app designed by Touchkin eServices, Aggarwal’s company, which maintains headquarters in Bangalore, Boston and London. Wysa is something like a chatbot that can respond with words of affirmation, or guide a user through one of 150 different therapeutic techniques.

Wysa is Aggarwal’s second venture. The first was an elder care company that failed to find market fit, she says. Aggarwal found herself falling into a deep depression, from which, she says, the idea of Wysa was born in 2016. 

In March, Wysa became one of 17 apps in the Google Assistant Investment Program, and in May it closed a $5.5 million Series A funding round led by Boston’s W Health Ventures, joined by the Google Assistant Investment Program, pi Ventures and Kae Capital.

Wysa has raised a total of $9 million in funding, says Aggarwal, and the company has 60 full-time employees and about three million users. 

The ultimate goal, she says, is not to diagnose mental health conditions. Wysa is largely aimed at people who just want to vent. Most Wysa users are there to improve their sleep, anxiety or relationships, she says. 

“Out of the 3 million people that use Wysa, we find that only about 10% actually need a medical diagnosis,” says Aggarwal. If a user’s conversations with Wysa suggest scores on par with high marks on traditional screening questionnaires, like the PHQ-9 for depression or the GAD-7 for anxiety disorders, Wysa will suggest talking to a human therapist.
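In code terms, that hand-off could look something like the minimal sketch below. It assumes the standard published cutoffs for the two questionnaires; Wysa’s actual thresholds and scoring method aren’t public, so every name and number here is illustrative.

```python
# A minimal sketch of the referral logic, assuming standard published
# cutoffs (PHQ-9 >= 15 "moderately severe", GAD-7 >= 10 "moderate").
# Wysa's real thresholds and scoring are not public.

PHQ9_CUTOFF = 15  # assumed cutoff; PHQ-9 scores range 0-27
GAD7_CUTOFF = 10  # assumed cutoff; GAD-7 scores range 0-21

def suggest_human_therapist(phq9_estimate: int, gad7_estimate: int) -> bool:
    """Return True when estimated questionnaire scores warrant a referral."""
    return phq9_estimate >= PHQ9_CUTOFF or gad7_estimate >= GAD7_CUTOFF
```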

Naturally, you don’t need to have a clinical mental health diagnosis to benefit from therapy. 

Wysa isn’t intended to be a replacement for a therapist, says Aggarwal (whether users view it as a replacement remains to be seen), but an additional tool that a user can interact with on a daily basis.


“60 percent of the people who come and talk to Wysa need to feel heard and validated, but if they’re given techniques of self help, they can actually work on it themselves and feel better,” Aggarwal continues. 

Wysa’s approach has been refined through conversations with users and through input from therapists, says Aggarwal. 

For instance, while having a conversation with a user, Wysa will first categorize their statements and then assign a type of therapy, like cognitive behavioral therapy or acceptance and commitment therapy, based on those responses. It will then select a line of questioning or a therapeutic technique written ahead of time by a therapist and begin to converse with the user.
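A simplified sketch of that pipeline might look like the following. The category names, mappings and technique labels are hypothetical stand-ins; Wysa’s real classification models and technique library are its own.

```python
from typing import Callable

# Hypothetical mapping from statement categories to therapy types.
THERAPY_FOR_CATEGORY = {
    "negative_self_talk": "CBT",  # cognitive behavioral therapy
    "avoidance": "ACT",           # acceptance and commitment therapy
}

# Lines of questioning written ahead of time by therapists, keyed by therapy.
TECHNIQUES = {
    "CBT": ["thought_record", "cognitive_reframing"],
    "ACT": ["values_clarification", "defusion_exercise"],
}

def pick_technique(statement: str, classify: Callable[[str], str]) -> str:
    """Categorize a statement, map it to a therapy type, pick a scripted technique."""
    category = classify(statement)                       # e.g. an NLU model
    therapy = THERAPY_FOR_CATEGORY.get(category, "CBT")  # fallback is assumed
    return TECHNIQUES[therapy][0]                        # selection logic simplified
```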

Wysa, says Aggarwal, has been gleaning its own insights from over 100 million conversations that have unfolded this way. 

“Take for instance a situation where you’re angry at somebody else. Originally our therapists would come up with a technique called the empty chair technique where you’re trying to look at it from the other person’s perspective. We found that when a person felt powerless or there were trust issues, like teens and parents, the techniques the therapists were giving weren’t actually working,” she says. 

“There are 10,000 people facing trust issues who are actually refusing to do the empty chair exercise. So we have to find another way of helping them. These insights have built Wysa.”

While Wysa has largely been refined in the field, research institutions have also played a role in its ongoing development. Pediatricians at the University of Cincinnati helped develop a module specifically targeted toward COVID-19 anxiety. There are also ongoing studies of Wysa’s ability to help people cope with the mental health consequences of chronic pain, arthritis and diabetes at Washington University in St. Louis and the University of New Brunswick.

Still, Wysa has had several tests in the real world. In 2020, the government of Singapore licensed Wysa and provided the service for free to help residents cope with the emotional fallout of the coronavirus pandemic. Wysa is also offered through the health insurance company Aetna as a supplement to Aetna’s Employee Assistance Program.


The biggest concern about mental health apps, naturally, is that they might accidentally trigger an incident or misread signs of self-harm. To address this, the UK’s National Health Service (NHS) offers specific compliance standards. Wysa is compliant with the NHS’ DCB0129 standard for clinical safety, and was the first AI-based mental health app to earn the distinction.

To meet those guidelines, Wysa appointed a clinical safety officer and was required to create “escalation paths” for people who show signs of self-harm.

Wysa, says Aggarwal, is also designed to flag responses that suggest self-harm, abuse, suicidal thoughts or trauma. If a user’s responses fall into those categories, Wysa will prompt the user to call a crisis line.

In the US, the Wysa app that anyone can download, says Aggarwal, fits the FDA’s definition of a general wellness app or a “low risk device.” That’s relevant because, during the pandemic, the FDA has created guidance to accelerate distribution of these apps. 

Still, Wysa may not perfectly categorize each person’s response. A 2018 BBC investigation, for instance, noted that the app didn’t appear to appreciate the severity of a proposed underage sexual encounter. Wysa responded by updating the app to handle more instances of coercive sex. 

Aggarwal also notes that Wysa maintains a manual list of sentences, often involving slang, that the team knows the AI won’t catch or accurately categorize as harmful on its own. Those are manually updated to ensure that Wysa responds appropriately. “Our rule is that [the response] can be 80% appropriate, but 0% triggering,” she says.
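Taken together, the safety layer Aggarwal describes resembles a rule-based override that runs ahead of the model’s own classification. The sketch below is purely illustrative: the phrase list, category names and function are hypothetical placeholders, not Wysa’s actual implementation.

```python
from typing import Optional

CRISIS_PROMPT = "If you're in distress, please consider calling a crisis line."

# Manually curated phrases (often slang) known to slip past the classifier.
# Entries here are illustrative placeholders, not Wysa's actual list.
MANUAL_FLAG_LIST = {"unalive", "kms"}

# Categories the article says Wysa is designed to flag.
RISK_CATEGORIES = {"self_harm", "abuse", "suicidal_thoughts", "trauma"}

def safety_check(message: str, model_category: str) -> Optional[str]:
    """Run the manual phrase list first, then the model's own category."""
    lowered = message.lower()
    if any(phrase in lowered for phrase in MANUAL_FLAG_LIST):
        return CRISIS_PROMPT
    if model_category in RISK_CATEGORIES:
        return CRISIS_PROMPT
    return None
```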

In the immediate future, Aggarwal says, the goal is to become a full-stack service. Rather than referring patients who do receive a diagnosis to Employee Assistance Programs (as the Aetna partnership might) or outside therapists, Wysa aims to build out its own network of mental health providers.

On the tech side, the company is planning an expansion into Spanish and will start investigating a voice-based system, based on guidance from the Google Assistant Investment Program.

 
