
The Future of Artificial Intelligence in Mental Health


By: Jyotsana Aggarwal & Smriti Joshi, Wysa


The mental image of AI has always been that of a non-sentient being conversing meaningfully with us, starting as an assistant and potentially taking over control from humans. In reality, AI is already on our phones, not just as Siri or Google Allo, but in map navigation, image correction, face recognition, and in deciding which ads we see and which products are recommended to us.

We also live in a world where one in four people suffers from a mental disorder, making it one of the leading causes of disability and ill health. Technology has become an addiction, and it is often blamed as a cause of rising mental health issues.

Could artificial intelligence help it become part of the cure instead? At Touchkin, we have been pushing the boundaries at the intersection of AI and mental health, with some very interesting results.


Artificial intelligence is a relatively new field: the term was first coined in the 1950s by John McCarthy at a computer science conference. The first chatbot was written in 1966 by Joseph Weizenbaum at MIT, and coincidentally it was a Rogerian ‘therapist-bot’ called Eliza. People could spend hours talking to Eliza, but it didn’t really improve their mental health.

Since then, cognitive science has emerged as a new field at the intersection of computer science and psychology. AI engines today not only simulate human conversation; they listen, learn, plan, and solve problems.

Healthcare, normally not an early adopter of new technologies, has seen some of the greatest advances in artificial intelligence. AI engines today assist doctors in improving diagnoses, picking the right treatment, and monitoring care. Bernard J. Tyson, CEO of Kaiser Permanente, told Forbes magazine: "No physician today should be practicing without artificial intelligence assisting in their practice. It’s just impossible (otherwise) to pick up on patterns, to pick up on trends, to really monitor care."


So how is AI helping us improve mental health?

A lot of the work in AI, and more specifically in machine learning, has focused on early identification of mental health issues. Clinically, a person’s tone, word choice, and the length of a phrase are all crucial cues to understanding what’s going on in someone’s mind.

Researchers at Harvard University and the University of Vermont are integrating machine learning tools with Instagram to improve depression screening. Using color analysis, metadata, and algorithmic face detection, they were able to reach 70 percent accuracy in detecting signs of depression. The research wing at IBM is using transcripts and audio from psychiatric interviews, coupled with machine learning techniques, to find patterns in speech that help clinicians accurately predict and monitor psychosis, schizophrenia, mania, and depression. Research led by John Pestian, a professor at Cincinnati Children's Hospital Medical Center, showed that machine learning can be up to 93% accurate in identifying a suicidal person.
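To make the screening idea concrete, here is a minimal Python sketch of how simple photo color statistics might feed an off-the-shelf classifier. It is an illustration of the general approach only, not the researchers' actual pipeline: the image files, labels, and model choice are all hypothetical.

```python
# Illustrative sketch only: photo color statistics -> a depression-screening label.
# Assumes Pillow and scikit-learn; the image files, labels and model are hypothetical.
import numpy as np
from PIL import Image
from sklearn.linear_model import LogisticRegression

def color_features(path):
    """Average hue, saturation and brightness of one photo (HSV space)."""
    hsv = np.asarray(Image.open(path).convert("HSV"), dtype=float) / 255.0
    return hsv.reshape(-1, 3).mean(axis=0)  # [mean_hue, mean_sat, mean_brightness]

# Hypothetical training data: one photo per row, 1 = posted by a user who screened positive.
photos = ["u1.jpg", "u2.jpg", "u3.jpg", "u4.jpg"]
labels = [1, 0, 1, 0]

X = np.array([color_features(p) for p in photos])
model = LogisticRegression().fit(X, labels)

# Probability that a new photo resembles those of users who screened positive.
print(model.predict_proba(color_features("new_user.jpg").reshape(1, -1))[0, 1])
```

In practice, studies like the one above combine many more signals (posting frequency, filters used, faces per photo) and validate against clinical screening instruments rather than simple labels.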


At Touchkin, we are using machine learning to detect depression in high-risk groups, for instance people with diabetes. We use the movement of the phone as a proxy for people’s behavior and are able to detect depression with over 90% accuracy, even in semi-rural settings with basic phones. The beauty of this model is that it is passive: it works even for people who don’t speak English or don’t engage actively on social media when they are depressed.
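As a rough illustration of the passive-sensing approach (not Touchkin's actual model), the sketch below turns a day of phone-movement readings into a small behavioral feature vector and feeds it to a standard classifier; the features, data, and labels are assumptions made purely for the example.

```python
# Illustrative sketch: phone-movement signals -> daily behavior features -> screening.
# The features, data and model here are assumptions for the example, not Touchkin's system.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def daily_features(accel_magnitudes, hours_with_movement):
    """Summarize one day of phone movement as a small feature vector."""
    a = np.asarray(accel_magnitudes, dtype=float)
    return [
        a.mean(),              # overall activity level
        a.std(),               # variability of movement through the day
        hours_with_movement,   # how much of the day the phone moved at all
    ]

# Hypothetical labelled days: 1 = day from a user who screened positive for depression.
days = [
    daily_features([0.1, 0.2, 0.1, 0.1], 5),    # low, flat activity
    daily_features([0.8, 1.2, 0.9, 1.1], 14),   # active day
    daily_features([0.2, 0.1, 0.2, 0.2], 6),
    daily_features([1.0, 0.7, 1.3, 0.9], 15),
]
labels = [1, 0, 1, 0]

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(days, labels)
print(model.predict([daily_features([0.15, 0.1, 0.2, 0.1], 4)]))  # expected: [1]
```

Because the signal comes from the phone's own sensors rather than from text or social posts, this kind of model asks nothing of the user, which is what makes it viable on basic phones and across languages.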

In this, AI can perform better than humans.  However, early diagnosis is only a small part of the solution.

The stigma associated with seeking mental health treatment often leaves people suffering in silence, unable to acknowledge or accept the illness in the presence of another human being. Even when they learn that their condition is treatable, there are many barriers to getting quality mental healthcare, from finding a provider who practices in their area to screening multiple potential therapists to find someone they feel comfortable speaking with. In India, we have a few thousand qualified psychologists to support over a hundred and fifty million people suffering from mental disorders.


Our trials at Touchkin showed that fewer than 5% of the people who were recommended therapy after being identified with depression actually went on to seek treatment for it.

For a student worried about exam results, a new mother dealing with postpartum depression, or a successful businessman with a gambling addiction, talking to their families is often not an option. Even if they share how they feel, they are likely to be judged rather than supported.

Computers are still far from becoming doctors or therapists, in terms of both technical capability and legal frameworks, but with AI they can make great coaches that help people learn mental health skills in a safe yet personalized environment. A bot does not judge, and it could be the first step in helping someone find support.


Research from the TeleMental Health Institute shows that, in the initial stages, conversations about psychological issues with an AI chatbot are perceived to be at least as engaging as, if not more engaging than, those with a real person. To test this, the team at Touchkin created Wysa (a nod to Eliza, the 1960s therapist-bot).

Wysa is an AI-based, emotionally intelligent penguin. It listens, chats with users, and helps them build mental resilience by teaching skills like reframing negative thoughts and mindfulness. It is a non-judgmental, empathetic resource with which users can share just about anything, anytime, anonymously. It can be trained in different languages and cultural contexts, establishing trust and connection across socio-economic strata.

The results were astounding. In less than three months, Wysa had crossed a million conversations with fifty thousand users. Over five hundred people had written in to say how much it had helped them with a mental health problem, and that while it was clearly new and still learning, it was better than any other option they had. Some of these users had been suicidal; others lived with PTSD, social anxiety, depression, or bipolar disorder. Practitioners started offering Wysa between therapy sessions as a way for clients to practice skills.

As with physical health, it is likely that AI will overtake humans in the accuracy of diagnosis and prescription relatively quickly. We don’t see AI bots replacing therapists anytime soon, but they could make therapists far more effective and increase their reach many times over.

Most importantly, for millions of people who feel alone and don’t have a support system of friends and therapists around them, artificial intelligence may well build resilience, provide support and save lives.
