
Bots and ‘digital humans’ in mental healthcare – the next wave of technology?

05 February 2019 - Julia Reynolds

Are you having difficulty getting your head around the use of technology in mental health care? The idea that computer programs might interact with humans in a humanlike way has been around for decades, but implementations are becoming more and more sophisticated.

 

Early work dating from the 1960s found that people were more likely to disclose suicidality when interacting with a computer-based assessment than in face-to-face sessions with clinicians.

 

This era also saw the launch of the famous ELIZA bot, designed to mimic a Rogerian therapist. It works by recognizing keywords in text and responding with pieces of text that simulate active listening. ELIZA was designed to show that robot-human ‘conversations’ have limited meaning, but some people have found ELIZA convincing – perhaps demonstrating that humans tend to seek meaning in ambiguous interactions. ELIZA is still freely available online: http://psych.fullerton.edu/mbirnbaum/psych101/Eliza.htm

 

Bots are certainly becoming more sophisticated. It has been estimated that there are 161 terms to describe bots and related programs. Although there is overlap between these terms, there are some important differences.

 

What can they understand and produce?

Some programs are designed to recognize patterns in human input and then select and activate standardized responses, much like the ELIZA bot. They can’t create new responses to input that they do not recognize.
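To make that mechanism concrete, here is a minimal sketch in Python of ELIZA-style keyword matching. The rules and the respond function are invented for illustration; real scripts like ELIZA’s were far larger, with ranked keywords and input transformations.

```python
import re

# Hypothetical, minimal ELIZA-style rules: each pattern maps to a canned
# reflection. Real systems use much larger rule sets with ranked keywords.
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bmy (mother|father|family)\b", re.IGNORECASE),
     "Tell me more about your {0}."),
]
FALLBACK = "Please go on."  # used whenever no pattern matches

def respond(user_input: str) -> str:
    """Return the canned response for the first matching pattern."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            return template.format(*match.groups())
    return FALLBACK  # the program cannot invent a new reply

if __name__ == "__main__":
    print(respond("I feel anxious about work"))  # Why do you feel anxious about work?
    print(respond("The weather is nice today"))  # Please go on.
```

Note how the second input falls through to the fallback: the bot has no way to create a new response to text it does not recognize.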

 

More sophisticated programs use artificial intelligence and natural language processing. They can learn from their interactions with humans and develop new responses. To see how this works in practice, have a play with Cleverbot, a well-known general-purpose conversational AI platform.
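Cleverbot’s actual algorithms are proprietary, so the sketch below is only a toy illustration of the general idea: a bot that stores replies it is taught and reuses them when similar input recurs. The class and method names (LearningBot, learn) are invented for this example.

```python
from difflib import get_close_matches

class LearningBot:
    """Toy bot that expands its repertoire from user interactions."""

    def __init__(self):
        self.memory: dict[str, str] = {}  # learned input -> response pairs

    def respond(self, text: str) -> str:
        key = text.lower().strip()
        # Look for a previously seen, similar utterance.
        matches = get_close_matches(key, self.memory.keys(), n=1, cutoff=0.6)
        if matches:
            return self.memory[matches[0]]
        return "I don't know how to answer that yet. What should I say?"

    def learn(self, text: str, reply: str) -> None:
        """Store a new response, so future similar inputs get an answer."""
        self.memory[text.lower().strip()] = reply

bot = LearningBot()
print(bot.respond("how are you?"))   # asks to be taught
bot.learn("how are you?", "I'm well, thanks for asking!")
print(bot.respond("How are you"))    # fuzzy match reuses the learned reply
```

Unlike the keyword-matching bot above, this one changes its behaviour over time, which is the key difference the paragraph describes – though production systems learn from millions of conversations rather than one user.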

 

What do they look like?

Program interfaces range from basic text ‘conversations’ to ‘embodied’ platforms that look like humans and draw on input from users’ cameras and microphones to interpret emotion and adjust their responses. Some ‘digital humans’ look quite realistic: https://www.soulmachines.com.

 

How are they used in mental health care?

Researchers are exploring the potential for these tools to provide assessments, interventions and administrative support – for example, the assessments conducted by the SimSensei tool.

Virtual humans are also being used in service provider training – as in this example of a virtual patient in a psychiatry training program: https://www.chatbots.org/research/news/virtual_humans_virtual_reality_patient_mental_clinicians/

 

More research and development is needed before bots will be ready for routine use, but this field is developing rapidly. If you haven’t explored this technology before, you might like to grab a coffee, follow some of the links in this post and drop in to a bot for a chat!

Julia Reynolds

Julia Reynolds MPsych(Clin), MAPS, is a Clinical Psychologist and e-hub Clinical Services Manager at the Centre for Mental Health Research, ANU.


If you need help, please call

  • Lifeline - 13 11 14
  • BeyondBlue - 1300 22 4636
  • Suicide Call Back Service - 1300 659 467