Are you having difficulty getting your head around the use of technology in mental health care? The idea that computer programs might interact with humans in a humanlike way has been around for decades, but implementations are becoming more and more sophisticated.
Early work dating from the 1960s found that people were more likely to disclose suicidality when interacting with a computer-based assessment than in face-to-face sessions with clinicians.
This era also saw the launch of the famous ELIZA bot, designed to mimic a Rogerian therapist. It works by recognizing keywords in the user’s input and responding with scripted phrases that simulate active listening. ELIZA was designed to show that robot–human ‘conversations’ had limited meaning, but some people have found ELIZA convincing – perhaps demonstrating that humans tend to seek meaning in ambiguous interactions. ELIZA is still freely available online – http://psych.fullerton.edu/mbirnbaum/psych101/Eliza.htm
Bots are certainly becoming more sophisticated. It has been estimated that there are 161 terms to describe bots and related programs. Although there is overlap between these terms, there are some important differences.
Some programs are designed to recognize patterns in human input and then select and activate standardized responses, much like the ELIZA bot. They can’t create new responses to input that they do not recognize.
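To make the pattern-matching idea concrete, here is a minimal sketch in Python of the keyword-and-canned-response approach. It is a toy illustration only – the patterns, replies and `respond` function are invented for this example and are not ELIZA’s actual script.

```python
import random
import re

# Toy illustration of keyword-based response selection (not ELIZA's
# actual rules): each pattern maps to canned replies that paraphrase
# the user's own words, simulating active listening.
RULES = [
    (re.compile(r"\bi feel (.+)", re.IGNORECASE),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bmy (mother|father|family)\b", re.IGNORECASE),
     ["Tell me more about your {0}."]),
]
FALLBACKS = ["Please go on.", "Can you tell me more about that?"]

def respond(user_input: str) -> str:
    """Return a canned response for the first matching keyword pattern."""
    for pattern, replies in RULES:
        match = pattern.search(user_input)
        if match:
            # Reuse the matched fragment inside the scripted reply.
            return random.choice(replies).format(match.group(1))
    # Unrecognized input falls back to a generic prompt: the program
    # cannot invent a genuinely new response.
    return random.choice(FALLBACKS)

print(respond("I feel anxious about work"))  # e.g. "Why do you feel anxious about work?"
print(respond("The weather is nice today"))  # e.g. "Please go on."
```

Notice that everything the bot can ever say is fixed in advance; input outside its patterns simply triggers a generic fallback.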
More sophisticated programs use Artificial Intelligence and natural language processing. They can learn from their interactions with humans and develop new responses. To see how this works in practice, have a play with Cleverbot, a well-known general-conversation AI platform.
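By way of contrast, here is an equally simplified sketch of a bot that ‘learns’: it remembers how people replied to its earlier lines and reuses those replies when similar input comes up again. Cleverbot is often described as reusing human replies in a broadly similar way, but everything below (the `memory` dictionary, the similarity threshold, the `respond` function) is a made-up illustration, not anyone’s production system.

```python
from difflib import SequenceMatcher

# Toy "learning" bot: it stores what humans said in reply to its own
# earlier lines, then reuses those replies when similar input recurs.
memory = {}           # bot line -> the human reply it received
last_bot_line = None  # what the bot said most recently

def similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1]; real systems use NLP models."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def respond(user_input: str) -> str:
    global last_bot_line
    # Learn: treat the user's message as a reply to our previous line.
    if last_bot_line is not None:
        memory[last_bot_line] = user_input
    # Retrieve: answer with the human reply recorded for the most
    # similar remembered line, if any is close enough.
    if memory:
        best = max(memory, key=lambda line: similarity(line, user_input))
        if similarity(best, user_input) > 0.5:
            last_bot_line = memory[best]
            return last_bot_line
    last_bot_line = "Tell me more."  # fallback while memory is sparse
    return last_bot_line
```

Each conversation adds to `memory`, so the bot’s repertoire grows over time – the key difference from the fixed-rule program sketched above.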
Program interfaces range from basic text ‘conversations’ to ‘embodied’ platforms that look like humans and use input from users’ cameras and microphones to interpret emotion and adjust their responses. Some ‘digital humans’ look quite realistic: https://www.soulmachines.com.
Researchers are exploring the potential for these tools to provide assessments, interventions and administrative support. See here for an example of assessments conducted by the SimSensei tool.
Virtual humans are also being used in service provider training – as in this example of a virtual patient in a psychiatry training program: https://www.chatbots.org/research/news/virtual_humans_virtual_reality_patient_mental_clinicians/
More research and development is needed before bots will be ready for routine use, but this field is developing rapidly. If you haven’t explored this technology before, you might like to grab a coffee, follow some of the links in this post and drop in on a bot for a chat!
Julia Reynolds MPsych(Clin), MAPS, is a Clinical Psychologist and the e-hub Clinical Services Manager at the Centre for Mental Health Research, ANU.