What 100 suicide notes taught us about creating more empathetic chatbots

While the art of conversation in machines remains limited, it improves with every iteration. As machines are developed to navigate complex conversations, they will face technical and ethical challenges in how they detect and respond to sensitive human issues.

Our work involves building chatbots for a range of uses in health care. Our system, which incorporates multiple artificial intelligence (AI) and natural language processing algorithms, has been in development at the Australian e-Health Research Centre since 2014.

The system has generated several chatbot apps that are being trialled among selected individuals, usually those with an underlying medical condition or who need reliable health-related information.

They include HARLIE for Parkinson’s disease and Autism Spectrum Disorder, Edna for people undergoing genetic counselling, Dolores for people living with chronic pain, and Quin for people who want to quit smoking.

Research has shown that people with certain underlying medical conditions are more likely to think about suicide than the general population. We have to make sure our chatbots take this into account.
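To illustrate the idea in the simplest possible terms, a health chatbot might screen each incoming message for high-risk language before its normal dialogue flow runs, and route the user to crisis support when a match is found. The sketch below is purely illustrative and is not the method used in our system: the phrase list, the `screen_message` function, and the responses are hypothetical placeholders, and a real system would rely on trained classifiers rather than fixed keywords.

```python
from typing import Optional

# Illustrative list only -- a deployed system would use a trained
# classifier and clinical input, not a fixed phrase list.
RISK_PHRASES = (
    "want to die",
    "kill myself",
    "end my life",
    "no reason to live",
)

SUPPORT_MESSAGE = (
    "It sounds like you are going through a really difficult time. "
    "You don't have to face this alone. In Australia, Lifeline "
    "(13 11 14) is available 24 hours a day."
)


def screen_message(user_message: str) -> Optional[str]:
    """Return a supportive referral if the message matches a risk phrase,
    otherwise None so the normal dialogue flow continues."""
    text = user_message.lower()
    if any(phrase in text for phrase in RISK_PHRASES):
        return SUPPORT_MESSAGE
    return None


if __name__ == "__main__":
    reply = screen_message("Some days I feel there is no reason to live.")
    print(reply or "No risk detected; continue normal conversation.")
```

Even in this toy form, the design choice matters: the safety check sits in front of the dialogue engine, so a flagged message is answered with support and referral information rather than whatever the general-purpose conversation model would have produced.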