FINE-TUNING MENTAL HEALTH SUPPORT
While AI tools can facilitate mental health support, experts stressed that the technology is no substitute for the nuanced understanding of human professionals.
Among other considerations, Dr Prem said: “I think that culture specificity (has not been) learnt yet. Whether it’s medicine, psychology, or any kind of clinical work, we need to be culturally sensitive to specific needs of different people.”
Dr Maria Hennessy, associate professor of clinical psychology at James Cook University, said: “AI and chatbots are probably more a dream than a reality at the moment, in terms of how effective they can be for our own individual care.”
The technology is still developing, and there is a long way to go, she told CNA’s Asia First.
“At best, they can give basic information and communication, but they don’t yet have that capacity to give you that empathy and that sense of rapport that you look for in a good mental health clinician,” she added.
“And it’s that empathy and rapport that actually accounts for about 50 per cent of the effectiveness of what we do.”
While more work is needed before AI can be implemented in mental healthcare, particularly on data privacy and ethics, observers said the technology is a step forward in encouraging proactive mental health management.
“They are there to provide basic, accessible information for people who otherwise might have difficulty finding it. But they certainly don’t replace talking to a clinician,” said Dr Hennessy.
“They’ve got a lot of potential, but we are nowhere near them yet.”