Minister warns of ‘danger’ of unsuitable mental health support amid ChatGPT use
A health minister has warned of the “immense danger” of unsuitable mental health care amid a growing trend of people using AI tools like ChatGPT for support.
Baroness Merron did say, however, that the Government is looking at how the digital and online world can be used to support people with their mental health.
Her warning came as Conservative peer Baroness Blackwood of North Oxford asked about “the growing trend that’s being reported of those who are unable to access affordable mental health care and are therefore turning to AI platforms, such as Grok and ChatGPT, which are of course unverified for this use”.
Lady Merron told the House of Lords: “It is very important that people use the right support, otherwise there is immense danger of course in going for what is, perhaps, less suitable.
“We haven’t to my knowledge made a particular assessment, but I will pick up the point.
“On a more positive side, I would say we are looking at what support we can develop in a digital and online sense to support people, not just on waiting lists, but to prevent ill health and assist in their recovery.”
A YouGov survey last year found that 31% of 18-24 year olds would be comfortable talking about their mental health concerns with a confidential AI chatbot.
In March, there were 16.7 million posts on TikTok about using ChatGPT as a therapist, according to The Times.
Meanwhile, NHS waiting times for mental health support from a trained professional can stretch to several years, and private therapy is unaffordable for many.
Dr Jenna Glover, the chief clinical officer of mindfulness app Headspace, which launched an AI companion tool in the UK last year, argued that AI can help “bridge gaps” in a “broken” mental health care system.
However, she said Headspace’s AI was a “sub-clinical support tool” that does not provide mental health guidance, advice or diagnoses, and that humans still have a “central role” in providing care.
Many people who use chatbots to support their mental health say they find them helpful, and a study published in PLOS Mental Health in February found that people were often unable to tell whether therapy text responses were AI-generated or written by a trained professional.
However, there are concerns that AI tools lack the human connection that can help drive improvement, have been known to give dangerous advice, and carry privacy and data risks.
Baroness Merron acknowledged that waiting times for mental health support are too long, but assured peers that improving services is a “key priority” for the Government.
She said: “It is unacceptable that too many people are waiting too long for mental health care.
“Mental health is a key priority for this Government and we are already transforming services including through the introduction new models for community-based care, recruiting 8,500 mental health workers and expanding mental health support teams so that we can provide access to specialist mental health professionals in every school.”
Copyright (c) PA Media Ltd. 2024, All Rights Reserved. Picture (c) Lauren Hurley / PA.