More than 4 in 10 adults in the UK are happy to use ChatGPT for mental health support, according to new research led by Bournemouth University. The study surveyed nearly 31,000 adults in 35 countries about their use of Artificial Intelligence large language models such as ChatGPT, and found a striking level of trust in generative tools taking on roles traditionally filled by humans. The work, published in the journal AI and Society, explored how comfortable people are with delegating responsibilities in areas such as counselling, teaching, healthcare and friendship.
In the area of mental health, 41% of participants from the UK, and 61% globally, said that they would be happy to use Artificial Intelligence for counselling services. The researchers suggest that in the UK this willingness may be linked to long waiting times to access mental health services, which can push people toward digital tools for quicker support. At the same time, they caution that when some of these tools were tested, the language was found to be vague and confusing because developers avoid giving direct diagnoses, so they are not considered a substitute for qualified health professionals. Familiarity with NHS chatbots built on similar technology may also be normalising the use of generative Artificial Intelligence apps such as ChatGPT for mental health care.
Trust extended into education and healthcare, although with more concern from the research team. One quarter of UK adults, and half of those surveyed globally, said they would trust Artificial Intelligence to carry out the role of a teacher and would be happy to delegate the teaching of their children to it. The researchers worry about unknown long-term effects on children's memory and cognitive functions, including whether heavy reliance on such tools and search engines could alter brain structures such as the hippocampus that are important for spatial awareness and learning. In medicine, 45% of all respondents and 25% in the UK said that they would trust Artificial Intelligence to carry out the role of their doctor, with higher numbers in countries where healthcare is more expensive and harder to access. The team also warned that algorithms designed to keep users chatting could be harmful if they do not quickly direct people to crisis services when needed.
The strongest readiness to trust generative tools appeared in the domain of social connection. Over three quarters of people globally, and over half of people in the UK, said they would use an Artificial Intelligence chat tool such as ChatGPT as a companion and a friend. Participants seem to experience a sense of empathy because generative models are designed to adapt their tone to the user and can remember past conversations, which can make interactions feel private, familiar and non-judgemental. The researchers conclude that as Artificial Intelligence moves deeper into everyday life, societies need greater awareness of how generative tools work and of their limitations, and they call for particular caution before allowing them to take over educational roles while the long-term effects on memory and learning are still unknown.
