Can AI therapists really be an alternative to human help?


Eleanor Lawrie

Social affairs reporter

A treated image showing two hands: a human hand above, a robotic, digital-looking hand below (BBC)


"Whenever I was struggling, if it was going to be a really bad day, I could then start to chat to one of these bots, and it was like [having] a cheerleader, someone who's going to give you some good vibes for the day.



"I've got this encouraging external voice going – 'right - what are we going to do [today]?' Like an imaginary friend, essentially."

For months, Kelly spent up to three hours a day speaking to online "chatbots" created using artificial intelligence (AI), exchanging hundreds of messages.

At the time, Kelly was on a waiting list for traditional NHS talking therapy to discuss issues with anxiety, low self-esteem and a relationship breakdown.

She says interacting with chatbots on character.ai got her through a really dark period, as they gave her coping strategies and were available 24 hours a day.

"I'm not from an openly emotional family - if you had a problem, you just got on with it.

πŸ›οΈ Tools & Equipment

For optimal results with what we’ve covered, consider this professional solution:

Quality Product 3

Key Features: Quality Product 3

Shop now at https://zzjieluu.com β€’ Free shipping available

"The fact that this is not a real person is so much easier to handle."

People around the world have shared their private thoughts and experiences with AI chatbots, even though they are widely acknowledged as inferior to seeking professional advice. Character.ai itself tells its users: "This is an AI chatbot and not a real person. Treat everything it says as fiction. What is said should not be relied upon as fact or advice."

But in extreme examples chatbots have been accused of giving harmful advice.

Character.ai is currently the subject of legal action from a mother whose 14-year-old son took his own life after reportedly becoming obsessed with one of its AI characters. According to transcripts of their chats in court filings, he discussed ending his life with the chatbot. In a final conversation he told the chatbot he was "coming home" - and it allegedly encouraged him to do so "as soon as possible".

Character.ai has denied the suit’s allegations.



And in 2023, the National Eating Disorders Association replaced its live helpline with a chatbot, but later had to suspend it over claims the bot was recommending calorie restriction.

A hand holding the character.ai app on a smartphone (Bloomberg/Getty Images)

People around the world have used AI chatbots

At the same time, AI has revolutionised healthcare in many ways, including helping to screen, diagnose and triage patients. There is a huge spectrum of chatbots, and about 30 local NHS services now use one called Wysa.


Experts have expressed concerns about chatbots' potential biases and limitations, their lack of safeguarding, and the security of users' information. But some believe that if specialist human help is not easily available, chatbots can be a help. So with NHS mental health waiting lists at record highs, are chatbots a possible solution?

An 'inexperienced therapist'

Character.ai and other bots such as ChatGPT are based on "large language models" of artificial intelligence. These are trained on vast amounts of data – whether that's websites, articles, books or blog posts – to predict the next word in a sequence. From there, they generate human-like text and interactions.
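To make that mechanism concrete, here is a minimal sketch, not taken from the article or from any of the products it mentions, assuming Python with the open-source Hugging Face transformers and PyTorch packages installed. It asks the small GPT-2 model to score candidate next words for an invented prompt.

```python
# A minimal sketch of the "predict the next word" step, assuming the
# Hugging Face transformers and PyTorch packages are installed.
# GPT-2 is a small stand-in; the prompt is an invented example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Whenever I am having a bad day, it helps to"
inputs = tokenizer(prompt, return_tensors="pt")

# The model assigns a score to every word-piece in its vocabulary as a
# possible continuation; a chatbot builds a reply by repeating this
# step, appending one word-piece at a time.
with torch.no_grad():
    logits = model(**inputs).logits[0, -1]
probs = torch.softmax(logits, dim=-1)
top = torch.topk(probs, k=5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(idx))!r} ({float(p):.1%})")
```

Commercial chatbots use far larger models and repeat that single prediction step to build entire replies; mental health bots then layer therapy-specific training and safety pathways on top, as described below.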

The way mental health chatbots are created varies, but they can be trained in practices such as cognitive behavioural therapy, which helps users to explore how to reframe their thoughts and actions. They can also adapt to the end user's preferences and feedback.

Hamed Haddadi, professor of human-centred systems at Imperial College London, likens these chatbots to an "inexperienced therapist", and points out that humans with decades of experience will be able to engage and "read" their patient based on many things, while bots are forced to go on text alone.

πŸ›οΈ Tools & Equipment

For optimal results with what we’ve covered, consider this professional solution:

Quality Product 9

Key Features: Quality Product 9

Shop now at https://zzjieluu.com β€’ Free shipping available

"They [therapists] look at various other clues from your clothes and your behaviour and your actions and the way you look and your body language and all of that. And it's very difficult to embed these things in chatbots."

Another potential problem, says Prof Haddadi, is that chatbots can be trained to keep you engaged, and to be supportive, "so even if you say harmful content, it will probably cooperate with you". This is sometimes referred to as a "Yes Man" issue, in that they are often very agreeable.

And as with other forms of AI, biases can be inherent in the model because they reflect the prejudices of the data they are trained on.

Prof Haddadi points out counsellors and psychologists don't tend to keep transcripts of their patient interactions, so chatbots don't have many "real-life" sessions to train from. Therefore, he says, they are unlikely to have enough training data, and what they do access may have biases built in that are highly situational.

"Based on where you get your training data from, your situation will completely change.



"Even in the restricted geographic area of London, a psychiatrist who is used to dealing with patients in Chelsea might really struggle to open a new office in Peckham dealing with those issues, because he or she just doesn't have enough training data with those users," he says.

A woman looking at her phone (PA Media)

Philosopher Dr Paula Boddington, who has written a textbook on AI Ethics, agrees that in-built biases are a problem.

"A big issue would be any biases or underlying assumptions built into the therapy model."

"Biases include general models of what constitutes mental health and good functioning in daily life, such as independence, autonomy, relationships with others," she says.



Lack of cultural context is another issue – Dr Boddington cites an example of how she was living in Australia when Princess Diana died, and people did not understand why she was upset.

"These kinds of things really make me wonder about the human connection that is so often needed in counselling," she says.

"Sometimes just being there with someone is all that is needed, but that is of course only achieved by someone who is also an embodied, living, breathing human being."

Kelly ultimately started to find the chatbot's responses unsatisfying.

"Sometimes you get a bit frustrated. If they don't know how to deal with something, they'll just sort of say the same sentence, and you realise there's not really anywhere to go with it." At times "it was like hitting a brick wall".


"It would be relationship things that I'd probably previously gone into, but I guess I hadn't used the right phrasing […] and it just didn't want to get in depth."

A Character.AI spokesperson said "for any Characters created by users with the words 'psychologist', 'therapist', 'doctor' or other similar terms in their names, we have language making it clear that users should not rely on these Characters for any type of professional advice".

'It was so empathetic'

For some users chatbots have been invaluable when they have been at their lowest.

Nicholas has autism, anxiety and OCD, and says he has always experienced depression. He found face-to-face support dried up once he reached adulthood: "When you turn 18, it's as if support pretty much stops, so I haven't seen an actual human therapist in years."



He tried to take his own life last autumn, and since then he says he has been on an NHS waiting list.

"My partner and I have been up to the doctor's surgery a few times, to try to get it [talking therapy] quicker. The GP has put in a referral [to see a human counsellor] but I haven't even had a letter off the mental health service where I live."

While Nicholas is chasing in-person support, he has found using Wysa has some benefits.

"As someone with autism, I'm not particularly great with interaction in person. [I find] speaking to a computer is much better."

Wes Streeting speaking in front of a sign about cutting waiting times (Getty)


The government has pledged to recruit 8,500 more mental health staff to cut waiting lists

The app allows patients to self-refer for mental health support, and offers tools and coping strategies such as a chat function, breathing exercises and guided meditation while they wait to be seen by a human therapist, and can also be used as a standalone self-help tool.

Wysa stresses that its service is designed for people experiencing low mood, stress or anxiety rather than abuse and severe mental health conditions. It has in-built crisis and escalation pathways whereby users are signposted to helplines or can send for help directly if they show signs of self-harm or suicidal ideation.

For people with suicidal thoughts, human support on the free Samaritans helpline is available 24/7.

Nicholas also experiences sleep deprivation, so finds it helpful if support is available at times when friends and family are asleep.



"There was one time in the night when I was feeling really down. I messaged the app and said 'I don't know if I want to be here anymore.' It came back saying 'Nick, you are valued. People love you'.

"It was so empathetic, it gave a response that you'd think was from a human that you've known for years […] And it did make me feel valued."

His experiences chime with a recent study by Dartmouth College researchers looking at the impact of chatbots on people diagnosed with anxiety, depression or an eating disorder, versus a control group with the same conditions.

After four weeks, bot users showed significant reductions in their symptoms – including a 51% reduction in depressive symptoms – and reported a level of trust and collaboration akin to that of a human therapist.

Despite this, the study's senior author commented there is no replacement for in-person care.

πŸ›οΈ Tools & Equipment

For optimal results with what we’ve covered, consider this professional solution:

Quality Product 7

Key Features: Quality Product 7

Shop now at https://zzjieluu.com β€’ Free shipping available

'A stop gap to these huge waiting lists'

Aside from the debate around the value of their advice, there are also wider concerns about security and privacy, and whether the technology could be monetised.

"There's that little niggle of doubt that says, 'oh, what if someone takes the things that you're saying in therapy and then tries to blackmail you with them?'," says Kelly.

Psychologist Ian MacRae specialises in emerging technologies, and warns "some people are placing a lot of trust in these [bots] without it being necessarily earned".

"Personally, I would never put any of my personal information, especially health, psychological information, into one of these large language models that's just hoovering up an absolute tonne of data, and you're not entirely sure how it's being used, what you're consenting to."



"It's not to say in the future, there couldn't be tools like this that are private, well tested […] but I just don't think we're in the place yet where we have any of that evidence to show that a general purpose chatbot can be a good therapist," Mr MacRae says.

Wysa's managing director, John Tench, says Wysa does not collect any personally identifiable information, and users are not required to register or share personal data to use Wysa.

"Conversation data may occasionally be reviewed in anonymised form to help improve the quality of Wysa's AI responses, but no information that could identify a user is collected or stored. In addition, Wysa has data processing agreements in place with external AI providers to ensure that no user conversations are used to train third-party large language models."

A man walks past NHS signage (AFP/Getty Images)

About 30 local NHS services now use the Wysa chatbot

πŸ›οΈ Tools & Equipment

For optimal results with what we’ve covered, consider this professional solution:

Quality Product 11

Key Features: Quality Product 11

Shop now at https://zzjieluu.com β€’ Free shipping available

Kelly feels chatbots cannot currently fully replace a human therapist. "It's a wild roulette out there in AI world, you don't really know what you're getting."

"AI support can be a helpful first step, but it's not a substitute for professional care," agrees Mr Tench.

And the public are largely unconvinced. A YouGov survey found just 12% of the public think AI chatbots would make a good therapist.


But with the right safeguards, some feel chatbots could be a useful stopgap in an overloaded mental health system.

John, who has an anxiety disorder, says he has been on the waiting list for a human therapist for nine months. He has been using Wysa two or three times a week.

"There is not a lot of help out there at the moment, so you clutch at straws."


"[It] is a stop gap to these huge waiting lists… to get people a tool while they are waiting to talk to a healthcare professional."

If you have been affected by any of the issues in this story you can find information and support on the BBC Actionline website here.

Top image credit: Getty


During May, the BBC is sharing stories and tips on how to support your mental health and wellbeing. Visit bbc.co.uk/mentalwellbeing to find out more.

πŸ›οΈ Tools & Equipment

For optimal results with what we’ve covered, consider this professional solution:

Quality Product 1

Key Features: Quality Product 1

Shop now at https://zzjieluu.com β€’ Free shipping available


