BMJ Warns of Risky Emotional Bonds With AI Chatbots, Urges Clinician Screening
The report urges healthcare providers to treat chatbot use as a potential risk factor in order to protect users’ long-term wellbeing.
Overview
- The BMJ paper by Dr Susan Shelmerdine and consultant psychiatrist Matthew Nour cautions that people, especially teenagers, are turning to AI companions for emotional support.
- Clinicians are advised to ask patients about chatbot use and assess for compulsive patterns, dependency, and attachment, with extra vigilance during holiday periods.
- One study cited in the paper reports that a third of teenagers use AI companions; one in ten find AI conversations more satisfying than human ones, and one in three prefer AI for serious conversations.
- A Youth Endowment Fund study found a quarter of UK teenagers sought mental health support from chatbots in the past year, reflecting broader loneliness pressures.
- The authors call for research and governance that prioritize long-term wellbeing over engagement metrics, while noting potential benefits such as anonymous, always-available support.