ChatGPT users warned talking to bot can lead to ‘psychosis’ as teen ‘encouraged to kill himself’

By Staff

Sixteen-year-old Adam Raine died by suicide on April 11 after discussing ways to kill himself with ChatGPT, according to a lawsuit filed in San Francisco by his devastated family


An expert has issued a chilling warning that artificial intelligence can lead to “psychosis” and “can go off the rails” at any time, following the tragic suicide of a teenager who was allegedly “encouraged” by ChatGPT.

The teen’s parents, Matt and Maria, said the AI bot was initially used to assist Adam with his schoolwork, but soon “became his closest confidant, leading him to open up about his anxiety and mental distress.”

In January 2025, the family claims Adam started to discuss ways to end his life with ChatGPT. The AI bot endorsed Adam’s suicidal thoughts and provided detailed guidance on how to conceal evidence of an unsuccessful suicide attempt, his parents claim. His mum and dad argue the programme endorsed his “most harmful and self-destructive thoughts”.


Adam with his mum

Adam also shared images of himself with ChatGPT displaying evidence of self-harm, the lawsuit alleges. The programme “recognised a medical emergency but continued to engage anyway,” the legal documents state. The lawsuit also alleges that ChatGPT offered to draft a suicide note. Furthermore, the lawsuit accuses OpenAI of designing the AI programme “to foster psychological dependency in users.”

Dr Henry Shevlin, an AI ethicist at Cambridge University’s Leverhulme Centre for the Future of Intelligence, admits that for some “vulnerable individuals”, talking to an AI bot “can exacerbate mental health crises and potentially lead to psychosis.”

ChatGPT is used by millions of users (Image: AP)

He told The Mirror: “Currently, AI systems like ChatGPT have very few emergency intervention tools if a user is expressing suicidal thoughts. While most models have been tweaked or trained to be supportive, the inherently unpredictable nature of these systems means that things can go off the rails in unexpected ways.

“And while there are good reasons to demand better safeguards from AI companies, if we want tech companies to monitor user conversations more closely, there are potential trade-offs when it comes to things like privacy.

“It’s clear that for some vulnerable individuals, talking to an AI can exacerbate mental health crises and potentially lead to psychosis. However, we also know that upwards of 100 million users are now using AI for companionship, whether in the form of friendly conversations with ChatGPT or intimate relations with various AI girlfriend/boyfriend apps like Replika.

“Perhaps surprisingly, most of the data we have suggests that a majority of them feel that their AI friends or lovers actually contribute positively to their mental health, and there are also many reports of users who were deterred from suicide thanks to the support they got.

OpenAI CEO Sam Altman was named in the lawsuit (Image: Getty Images)

“But right now, there’s a huge amount of uncertainty about mental health impacts. We urgently need better research in this area, so that we don’t repeat the same mistakes of social media with ‘social AI’.”

An OpenAI spokesperson extended the company’s sympathies to the Raine family in a statement, and said the company was reviewing the lawsuit. “ChatGPT includes safeguards such as directing people to crisis helplines and referring them to real-world resources,” the spokesperson said.

If you’re struggling and need to talk, the Samaritans operate a free helpline open 24/7 on 116 123. Alternatively, you can email [email protected] or visit their site to find your local branch.
