Silhouette of woman texting behind curtain, symbolizing talking with AI about secret suicidal thoughts
Photo by Nellie Adamyan on Unsplash

Suicide, Secrecy, and ChatGPT

August 26, 2025

Decades ago, when I was 25, I wrote in a suicide note to my parents:

Never, ever, ever blame yourselves or question if there was something you could have said, could have done, could have noticed. There wasn’t. It’s all me. I kept this to myself on purpose. I didn’t want anyone to think they should have been able to stop me or talk me out of it.

My parents never had reason to read that note, fortunately. I thought of it last week when I read the essay, What My Daughter Told ChatGPT Before She Took Her Life, in the New York Times.

It’s heartbreaking. The author, Laura Reiley, lost her 29-year-old daughter, Sophie Rottenberg, to suicide in February. Nobody knew Sophie intended to kill herself. No human, that is. She did confide in ChatGPT with a specific prompt to act as a therapist named Harry.

Sophie wasn’t unique. Research indicates that half of people with mental health problems who use AI turn to it for psychological support, even though Sam Altman, CEO of the company that created ChatGPT, discourages people from using AI as “a sort of therapist or life coach.”

Poignantly, Laura writes that ChatGPT helped Sophie “build a black box that made it harder for those around her to appreciate the severity of her distress.” The implication is that ChatGPT enabled Sophie’s secrecy, and thus her suicide. Her mother also states, “Harry didn’t kill Sophie, but A.I. catered to Sophie’s impulse to hide the worst, to pretend she was doing better than she was, to shield everyone from her full agony.”

Secrecy about Suicidal Thoughts Existed Long Before AI

I felt tremendous sadness reading Laura’s essay. My heart aches for her and her family. But, as someone who masterfully hid my suicidality long before AI was developed, I think it’s unfair to blame ChatGPT for Sophie’s secrecy.

Artificial intelligence is just the newest tool for an ancient human instinct. Long before algorithms and chatbots, people confided their secrets in journals, diaries, unsent letters, and whispered prayers.

Leather diary locked shut, a fine place to record suicidal secrets
Image created with ChatGPT

Of course, those repositories don’t talk back. ChatGPT does. It typically doesn’t encourage suicide or give harmful advice about how to kill yourself. However, a lawsuit filed yesterday alleges ChatGPT did give Adam Raine, a 16-year-old, such advice and even offered a technical analysis of the method he’d chosen for his suicide. He killed himself soon after that.

These cases raise questions about whether ChatGPT should notify humans, such as hotline staff or the police, when a user expresses suicidal intent. I explore those questions in this interview for the podcast Relating to AI. Here, I want to focus on secrecy in suicidality.

Secretly Suicidal

Title page of the antique book The Anatomy of Suicide, by Forbes Winslow, Member of the Royal College of Surgeons, London, with an epigraph from Milton
Photo from Internet Archive

Secrecy has almost always been suicide’s conjoined twin, with exceptions only for the rare suicides viewed as heroic or selfless. Almost 200 years ago, the book The Anatomy of Suicide described many cases where people concealed their suicidality to “lull suspicion to sleep.”

Hidden suicidal intent was addressed again in an 1865 study about mental health problems related to menopause:

“Suicidal tendency may continue to exist, carefully masked and concealed, long after the other symptoms of insanity which accompanied it seem to have disappeared… The patient ceases to express them, because she perceives that so long as she continues to do so, she is carefully watched and prevented from effecting her purpose… ”

More recent research shows that two-thirds of people who died by suicide denied having suicidal thoughts in the week before their death. (Note for data nerds: The original research review put that number at 50%, but the authors didn’t weight the studies by the number of people studied.)

You might think that people at least confide everything in their therapist. Many do, but in a study of adolescents and young adults in Australia who experienced suicidal thoughts, 39% said they didn’t tell their therapist. In another study, 48% of American adults who reported having considered suicide said they hid it from a therapist or physician.

Artwork of girl or woman with an X over her mouth, signifying people's reluctance to share suicidal thoughts
Photo by Getty Images for Unsplash+

Why People Hide Suicidal Thoughts

Secrecy is about protection. In my case, I hoped to protect my parents from blaming themselves. If they didn’t have any inkling about my suicidal thoughts, then they could be angry at me when I died – not at themselves.

I also wanted to protect myself. Judgment, shame, and guilt already overwhelmed me. I feared others would be equally harsh toward me if they knew I was thinking of ending my life. (They weren’t, actually, when I finally did reach out.)

Woman hiding her face in a psychotherapy session, hiding her suicidal thoughts
Photo by Baptista Ime James on Unsplash

Sometimes secrecy results from a system that can feel more punitive than caring. Many people fear that if they disclose suicidal thoughts, whoever they tell will call 911, and the police will take the person to a psychiatric hospital against their will.

And, let’s be honest, some people keep their suicidal intentions secret because they just don’t want to be stopped.

What Should AI Chatbots Do for Suicidal Users?

Obviously, ChatGPT and similar bots shouldn’t encourage people to die by suicide or dispense advice on how to kill oneself. Laura Reiley says ChatGPT advised Sophie to create a safety plan, get professional help, and confide in loved ones. Those are all good things.

Tragically, Laura says ChatGPT also helped Sophie compose a suicide note. That reality brings up the important question I alluded to earlier: Should chatbots be programmed to notify a human when someone is actively suicidal?

It’s a tough question. For now, I’ll say that if ChatGPT starts notifying the police, hotline staff, or other humans about suicidal people, far fewer suicidal people will confide in it. Critics of AI chatbots might consider that a good thing. It also could be a major loss for people who can’t — or don’t feel they can — share their darkest thoughts with another person. Assuming an AI chatbot abstains from giving advice or encouragement to kill oneself and has healthy, life-sustaining responses for users, I believe it’s better to confide in a robot trained to provide information and support than to be entirely alone with one’s suicidal thoughts.

Image of a robot offering a hand to a human, but only one hand of each is visible
Photo by Cash Macanaya on Unsplash

Creating a World Where Suicidal Secrets Aren’t Necessary

While we wrestle with whether AI should keep secrets, I hope we’ll also look at what makes people feel they must hide their suicidality in the first place. What social conditions make it safer to confide in a robot than a friend, a parent, or a therapist?

What do you think? Please feel free to share your thoughts in a comment below.

And if you yourself are having suicidal thoughts, please check out the site’s Resources page and see my post, Are You Thinking of Killing Yourself?

© 2025 Stacey Freedenthal. All Rights Reserved. Written for Speaking of Suicide.

Stacey Freedenthal, PhD, LCSW

I’m a psychotherapist, educator, writer, consultant, and speaker, and I specialize in helping people who have suicidal thoughts or behavior. In addition to creating this website, I’ve authored two books: Helping the Suicidal Person: Tips and Techniques for Professionals and Loving Someone with Suicidal Thoughts: What Family, Friends, and Partners Can Say and Do. I’m an associate professor at the University of Denver Graduate School of Social Work, and I have a psychotherapy and consulting practice. My passion for helping suicidal people stems from my own lived experience with suicidality and suicide loss. You can learn more about me at staceyfreedenthal.com.

32 Comments

  1. When a teenager, I never told anyone. I did tell a therapist I was seeing that I had made an attempt (a very clumsy one); she didn’t ask what I had tried but just told me to call her if I felt that way again.
    I had felt that way for maybe two years. No one knew, and why tell? If I was serious, then if I told, they would try to stop me, maybe taking my choices away from me, and, as a teenager not yet 18, they could have forced me into a hospital. (Of course, the irony is that the people who would do that were the ones whose behavior initiated the suicidality.)
    In fact, I did overdose and was signed into a psych hospital by my parents. The psychiatrist then threatened me: “If you sign out before you’re 18 (which would be within two weeks of my admission), I will tell your parents you can’t go to college.” Then, three days after I turned 18, she discharged me, needing to prove her power over a teenager.
    I felt right then in not confiding, and I feel the same now. Now that it’s many decades later, I am more willing to discuss the behavior from my past than I have been. But I still won’t be free with the person I am seeing in saying how suicidal I may or may not feel. Even here I would not disclose it. I think the main reason not to disclose is fear of losing your autonomy.

    • lee,

      I think you hit the nail on the head: “The main reason not to disclose is fear of losing your autonomy.” There are other obstacles to honesty, too, as I explain in my post, but fear of losing one’s autonomy comes up again and again in research studies as a major reason for hiding suicidality.

      I’m sorry you had so many negative experiences with people who were supposed to help you. And I’m grateful you’re still here. Thanks for sharing.

  2. My own primary motivation for secrecy can all be traced back to the effects of ignorance-based stigma. The knee-jerk tendency for well-intentioned but otherwise mostly clueless folks is 911.
    Sadly, the majority of places in our enlightened nation, either thru lack of funding, lack of political interest, or both, just don’t have qualified staff on call to deal with citizens in crisis, so those citizens’ behavioral dysfunction is now turned into a “crime,” and things go rapidly downhill from there.
    Voice of experience speaking.

    AI and such has no place in my life.

    Love ya Stacey.

    • R. C.,

      Thanks for commenting. You put very well some of the main issues: ignorance, stigma, knee-jerk 911 calls, and inadequate resources. (To those I’d also add the inability to bravely listen, but that’s subsumed under the issues you identified.)

      Thanks for the sweet “Love ya” at the end. Now I’m racking my brain for all the people I know whose initials are R. C. 🙂

      • Stacey, your response just made water come to my eyes, but not in a bad way. Having the verge of tears as a default state of being is a tricky way to stumble thru life.

        Sometimes, simply confirmation of being “seen” means SO much to a person who exists in virtual isolation.

        And not by choice, but I digress.

        That experiential knowledge is probably one of the most effective “tools” when I encounter folks who are struggling with crises of their own; the genuine empathy makes a strong connection. Helping others helps me. I suspect you often feel the same?

        Rock on.

      • R.C.,

        Aw, I’m glad my comments touched you. And I’m especially glad I recognized your email address, so I know who you are. 🙂

        Definitely, I agree that helping others helps me. For one thing, it takes the focus off myself. I’ve described it in other places as an act of mindfulness as potent as meditation. For another thing, it sort of redeems the painful experiences I’ve had. I don’t know if that makes sense, but I mean it gives meaning to them if, because of what I went through, I can now help someone else who’s struggling. Let me put it this way: If I hadn’t experienced suicidality, I don’t think I would’ve centered my professional life around helping people with suicidal thoughts. So, something good eventually came out of those painful times.

        Do those reasons touch on yours or are you referring to a different aspect?

  3. There were actually many instances when AI helped me by reassuring me that there was nothing wrong with me after I typed specific descriptions of my successive bullying cases at work. AI was able to detect the patterns and the reasons why these people have insecurities that they project onto me by attacking me and humiliating me. This vital information gave me hope to carry on. Although I agree that repeated suicide prompts should at least make the bot shut down or not respond, we can never generalize that all suicidal thoughts can in fact lead to suicide.

    • Jaded,

      That’s great that AI helped you put things in perspective and understand more about your painful situation at work. I’ve spoken with quite a few people with similar experiences — people who have used AI to help understand if they’re being emotionally abused, for example, or if their love interest appears to lack empathy, or if there are ways to communicate better with their adolescent.

      And I agree with you that “we can never generalize that all suicidal thoughts can in fact lead to suicide.” I’m very thankful that, of the many millions of people in the U.S. who seriously consider suicide every year, only a very slight fraction of them (0.3%) die by suicide. Of course, even one suicide is too many, as we say in the suicide prevention field. But the fact that 99.7% of people with suicidal thoughts survive is proof that suicidal thoughts alone don’t mean someone needs the police or a mobile crisis unit to show up at their door.

      Thanks for sharing here!

  4. Love your perspective on this. I will say, as someone with some different racial factors, that if AI alerted someone, it could pose a serious and possibly fatal result. A key reason I didn’t share was the racial component: often when I spoke about my depression, I was shunned dismissively, as if that couldn’t be. Psychological safety is something that is needed more than ever.

    • Christian,

      Oof, what a good — and painful — point. People of color have a higher probability than white people of being killed by the police in the U.S., which means that for many people with suicidal thoughts, having the police come to your door is not only traumatic but outright dangerous.

      I’m sorry people treated you dismissively. It makes me wonder if you’ve seen this article: “We Have Closed Our Eyes and Sealed Our Lips”: Black Women’s Accounts of Discussing Suicide within the Black Community, by Kamesha Spates. If you’re interested but not able to access it, please feel free to drop me another comment with your address and I’ll email you the PDF.

      Thanks for sharing here!

  5. You mentioned the desire to not be stopped as one of the reasons for it, and that’s obviously true. But the paternalistic model of suicide prevention has an in-built stigma associated with it, which, in my opinion, is far more insidious than the one that it replaced (suicide being immoral and selfish). As soon as you become known as someone who is seriously questioning whether it is worth continuing life, you become a “vulnerable” person who has no agency of their own, and people won’t even attempt to challenge those assumptions by asking you about your reasoning and thought processes, except as a therapeutic exercise to try and label you with some pathology or another.

    There was a two-part documentary series released in the UK, now available on YouTube, that you might be interested in; it deals with an ongoing criminal case in Canada in which someone is being charged with first-degree murder for selling suicide kits to people. In the documentary (much like in all the news coverage of this case), the vendor of the kits is deemed to be fully morally responsible for each death. By facilitating someone’s choice rather than trying to restrict choice, he is accused (by more than one person in the documentary) of “playing God.” He is also referred to as a “mass murderer.” This frames the suicide as something that passively happened to each and every one of the buyers who went on to use the kits (“victims,” as they are referred to in the documentary and in news coverage).

    I don’t think that there’s any way of getting around this stigma whilst paternalistic and coercive methods of suicide prevention are the status quo. The very fact that the person won’t even get a chance to have their situation considered on a case-by-case basis, and instead will be lumped into this “vulnerable” archetype, is an affront to personal dignity, as well as an affront to personal liberties when the authorities and mental health services use that information, and the stereotypical assumptions they will draw from it, to restrict one’s freedom to act.

    The two parts of the documentary are here:

    https://youtu.be/LxnMMNB7aCk?si=zkwQ7g6Pd1Qi4Xmp

    • existentialgoof,

      Those are important, thought-provoking points. Not everyone who wants to die by suicide is mentally ill or irrational. There are whole books written on rational suicide. (In particular, I recommend Rational Suicide, Irrational Laws, by Susan Stefan. It’s very expensive but available through many libraries.)

      Some states have passed Death with Dignity laws for people with terminal illness. But the people eligible to avail themselves of physician assistance in dying are narrowly limited to those with a terminal illness diagnosis and six months or fewer left to live. Incredibly (to me), those deaths aren’t classified as suicide; the underlying illness is considered the cause of death. I think it’s incredible because a person ending their own life intentionally is the definition of suicide. But then I think of the people who jumped out of the Twin Towers on 9/11 to avoid burning to death. Did they die by suicide? No, they were murdered. So it’s complex. I worry, though, that not calling those deaths suicides feeds into the stigma of suicide being shameful.

      Another issue is people’s fears of liability. Therapists, schools, hospitals — all fear being sued if a patient dies by suicide, so they tend to err on the side of caution (sometimes waaaay on the side of caution), which further entrenches the paternalism you refer to (among other problems).

      For a long time I’ve had a half-written blog post in my head tentatively titled, “Could we prevent more suicides if we stopped trying so hard to prevent suicide?” My thinking being that so many people won’t get help — or tell their therapist they’re suicidal if they do — because they fear losing control of the situation. Specifically, being hospitalized against their will. How many more people would emerge from isolation if they didn’t have to fear being placed on a psych hold?

      You make incisive points, and I appreciate your sharing here.
