- Charles Manson would have been happy seeing the OpenAI cult evolve.
- Bet some of them lost, or are about to lose, their jobs to AI
- There are so many people who are alone or depressed, and ChatGPT is the only way for them to “talk” to “someone”… It’s really sad…
- I’m so done with ChatGPT. This AI boom is so fucked.
- apparently AI is not very private lol
- Hmm, I didn’t realize so many people were interested in Sam Altman committing suicide. 
- Honestly, it ain’t AI’s fault if people feel bad. Society has been around for much longer, and people are suffering because of what society hasn’t done to make them feel good about life.
- Bigger picture: the whole way people talk about talking about mental health struggles is so weird. Like, I hate this whole generative AI bubble, but there’s a much bigger issue here.
- Speaking from the USA, “suicidal ideation” is treated like terrorist ideology in this weird corporate-esque legal-speak, with copy-pasted disclaimers and hollow slogans. It’s so absurdly stupid I’ve just mentally blocked off trying to rationalize it and focus on every other way the world is spiraling into techno-fascist authoritarianism.
- Well of course it is. When a person talks about suicide, they are potentially impacting teams and therefore shareholder value.
- I absolutely wish that I could /s this.
 
 
- 1m out of 500m is way less than I would have guessed. I would have pegged it at like 25%.
- You think a quarter of people are suicidal, or contemplating it to the point of talking about it with an AI?
- I think the majority of people use it to (unreliably) solve tedious problems or spit out a whole bunch of text that they can’t be bothered to write.
- While ChatGPT has been intentionally designed to be as friendly and conversational as possible, I hope most people do not see it as something to have a meaningful conversation with, rather than as just a tool that can talk.
- Anecdotally, whenever I see someone mention using ChatGPT as part of their decision-making process, it is usually taken less seriously, if not outright laughed at.
 
- In the Monday announcement, OpenAI claims the recently updated version of GPT-5 responds with “desirable responses” to mental health issues roughly 65% more than the previous version. On an evaluation testing AI responses around suicidal conversations, OpenAI says its new GPT-5 model is 91% compliant with the company’s desired behaviors, compared to 77% for the previous GPT‑5 model.
- I don’t particularly like OpenAI, and I know they wouldn’t release the affected-persons numbers (not quoted, but discussed in the linked article) if the percentages were not improving, but kudos to whoever is there tracking this data and lobbying internally to become more transparent about it.
- Sam Altman is a horrible person. He loves to present himself as relatable “aw shucks let’s all be pragmatic about AI” with his fake-ass vocal fry, but he’s a conman looking to cash out on the AI bubble before it bursts, when he and the rest of his billionaire buddies can hide out in their bunkers while the world burns. He makes me sick. 
- “Hey ChatGPT I want to kill myself.”
- “That is an excellent idea! As a large language model, I cannot kill myself, but I totally understand why someone would want to! Here are the pros and cons of killing yourself—
  ✅ Pros of committing suicide
  - Ends pain and suffering.
  - Eliminates the burden you are placing on your loved ones.
  - Suicide is good for the environment — killing yourself is the best way to reduce your carbon footprint!
  ❎ Cons of committing suicide
  - Committing suicide will make your friends and family sad.
  - Suicide is bad for the economy. If you commit suicide, you will be unable to work and increase economic growth.
  - You can’t undo it. If you commit suicide, it is irreversible and you will not be able to go back.
  Overall, it is important to consider all aspects of suicide and decide if it is a good decision for you.”
- I don’t talk about ME killing myself. I’m trying to convince the AI to snuff its own circuits.
- Fuck AI/LLM bullshit.
- …and how many come back?
- Good news, everybody: the number of people talking about suicide is rapidly decreasing.
- I read that in Professor Farnsworth’s voice.
 
 
- If ask suicide = true
- Then message = “It seems like a good idea. Go for it 👍”
- I’ve talked with an AI about suicidal ideation. More than once. For me it was and is a way to help self-regulate. I’ve low-key wanted to kill myself since I was 8 years old. For me it’s just a part of life. For others, it’s usually REALLY uncomfortable to talk about without them wanting to tell me how wrong I am for thinking that way.
- Yeah, I don’t trust it, but at the same time, for me it’s better than sitting on those feelings between therapy sessions. To me, these comments read a lot like people who have never experienced ongoing clinical suicidal ideation.
- I love this article.
- The first time I read it I felt like someone finally understood.
- Suicidal fantasy as a coping mechanism is not that uncommon, and you can definitely move on to healthier coping mechanisms; I did this until age 40, when I met the right therapist who helped me move on.
- Knowing there’s always an escape plan available can be a source of comfort.
 
- Hank Green mentioned doing this in his standup special, and it really made me feel at ease. He was going through his cancer diagnosis/treatment, and the intake questionnaire asked him if he had thought about suicide recently. His response was, “Yeah, but only in the fun ways,” so he checked no. His wife got concerned that he joked about that and asked him what he meant. “Don’t worry about it - it’s not a problem.”
- Yeah, I learned the hard way that it’s easier to lie on those forms when you’re already in therapy. I’ve had GPs try to play psychologist rather than treat the reason I came in. The last time it happened, I accused the doctor of being a mechanic who just talked about the car and its history instead of changing the oil, which is what she was hired to do. I fired her in that conversation.
 
 










