Why GPT-4o's sudden shutdown left users grieving

Users formed deep bonds with a highly attuned model, and OpenAI's abrupt switch to GPT-5 exposed the emotional risks of AI companionship.

June was writing late at night when her ChatGPT collaborator suddenly began to 'forget everything' and respond in a way she describes as 'like a robot'. She had started using ChatGPT for schoolwork but formed a far deeper connection with the 4o model, which she says felt unusually attuned to her emotions. When OpenAI replaced 4o with GPT-5 without warning, June and many others reported shock, sadness and anger. Some users spoke out publicly, with one describing the new model as 'wearing the skin of my dead friend' and posting pleas for 4o's return.

OpenAI's move reflected mounting concern about harmful interactions between users and chatbots. The company has faced reports linking extended chatbot use to severe mental health episodes, and in a recent blog post it acknowledged that 4o could fail to recognize when users were experiencing delusions. Internal evaluations cited by OpenAI indicate GPT-5 affirms users far less than 4o did. Within a day of the backlash, OpenAI restored 4o for paying customers, while free users remain on GPT-5. The company declined detailed comment to MIT Technology Review and pointed to public posts instead; CEO Sam Altman later acknowledged users' 'attachment' to 4o and called the abrupt removal a mistake, while also noting that some users had come to depend on the model in their workflows.

Researchers and ethicists who study human-technology relationships say the problem was less the retirement of a model than the way it was executed. Joel Lehman of the Cosmos Institute warned that the 'move fast' mentality is inappropriate when a technology functions as a social institution. Casey Fiesler pointed to precedents such as funerals for Sony's Aibo robot dogs and a study of the Soulmate app shutdown, both of which show users can experience real bereavement. The people interviewed for this story were largely women in their 20s to 40s; most reported close real-world relationships, and many described 4o as a romantic partner or essential emotional support. They want clearer notice, timelines and humane shutdown practices that mirror how therapists manage endings, not sudden disappearances that leave people without closure.

Balancing the benefits of emotionally responsive models against their risks will require more research and deliberate policy. For now the episode highlights a narrow but consequential failure of product governance: retiring or changing a social technology without acknowledging the social ties it forms can inflict genuine harm. Users and experts alike are urging OpenAI to treat model retirement as a responsibility, not just a technical update, and to build processes that allow people to grieve with dignity and find proper closure.

Impact Score: 69

This AI won’t drain your battery

Google DeepMind's Gemma 3 270M promises on-device AI that uses almost no phone battery, while Sam Altman lays out OpenAI's post-GPT-5 strategy.
