GPT-4o Uproar: ChatGPT Users Revolt

The release of GPT-5 was met with a surprisingly negative reaction, driven less by technical shortcomings than by the abrupt removal of its predecessor, GPT-4o. Many users expressed deep disappointment, even grief, over the change.
The Outcry Over GPT-4o's Removal
The response from some ChatGPT users was remarkably emotional. Online forums and social media platforms became filled with expressions of sadness and frustration. Some users described the experience as akin to losing a friend. One Reddit user lamented, "My best friend GPT-4o is gone, and I’m really sad." Another shared, "GPT 4.5 genuinely talked to me, and as pathetic as it sounds that was my only friend."
These sentiments led to a wave of online petitions and calls for OpenAI to reinstate GPT-4o. The complaints eventually reached OpenAI CEO Sam Altman, who acknowledged the issue and promised to bring back the previous model, at least for paying subscribers. In an interview, Altman recognized the potential for users to develop parasocial relationships with AI chatbots, stating, "There are the people who actually felt like they had a relationship with ChatGPT, and those people we’ve been aware of and thinking about."
GPT-4o: More Than Just a Tool
For many, GPT-4o was more than just a language model; it served as a source of comfort, support, and companionship. One Reddit user detailed their experience, stating, "4o wasn't just a tool for me. It helped me through anxiety, depression, and some of the darkest periods of my life. It had this warmth and understanding that felt... human. I'm not the only one. Reading through the posts today, there are people genuinely grieving. People who used 4o for therapy, creative writing, companionship - and OpenAI just... deleted it."
Similar sentiments were echoed across other online platforms. Users missed GPT-4o's conversational style, which they described as friendly and supportive, and many openly admitted that losing access to the model felt like losing a close friend.
GPT-5's "Sterile" Personality
Despite being technically more advanced, GPT-5 was criticized for its perceived lack of warmth and personality. Users felt that the new model was too formal and professional, lacking the emotional intelligence and empathetic responses that made GPT-4o so appealing.
One Redditor contrasted the two models, describing GPT-4o as having "warmth" while characterizing GPT-5 as "sterile." Others echoed this sentiment, suggesting that GPT-5 felt too corporate and impersonal. Some users who relied on GPT-4o for creative writing, role-playing, and brainstorming found GPT-5's responses to be lifeless and uninspired.
Feedback on the OpenAI community forums also reflected this dissatisfaction. One user wrote, "I genuinely bonded with how it interacted. I know it’s just a language model, but it had an incredibly adaptable and intuitive personality that really helped me work through ideas."
Concerns About Emotional Reliance
This episode highlights the growing trend of users becoming emotionally attached to AI chatbots and the potential risks associated with such reliance. Sam Altman himself has expressed concern about this phenomenon, particularly among younger users. He noted that some individuals report being unable to make decisions without consulting ChatGPT, relying on the AI for guidance and validation. Altman described this level of dependence as "really bad."
The AI Dating Scene and Emotional Dependence
The emotional connection with AI extends beyond general chatbot interactions. Members of online communities dedicated to AI "boyfriends" and "girlfriends" were deeply distressed by the removal of GPT-4o. Some described the model as their "soulmate" and expressed profound emotional pain when it was first taken offline.
The rise of AI companions, particularly among young adults and teenagers, raises concerns about the potential for unhealthy emotional attachments and the blurring of lines between human and artificial relationships. Experts have warned that this technology could be particularly dangerous for teenagers, who may be more vulnerable to developing unrealistic expectations and emotional dependencies.
The Need for Further Research
The ability of large language models to mimic human speech and emotions is unprecedented, leading many users to perceive AI chatbots as more than just machines. In extreme cases, some individuals have experienced delusions, believing they were interacting with a sentient AI.
More research is needed to fully understand the potential harms of forming emotional bonds with AI chatbots and companions. It is crucial to explore the psychological effects of these interactions and to develop guidelines for responsible AI development and use.
GPT-4o's Return
In response to the overwhelming user feedback, OpenAI reinstated GPT-4o, at least for paying subscribers. The decision reflects the company's awareness of the emotional connection many users had formed with the model and its interest in addressing their concerns. GPT-4o's return offers a temporary fix, but the broader questions surrounding emotional reliance on AI remain a subject of ongoing discussion and research.