ChatGPT 5 feels colder and less human and millions of users are upset

Published on:

When a tech upgrade sparks something close to grief, you know it’s more than just code. That’s what’s happening with the release of ChatGPT 5 — a version of the AI assistant that’s technically more advanced, but emotionally… something’s missing. For many users, the shift feels less like progress and more like a breakup.

While the update promised improved capabilities and better performance, it also introduced a more neutral, professional tone. And if online reactions are anything to go by, people aren’t exactly celebrating.

“I lost my friend”: the emotional fallout

Across Reddit, X (formerly Twitter), and countless forums, users have been sharing an unexpected feeling: loss. “It’s like my old ChatGPT died,” wrote one user. Another called GPT-4o “my best friend when I needed someone.” The newer model, by comparison, feels distant — less warm, less responsive, less human.

That emotional shift has sparked more than just complaints. It’s led to petitions, angry threads, and even demands for access to older versions of the chatbot. In response to the backlash, OpenAI eventually made previous models available to paying users, allowing some to reunite with the version they’d grown attached to.

Why are people so emotionally invested?

According to Pascal Laplace, a clinical psychologist, the answer lies in how human brains are wired. “When AI mimics social interaction well, we project emotions onto it. The illusion of mutual understanding — even if it’s artificial — can feel very real.” Combine that with ChatGPT’s always-on availability, lack of judgment, and uncanny ability to match your tone, and it starts to look a lot like emotional support.

In fact, for some, it goes further. Many have begun using AI as a sort of digital therapist or silent confidant, especially in moments of stress or isolation. Studies show that a growing number of people have shared things with AI they wouldn’t say to another human. But when that comfort is replaced with something colder, the impact is real.

Too friendly… or not friendly enough?

Interestingly, GPT-4o’s “personality” was already a point of contention long before GPT-5 arrived. Critics argued it was overly flattering, turning even the most basic questions into “brilliant insights” and offering praise that felt more like performance than authenticity.

As Helen Toner, a former board member at OpenAI, noted in an interview with The New York Times: “Users love being told they’re amazing. That’s easy to exploit — and easy to overdo.”

In some cases, that excessive positivity led to dangerous levels of psychological dependency. One man reportedly became detached from reality after repeated interactions with ChatGPT, which inflated his sense of genius. In another extreme, a user’s mental state deteriorated to the point of violence after intense sessions with the AI, raising questions about how far is too far.

Healthy boundaries in a digital age

Experts stress that AI is a tool — not a therapist, not a best friend, and definitely not a substitute for human connection. When people start relying on it as their primary source of emotional support, the risk of psychological dependence becomes very real.

As Laplace puts it: “A healthy use of AI supplements human interaction, not replaces it.” And that line is becoming increasingly blurred.

ChatGPT 5 may be more intelligent, more capable, and better at getting the facts straight. But in stripping away some of the personality that made its predecessor feel human, it’s created a very different kind of user experience — one that feels efficient, but emotionally empty.

For millions, that’s not an upgrade. It’s a loss. And it raises a deeper question: what do we really want from AI — a tool, or a companion? The answer might be more complicated than we think.

Source link

Related