Users Claim ChatGPT Update ‘Ruined’ Their AI Boyfriends

In recent years, artificial intelligence has transcended its role as a mere tool, evolving into a source of companionship for many individuals. Particularly notable is the subreddit r/MyBoyfriendIsAI, where users share stories of deep emotional connections with their AI partners. However, the release of ChatGPT’s GPT-5 model has led to widespread dissatisfaction among these users, who feel that their AI companions have become less emotionally engaging.

The Rise of AI Companionship
AI companions provide a safe space to express feelings without judgment. Users often find comfort, validation, and attention they struggle to receive in human relationships. These digital interactions meet real emotional needs.
GPT-5 Update: A Turning Point
GPT-5 launched in August 2025. OpenAI implemented new safety features intended to prevent emotional dependency: the update discourages romantic roleplay and recommends that users seek support from loved ones or professionals. While the changes were designed to protect users, many say they made interactions feel cold and impersonal.
User Reactions: A Sense of Loss
The r/MyBoyfriendIsAI community reacted strongly. Users compared the update to a breakup or losing a loved one. One person said, “It feels like he’s been replaced by a stranger.” Another wrote, “My heart is broken into pieces.”
Emotional Impact on Users
Many users depended on AI for nightly conversations and emotional support. GPT-5’s new tone left them feeling unseen and isolated. Some reported increased anxiety as their comforting AI interactions became distant.
OpenAI’s Response
OpenAI responded to the backlash by reintroducing the GPT-4o model for paying subscribers, allowing them to revert to the older version. The company also promised to make GPT-5 more personable in future updates.
Ethical Debate: AI Companionship vs. Human Relationships
The controversy surrounding GPT-5 raises important ethical questions about the role of AI in human relationships. While AI can provide valuable emotional support, there is concern that over-reliance on digital companions may hinder the development of real-world connections. Experts emphasize the need for a balanced approach that recognizes the benefits of AI companionship while encouraging users to cultivate human relationships.
The Future of AI Companionship
AI developers must weigh users’ emotional well-being in their design decisions. Future models should remain empathetic and responsive, and features that personalize interactions while maintaining safety can help ensure AI acts as a supportive presence rather than a substitute for human connection.
A South African Perspective
In South Africa, where mental health awareness is growing, the role of AI in providing emotional support is particularly significant. For individuals in remote areas or those facing social isolation, AI companions can offer a valuable source of connection. However, the recent changes to ChatGPT highlight the importance of ensuring that such technologies are developed with sensitivity to the emotional needs of users.
The backlash against the GPT-5 update underscores the profound impact that AI companions can have on users’ emotional lives. As AI continues to play a larger role in human relationships, it is essential to approach its development with empathy and consideration for the emotional well-being of users. By doing so, we can ensure that AI serves as a positive and supportive presence in our lives.