Artificial intelligence is everywhere in our daily lives today. But this omnipresence is not without risks: some people go so far as to develop an emotional attachment to their chatbot. And OpenAI's restoration of old ChatGPT models, following strong reactions from users, sheds light on the dangers of this dependence.
If artificial intelligence can sometimes be a solution, it can also turn out to be a problem. This booming technology is becoming ever more essential in our daily lives, to the point of becoming a real companion for some people. And while there are admittedly marginal cases of individuals who personalize their chatbot to turn it into a boyfriend, other less extreme situations are also cause for concern. Users' attachment to the models that preceded GPT-5, which pushed Sam Altman, the head of OpenAI, to restore them, is one of them.
Obsession with GPT-4o highlights users' emotional dependence
Altman has a theory to explain users' strong reactions to the disappearance of the old models, especially GPT-4o. According to him, this attachment is a form of emotional dependence: some users had never felt supported by anyone before and transferred that need onto the AI. Although he seems worried, Altman is nonetheless an entrepreneur: GPT-5's predecessors are now paid, requiring a ChatGPT Plus subscription at $20 per month to access them. And our colleagues at Windows Central recall that OpenAI has also responded to user criticism by raising GPT-5's usage limits, apparently including a significant increase in reasoning rate limits.
And this emotional attachment to chatbots is not the only worrying trend linked to AI. According to Altman, some users swear by AI to make their decisions, a situation he describes as "dangerous". The same goes for people who turn away from human professionals entirely and entrust their mental health to the chatbot alone, something the head of OpenAI says he would not do himself without a doctor supervising the process. Beyond the emotional aspect, dependence on AI could also atrophy users' critical thinking and thereby degrade their cognitive abilities; in short, make them stupid. But here again, the companies creating the "poison" also seem to be offering the "antidote": Microsoft has just unveiled Guided Learning, its AI-based solution to combat total dependence on AI, which favors in-depth understanding of a subject over quick answers.