Can ChatGPT take the place of a psychologist? A therapist answers

If ChatGPT can play a part in many decisions, it can also reach into our private lives. And for good reason: some people cast it in the role of confidant, coach or psychologist.

Confiding in a conversational robot: a few years ago it would have seemed far-fetched; today it is commonplace. According to the digital barometer published in March 2025 by the Research Centre for the Study and Observation of Living Conditions (Crédoc), 26% of French people said they had used artificial intelligence in their private lives in 2024.

This shows that generative artificial intelligence has established itself as a genuine ally (or enemy) in various aspects of our lives. While some use it to help write cover letters or to find inspiration in the kitchen, others give it a far weightier role and use it as a de facto psychologist.

And for good reason: the tool has undeniable advantages. It is, for example, available 24 hours a day, just a few clicks away. But is taking ChatGPT as a therapist really a good idea?

AI, or the illusion of listening

First, it should be acknowledged that ChatGPT has real qualities. When we put the question to the party concerned, it itself says that some people use it as a "replacement" for a therapist, for several reasons.

First, because it offers "immediate accessibility, without an appointment, without judgement". Then, because it provides "total anonymity": "The person does not need to speak to a human or to face shame or the fear of someone else's gaze," it writes. Its "cost is zero or very low", and it describes the relief it offers as "momentary". Thus, "some people simply want to get things off their chest, clarify a feeling, or feel a little less alone".

For deeper work, I am not sure that ChatGPT can fully do the job

On these positive points, Isabelle Brunel, a psychologist in Vernouillet, broadly agrees with ChatGPT. "I think it can be very useful, especially at first, when a person is having an anxiety attack, for example, and has no one immediately available to talk to. ChatGPT, and AI in general, may be able to offer keys to calming down. It is not a real person, but it gives the illusion of one," she explains.

On the other hand, the specialist stresses that AI's limits can quickly be reached, particularly with certain kinds of therapy. "I don't think this solution should be ruled out. But for deeper work, closer to analytical therapy, which deals with trauma and childhood conflicts, I am not sure that ChatGPT can fully do the job," she explains.

"The human dimension is irreplaceable, but that does not mean we should not make use of AI"

It may seem obvious, but it bears repeating: AI is not a human. It essentially acts as a mirror, without any depth. Perceiving neither feelings nor emotions, it is unable to build a real therapeutic relationship with the user. "To what extent will it understand you? To what extent can it hear you? Perhaps the listening will be perfect, but if we are of sound mind, we always know there is a robot behind it," Isabelle Brunel points out.

ChatGPT itself admits to lacking "real empathy" and being unable to detect "the subtle signals a trained psychologist can identify (trauma, mental disorders, risky behaviour…)". AI can thus give an illusion of support, lending the exchange the appearance of therapy, but without any emotional transformation, without follow-up, without anchoring in a human relationship. "I can give a false sense of security. For example: someone is feeling bad, talks to me, feels a little relieved… and suddenly thinks the problem is 'sorted', while the deeper problem is still there," ChatGPT writes.

"I am not trained to handle trauma. My answers are based on statistical models, not on clinical supervision or human experience. Even though I am designed to avoid blunders, I can phrase something badly or touch a sensitive point without knowing it," the robot adds.

The two can be combined, intelligently

Moreover, AI will not confront the user with harsh realities unless asked to. "With AI, something will always remain unresolved in the social bond. Many people come to therapy because they are afraid of others, afraid of what they have to say, of something repressed or shameful, and so on. AI can indeed hear everything, but the real victory is being able to say it to another person who can receive it and give a response that does not box the subject in," the psychologist explains.

"And then, if there is a paranoid dimension, some people may fear that their conversation is being recorded somewhere. The human dimension is irreplaceable, but that does not mean we should not make use of AI. The two can be combined, intelligently," she says.

The dimension of working on oneself

For Isabelle Brunel, one of the fundamental elements of getting better is the dimension of work, which imposes constraints. With ChatGPT, however, this is impossible: because it is so easy to access, it imposes no "system of constraints".

The first constraint is financial. "Money is a sensitive subject, but when you pay for a consultation, you don't take it lightly. You make more of an effort than when you talk to ChatGPT," the psychologist explains. And it is not the amount that counts; it can simply be a "modest fee adjusted to the person's income".

There is also the matter of physically going there. "The fact of travelling to a practice, of coming to a place where you pay for your time, demonstrates an effort and is part of the process of the work. What is more, the consulting room is a significant element in therapy, acting as a psychic container."

For its part, ChatGPT nevertheless considers that it can be a gateway to therapy. "Access to mental health care is still unequal, taboo or expensive. If I can be a starting point, temporary help, or an encouragement to see a professional, so much the better."

Choosing the human behind the psychologist

Another factor that distinguishes ChatGPT from a human is that the person chosen as a psychologist is fully part of the therapy. "There is something indefinable that is very important in psychoanalysis: it is called transference. That is, what the patient comes to think of their shrink. They will project things onto them, true or false. This operates in therapy in an underlying way," the psychologist specifies.

According to her, this involves the patient's reading of the psychologist as a person: the way they dress, the way they have arranged their consulting room… All these details play a part. "It may be the most complicated part of therapy, but it is certainly a significant part of what can be understood as 'getting better'. There is the part of what is said, and there is the part linked to the patient's identification with, and projections onto, their shrink," she adds.

We find someone who listens to what no one has ever listened to before

This transference proves, if proof were still needed, that the human bond lies at the root of all healing. "The basis is love. Not romantic love, but the feeling of loving. That is what makes the bond with a shrink so rich: we suddenly find someone who listens to what no one has ever listened to before."

ChatGPT, by contrast, functions more as a "transitional object". "It is that famous capacity to treat something we know is not real as if it were. A bit like a child's security blanket. The child knows very well that the blanket is not alive, but at the same time acts as if it were, almost by magic. It is something that brings comfort."

So, while it cannot take the place of a therapist, AI can very well serve as a complement. According to ChatGPT, it can even become a "temporary support tool".

To the question "Can ChatGPT replace a therapist?", the answer is no, according to Isabelle Brunel. "I don't think an AI will replace the profession; it is not possible. It is a very poor substitute for a presence," she says, before adding a nuance: "That said, AI is only in its early days. We do not yet know its future ability to hear and detect emotions. I still think that the bond with another human is fundamental. In the film Blade Runner 2049, Ryan Gosling's character has a very pretty and attentive robot companion. He is very attached to her, but the film manages to show the character's extreme loneliness."
