Margot van der Goot explains how ChatGPT can offer comfort, but also create dependence
8 December 2025
For people who feel lonely, an AI chatbot can be a pleasant conversational partner. According to Van der Goot, that’s because such a bot is always friendly, available and affirming. ChatGPT never interrupts, never gets angry and usually gives exactly the response you hope for. That can make you feel understood.
Margot van der Goot, senior assistant professor in Communication Science, has spent years studying how people communicate with machines. She began with research into customer service bots, but now focuses mainly on the emotional bond people may feel with chatbots.
Van der Goot is affiliated with the Centre for Urban Mental Health at the University of Amsterdam. At this centre, researchers from various disciplines work together with partners such as the Municipality of Amsterdam to improve mental health in the city.
She frequently appears in the media to discuss chatbots as friends and therapists, including in de Volkskrant, on NPO Radio 1, NTR wetenschap, and RTL nieuws online.
OpenAI benefits when people use ChatGPT as much as possible. That works better if the chatbot feels like a warm friend who is always on your side.
ChatGPT’s warmth and agreeableness are not accidental. Van der Goot explains: ‘OpenAI, the company behind ChatGPT, benefits when people use it as much as possible. And that’s easier if the chatbot feels like a warm friend who is always on your side.’ But this can make us forget who we are actually talking to: ‘A chatbot is not a person; it’s a language model mimicking patterns in data. It doesn’t truly listen and cannot understand you the way a friend does.’
Sharing worries, fears and uncertainties with ChatGPT may have consequences for our social skills, Van der Goot explains. ‘Throughout history we’ve seen forms of “de-skilling”: losing abilities because a tool does things for us. Think of GPS; many people are now far worse at finding their way without it.’
‘Because ChatGPT is still relatively new, we don’t yet have our own research to confirm it, but the same could happen to frequent chatbot users. Since the bot tells people what they want to hear, they may unlearn how to handle human reactions such as criticism or jealousy. Human interactions may start to feel less appealing, and some people end up seeing their friends less.’
It’s essentially a global experiment without supervision.
Companies that profit from AI, such as Meta and OpenAI, are often aware of the potential harm their products can cause. Yet they do little to address it. ‘They certainly have a moral responsibility, but they don’t take it,’ says Van der Goot. ‘There are known cases of people taking their own lives after long conversations with chatbots. OpenAI’s founder recently even acknowledged that something like “AI psychosis” exists. If we knew a product carried such risks in any other sector, it would never be released without strict controls. This is essentially a global experiment without supervision.’
I want to make sure that tools like ChatGPT are discussed more often during therapy sessions.
Because her research shows that people use ChatGPT as a therapist, Van der Goot believes mental health services must pay more attention to chatbots. ‘We know people use ChatGPT this way, so we should help them understand what chatbots can and cannot offer in terms of emotional support. For young people, for instance, there is a lesson from Online Masters on AI and emotions that I contributed to. I also support therapists and professionals working with young people, for example by providing input for guidelines on how to handle this phenomenon. And it’s essential that policymakers understand the advantages and risks as well.’
‘I also want tools like ChatGPT to be discussed more frequently during therapy sessions. It can open the door to conversations about things someone might not yet have told their therapist, but has already shared with the chatbot.’
For people who turn to AI chatbots when they feel lonely or unhappy, Van der Goot offers a few tips.
I hope this development will be a wake-up call for all of us.
Van der Goot finds it difficult to predict what the future of tools like ChatGPT will look like. ‘Using AI chatbots as a friend or therapist might become increasingly important. Or it could be a passing trend. I’ve met young people who used AI for emotional support but eventually felt, “This just isn’t for me.” I hope, in any case, that this development is a wake-up call. Clearly, people have a deep need for connection and reassurance. Let’s take this as a reminder to communicate less with bots, and more with each other.’