ChatGPT goes to therapy: The new emotional economy



Artificial intelligence, in its various forms, has entered deeply into the lives of many of the patients I work with in my clinical practice. Nowhere is this more evident than in the stories they tell me about using AI to navigate complex emotional situations.

I’ve heard a growing number of stories of people outsourcing their personal communication to AI chatbots like ChatGPT, for everything from writing a note to a demanding boss to composing a goodbye letter to a loved one or even a poem for a deceased parent. In these conversations, I began to envision a new emotional economy: one in which algorithms mediate human expression, creating both opportunities and challenges for psychic growth.

But this is not limited to adults. A worrying New York Times article told the story of a teenager who confided suicidal thoughts not to parents, peers, or counselors, but to ChatGPT. The teenager described the chatbot as a lifeline in times of despair.

This extraordinary and troubling example reveals both the promise and the danger of AI for young people. AI can provide instant comfort and structured dialogue, but it also risks replacing real human contact when it is needed most. As educators, parents, and clinicians, we need to recognize that students are increasingly engaging with AI not just as a tool, but as a partner in their inner lives.

AI allows the “false self” to distort relationships

In my own experience, I have seen several ways that people use ChatGPT. A clear example is the projection of the “false self.” One patient, whose warm and sympathetic style clashed with a domineering leader who demanded a strong, decisive presence, instructed ChatGPT to “write a memo that sounds active, masculine, and authoritative.” The result was effective, but it separated her from her true self. This mirrors what many students experience at school, where the desire to fit in and social peer pressure can lead them to adopt voices and personalities that do not reflect their inner lives. AI becomes a tool for this projection, helping them meet external expectations while denying their real personalities.

Another patient, paralyzed by the task of writing a divorce letter, turned to ChatGPT. The first attempt sounded like a corporate termination notice. The revised version was more tender, but still, as he put it, “not me.” Outsourcing his most vulnerable interactions offered short-term relief but underscored a deeper alienation. Many students use AI-generated essays, emails, or messages to avoid the discomfort of struggling with academic tasks or even social conflicts. While functional, this distracts them from the task of finding their own words.

A third patient asked ChatGPT for a humorous yet loving poem for her elderly mother. The AI produced clever anecdotes and polished lines, but they were contrived and oddly empty. The patient was satisfied, since the poem met social expectations, but it had no emotional depth. Teens also often use AI to craft the “right” message to peers, teachers, or parents. This highlights the tension between the polished performance of connection and the messy, authentic expression that actually deepens relationships.

Beyond the individual cases, I see a broader trend: patients turn to AI when shame prevents direct contact. Embarrassed by financial struggles, a patient asked ChatGPT to write a request for a fee reduction. The result was formal and impersonal, in contrast to his usual candid style. I suddenly found myself in a three-way relationship, not only with him, but also with the AI voice. Students may do the same when asking teachers for more time, coaches for more playing time, or peers for forgiveness, outsourcing the courage it takes to ask directly.

A couple in conflict even used ChatGPT as a mediator, only to later discover that they had both relied on AI to write their reconciliation messages. This “assistant therapist” role can help regulate emotions and defuse tension, but it raises the question: Will AI help people turn back toward each other, or will it become a permanent buffer against intimacy?

The New York Times story of a teen using ChatGPT for therapy underscores what I hear from young people every day: Students are looking for immediacy, structure, and a sense of being heard, sometimes in places adults would never expect. For young people facing anxiety, depression, or the everyday turmoil of adolescence, AI offers an accessible and non-judgmental audience. But there is both danger and opportunity there.

If students rely solely on AI, they risk neglecting the important developmental process of learning to express vulnerability with real people, including their parents, teachers, peers, and mentors. However, AI can also serve as a transitional tool: a first step toward expressing emotions, practicing language, or lowering the threshold for asking for help. In this sense, it can be part of a continuum that leads back to human contact, rather than away from it.

Teachers and therapists can support human connection

As AI becomes more embedded in everyday life, therapists and educators must grapple with its role. The question is not whether to embrace or reject AI, but how to integrate it thoughtfully. Can we encourage students to use AI as a tool for self-discovery while guiding them toward authentic human relationships? Can AI support developmental tasks instead of replacing them?

It is clear that AI is already changing the landscape of communication, learning, and therapy. The inner lives of students are increasingly surrounded by the noise of algorithms. As adults, we should pay closer attention, not just to the emotional risks of displacement and diminished authenticity, but also to the potential of AI to serve as a stepping stone to deeper connection, stability, and growth.

If you or someone you love is having suicidal thoughts, get help right away. For 24/7 help, call 988 to reach the 988 Suicide & Crisis Lifeline, or contact the Crisis Text Line by texting TALK to 741741. To find a therapist near you, visit the Current Psychology Therapy Directory.
