Helping college students emotionally before they turn to AI

Some ChatGPT users ask chatbots for emotional support.
Justin Morrison/Inside Higher Ed photo illustration | KirillM/iStock/Getty Images
As more students engage with generative artificial intelligence and chatbots, the ways they use AI are changing. A 2025 report in Harvard Business Review found that, based on social media discourse, "therapy/companionship" is the top use case for generative AI chatbots.
For college counseling centers, this shift reflects students' desire for immediate support. "This is not a generation that calls a counseling center and makes an appointment four weeks out," said Joy Himmel, director of counseling services at Old Dominion University. "They want help."
But counseling centers also have a role in educating students about the risks of using generative AI tools for well-being support, Himmel said.
The research: While ChatGPT and similar text-generating chatbots are touted as productivity tools that speed up learning and workflows, some users turn to them for personal and emotional support.
In a 2024 safety report, OpenAI found that some users anthropomorphize the chatbot (attributing human behaviors and characteristics to a nonhuman entity) and form social relationships with AI. The researchers hypothesized that human-style socialization with AI models may affect how individuals interact with others and hinder the development of healthy relationship skills.
A 2025 study by the MIT Media Lab and OpenAI found that heavy use of ChatGPT is associated with increased dependency on the tool, with heavy users more likely to consider ChatGPT a "friend" and to find conversing with ChatGPT more comfortable than face-to-face interaction. However, the researchers noted that only a small percentage of ChatGPT users were affected in this way or reported emotional distress from overuse.
Another study from the same group found that higher daily use of ChatGPT was associated with greater loneliness, dependence on the tool and problematic use, as well as less social interaction with other people.
In extreme cases, individuals have built entirely fabricated lives and romantic relationships with AI, developing deep feelings and experiencing real hurt when the technology is updated.
The research shows that most people, even heavy ChatGPT users, do not seek emotional support from chatbots or come to rely on them for it. Other surveys similarly suggest that only some college students want AI involved in their well-being support. A WGU Labs study found that 41 percent of online learners would be comfortable with AI suggesting mental health strategies based on student data, while 38 percent said they would be somewhat or very uncomfortable with that use.
In higher education: On campus, Himmel sees more and more students entering counseling with histories of anxiety, depression and trauma. She said students are also lonelier and less likely to interact with peers on campus or participate in events.
Student mental health is a leading retention issue, but few counseling centers can provide one-on-one support to everyone who needs it. At her center, more students are opting for in-person counseling sessions, which Himmel attributes to their desire to feel more grounded and connected. But many still engage with online or digital interventions.
Many universities have partnered with digital mental health providers to supplement in-person services, particularly after the COVID-19 pandemic forced counseling to go remote. Such services may include counseling support or skill-building education that reduces the need for in-person sessions.
Digital mental health resources cannot replace certain forms of treatment or risk assessment, but they can supplement counseling sessions, Himmel said. "Automated AI systems with emotional intelligence can convey some concepts and do that work with students, and in a sense that frees counselors from doing that [skill building], so we can dig deeper into what we really need to talk about," she explained.
Himmel said AI counseling or online engagement with ChatGPT is not the solution to every problem. For students who use chatbots as companions, "it builds a system that is not based in reality; it's a facade," Himmel said. "Even if it accomplishes a goal, in the long run it's really not great for the development of emotional or social skills."
Faculty and staff need to learn how to identify students at risk of developing AI dependence. Compared with anxiety or depression, which show more obvious cues in the classroom, "the symptoms are associated with being in the inner world of AI rather than interacting with others in a helpful way," Himmel said. Campus stakeholders can watch for students who are socially withdrawn or unwilling to engage in group work to help spot social isolation and possible digital dependency.
AI in the counseling center: Himmel said part of addressing students' AI dependence is getting familiar with these tools and helping students learn to use them appropriately. "We need to be able to take advantage of it and use it, not be afraid of it, and embrace it," she said. She also sees a role for counseling centers and others in higher education to provide additional education about AI in different formats and venues.
Old Dominion partners with TalkCampus, a service that provides 24-7 peer-based support. The counseling itself is not automated, but the platform uses AI to mine data, identify risk factors that surface in conversations and provide support when needed.