Ph.D. student investigates mental health AI chatbots tailored to college students

Shahzadhi Nyakhar, left, and Hongwu Wang, Ph.D.

Shahzadhi Nyakhar began her doctoral studies at the University of Florida College of Public Health and Health Professions in 2022, a few months before OpenAI’s generative artificial intelligence chatbot ChatGPT was released to unprecedented success.

Nyakhar, now a public health Ph.D. candidate in social and behavioral sciences, remembers her peers recommending the chatbot for all manner of reasons.

Whether she needed a recipe or a sounding board for an idea, people were asking: Why don’t you just ChatGPT it?

The widespread use of the new technology sparked a thought for Nyakhar. She already knew she wanted to pursue innovative interventions for mental health and began wondering if tools like ChatGPT or Copilot, Microsoft’s generative AI chatbot, could lend a metaphorical hand.

A few years later, after taking the Robotics and Artificial Intelligence for Health course with Hongwu Wang, Ph.D., an assistant professor in the Department of Occupational Therapy, and with the knowledge that millions of people have already turned to AI chatbots for emotional support and mental health help, Nyakhar decided to pursue her question in earnest.

She recently published a paper, along with Wang, in the journal Frontiers in Psychiatry presenting preliminary research into the efficacy of AI chatbots for mental health and well-being in college students, which forms the basis for her dissertation. UF’s Artificial Intelligence Academic Initiative Center funded the research.

Nyakhar and Wang’s systematic review of more than 400 scholarly articles on the topic resulted in the analysis of nine studies, eight of which reported statistically significant improvements in outcomes including anxiety, depression, well-being and academic stress.

General-purpose generative AI chatbots were excluded from the review. The chatbots in the studies analyzed by Nyakhar and Wang — Woebot, Jibo, ARU, XiaoNan, Tess, Gloomy, Mind Tutor and Atena — were purpose-built mental health and well-being chatbots evaluated on college students.  

Limitations noted in the studies included the chatbots’ lack of emergency response protocols, which could leave users undertreated, and the risk that over-reliance on chatbots could delay care from human providers.

Still, Nyakhar said, AI platforms may be able to evolve to work alongside existing counseling services for college students, especially when access to therapy is limited.

“Sometimes even getting connected to a therapist could take months, and services can be costly,” she said. “Accessibility, cost, these are all barriers for college students when seeking care.”

Nyakhar is pursuing additional research on AI-based approaches to supporting student well-being, with attention to implementation and safety considerations.

“The next phase of my research is to use this information to better understand how generative AI platforms can be designed and used more safely,” she said. “That includes informing stronger safeguards and exploring if and how these tools can be responsibly integrated into well-being interventions on college campuses.”