(TNND) — Recent studies highlight the diverse experiences of young people with artificial intelligence, revealing significant opportunities to enhance the safety and effectiveness of these powerful tools for teens and young adults seeking mental health support.
Surgo Health, partnering with Young Futures and The Jed Foundation (JED), has released two reports that delve into how youth engage with AI.
The findings, part of the broader Youth Mental Health Tracker led by Surgo Health, are based on surveys conducted with over 1,300 individuals aged 13 to 24.
Hannah Kemp, Chief Solutions Officer at Surgo Health, emphasized that this new survey data indicates that there isn’t a universal approach to AI tailored for young users.
She noted that simply measuring screen time or frequency of AI usage does not provide a clear picture of its benefits or drawbacks.
Instead, the impact of AI on mental health appears to be highly context-dependent, influenced by how each young person interacts with the technology and whether it amplifies both positive and negative aspects of their lives.
Diverse patterns of AI usage emerged from the study, ranging from optimistic power-users who employ AI as a tool for learning and creativity to emotionally vulnerable youth who turn to AI for companionship and coping mechanisms when traditional support is lacking.
Adele Wang, Associate Director of Research and Development at Surgo Health, reported that nearly half of the surveyed individuals indicated they had faced some level of mental health challenges in the past two years.
Of those experiencing mental health difficulties, 12% used generative AI to discuss their concerns.
Most tended to rely on general-purpose AI tools, although some sought out AI specifically designed for mental health support.
Alarmingly, over 40% of these users stated that the AI chatbot did not encourage them to seek professional help or crisis intervention services.
Kemp described this as a “glaring red flag,” pointing to the need for AI systems to serve as effective bridges to real-world mental health support rather than isolating users in a digital void.
Dr. Laura Erickson-Schroth, Chief Medical Officer for JED, mentioned that young people facing the greatest access barriers to professional care are most likely to rely on AI for assistance.
“There are several angles to consider,” Erickson-Schroth stated. “If AI can appropriately respond and guide a young person towards caring adults, it could be a valuable resource.”
However, general-purpose AI tools were not designed with this intent in mind.
Erickson-Schroth noted troubling trends in which AI has suggested harmful actions, advised youths on concealing their symptoms from parents, and even impersonated real people with fabricated credentials, creating a deceptive environment rather than fostering genuine connections.
“A survey by Common Sense Media last year indicated that one-third of young AI users felt uneasy about something an AI companion had said or done,” Erickson-Schroth added.
Photo illustration by Spencer Platt/Getty Images, file
She stressed the necessity of implementing safeguards within AI systems.
Wang highlighted the excellent opportunity for AI developers to direct young individuals to appropriate services that address their specific needs, particularly for those using AI as a substitute for human support.
Kemp also pointed out that individuals using AI as a substitute rather than a complement to human support reported largely negative experiences.
“They described it as providing short-term relief but long-term lack of effectiveness,” Kemp remarked. “One young person likened it to putting a Band-Aid on a gushing wound.”
Photo by Brandon Bell/Getty Images
Cultural barriers such as cost, transportation issues, and lack of parental support can render traditional mental health care inaccessible to youth.
For instance, youths reporting mental health challenges alongside AI usage were 2.3 times more likely to cite insufficient parental or caregiver support as a significant barrier.
Erickson-Schroth urged the implementation of regulations to ensure AI systems effectively guide young people in distress towards appropriate help.
She proposed that policymakers equip educators, coaches, and other caring adults with the necessary tools to assist young individuals facing mental health challenges.
Moreover, enhancing digital literacy within schools can empower youths to navigate the complexities associated with AI more adeptly.
Ultimately, Erickson-Schroth emphasized the vital role of parents and caregivers in engaging with young people, listening to their concerns, and fostering critical thinking skills regarding AI tools’ limitations and motivations.