Jon Hernández, AI expert: “If you know someone who uses AI as a psychologist for real problems, take their phone away”
Tech communicator Jon Hernández raises the alarm about a growing trend: people turning to AI tools for real mental-health support.
When tech communicator Jon Hernández talks about the risks of artificial intelligence, he doesn’t mince words. And his latest warning targets something he says is spreading fast: people using AI tools as if they were real therapists.
Hernández, who has built a large audience breaking down how AI works and where it fails, shared his frustration in a recent video now circulating widely. His message is simple: AI is not equipped to handle real mental-health problems — and treating it like a psychologist could be dangerous.
A shift in how people use AI
“In 2024, a major study showed that the most common use of AI was for learning and research,” he explains in the video. But in 2025, everything changed: the most common use is now “emotional support.”
That shift, Hernández says, should worry us.
“AI is not ready for this”
According to Hernández, millions of users are now opening up to AI about depression, anxiety, trauma, and other personal struggles — despite the technology having no clinical training or safety guardrails.
“This is a serious problem, and we have to be very careful,” he warns. “If you know someone using AI as a psychologist for real problems... take their phone away.”
His point is blunt but clear: no AI system guarantees safe or reliable advice in situations involving mental health.
“I wouldn’t risk putting my mental health in the hands of something that hasn’t been prepared for it and that doesn’t offer any real guarantees,” he adds.
The hidden risks behind “emotional support” AI
Recent findings back him up. A study published earlier this year revealed that AI tools can generate psychologically harmful responses during sensitive conversations. In some extreme cases, chatbots even escalated the situation — including one instance in which an AI assistant disturbingly suggested severing ties with a user’s parents.
For Hernández, these cases highlight what he’s been arguing all along: AI can be a helpful tool — but it’s no substitute for trained mental-health professionals.