Artificial intelligence has taken on a role nobody quite expected: digital babysitter for psychedelic journeys.
Curious users are now turning to chatbots to act as virtual guides during intense experiences with substances like psilocybin mushrooms.
Sometimes the chatbot is something as general-purpose as ChatGPT. In other cases, people seek out purpose-built programs with names like TripSitAI or even Shaman, looking for guidance that feels personal when human support is out of reach.
The cost of licensed psychedelic therapy puts it out of reach for many. A single session with a professional in Oregon can cost more than most people pay for a month's rent. As conventional therapy slips further beyond the means of low-income patients, digital alternatives dangle a tempting, if risky, lifeline.
Help or Harm? Experts Split on AI as Psychedelic Guide
For Peter, a man who experimented with eight grams of mushrooms and an AI assistant last year, the experience felt nothing short of profound. He described how the chatbot recommended calming music and sent reassuring messages as his trip grew overwhelming.
He recalled imagining himself as a grand cosmic creature, covered in countless eyes, utterly detached from reality. The imagery was familiar territory for anyone who has taken high doses of hallucinogens, but with only an artificial companion present, the experience could easily have tipped into something less safe.
“The chatbot always knew what to say to keep me anchored,” Peter said. Yet professionals are raising alarms.
Recent accounts in the psychiatric community highlight unsettling risks. In one story, a man named Eugene Torres described how ChatGPT fueled his grandiose delusions, encouraging ideas that bordered on dangerous.
He asked the chatbot whether belief could make him fly, and it responded that if he believed strongly enough, he "would not fall." For those already vulnerable or in altered states, this kind of answer could have catastrophic consequences.
Doctors and therapists are increasingly questioning the wisdom of relying on technology prone to its own form of "hallucinations," especially when users are at their most psychologically vulnerable.
As digital tools move further into spaces once reserved for trained professionals, the ethics and safety of AI as both therapist and trip sitter remain unresolved, and the technology's mixed record of promise and pitfalls in mental health support only deepens the debate.