For the first time in history, people struggle to differentiate AI-generated text from human-generated text. Society has thus entered a new era of intelligent machines passing, or nearly passing, the pivotal Turing Test: a method for determining whether a computer can ‘think’, i.e., exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. One cutting-edge application of these AI models is the ‘mental health chatbot’: an emotionally intelligent dialogue system designed with clinical expertise to engage in free-form, therapeutic conversations with humans. While mental health chatbots have been applied in clinical psychology and psychotherapy, no work to date has examined them in psychedelic therapy. Here, we explore a potential impediment to their acceptance: stigma against the notion of receiving at-home psychedelic therapy from an AI. This stigma is rooted in the intuition that AIs are not viewed as ‘true’ therapists capable of delivering the concrete features of therapy and the abstract values underlying them, such as presence, trust, and empathy.