AI therapists could flatten humanity into patterns of prediction, and so sacrifice the intimate, individualized care that is expected of traditional human therapists. “The logic of PAI leads to a future where we may all find ourselves patients in an algorithmic asylum administered by digital wardens,” Oberhaus writes. “In the algorithmic asylum there is no need for bars on the window or white padded rooms because there is no possibility of escape. The asylum is already everywhere—in your homes and offices, schools and hospitals, courtrooms and barracks. Wherever there’s an internet connection, the asylum is waiting.”

Chatbot Therapy: A Critical Analysis of AI Mental Health Treatment, by Eoin Fullam (Routledge, 2025)
Eoin Fullam, a researcher who studies the intersection of technology and mental health, echoes some of the same concerns in Chatbot Therapy: A Critical Analysis of AI Mental Health Treatment. A heady academic primer, the book analyzes the assumptions underlying the automated treatments offered by AI chatbots and the way capitalist incentives could corrupt these kinds of tools.
Fullam observes that the capitalist mentality behind new technologies “often leads to questionable, illegitimate, and illegal business practices in which the customers’ interests are secondary to strategies of market dominance.”
That doesn’t mean that therapy-bot makers “will inevitably conduct nefarious activities contrary to the users’ interests in the pursuit of market dominance,” Fullam writes.
But he notes that the success of AI therapy rests on two inseparable impulses: making money and healing people. In this logic, exploitation and therapy feed each other: Every digital therapy session generates data, and that data fuels the system that profits as unpaid users seek care. The more effective the therapy seems, the more the cycle entrenches itself, making it harder to distinguish between care and commodification. “The more the users benefit from the app in terms of its therapeutic or any other mental health intervention,” he writes, “the more they undergo exploitation.”
This sense of an economic and psychological ouroboros—the snake that eats its own tail—serves as a central metaphor in Sike, the debut novel from Fred Lunzer, an author with a research background in AI.
Described as a “story of boy meets girl meets AI psychotherapist,” Sike follows Adrian, a young Londoner who makes a living ghostwriting rap lyrics, in his romance with Maquie, a business professional with a knack for spotting lucrative technologies in the beta phase.

Sike, by Fred Lunzer (Celadon Books, 2025)
The title refers to a splashy commercial AI therapist called Sike, uploaded into smart glasses, that Adrian uses to interrogate his myriad anxieties. “When I signed up to Sike, we set up my dashboard, a wide black panel like an airplane’s cockpit that showed my daily ‘vitals,’” Adrian narrates. “Sike can analyze the way you walk, the way you make eye contact, the stuff you talk about, the stuff you wear, how often you piss, shit, laugh, cry, kiss, lie, whine, and cough.”

