The War For The Future Of Psychotherapy

Maybe you heard. There’s a war brewing. At stake is the future of psychotherapy. Will manuals, algorithms, and automatons prevail? Or will we be able to preserve the centrality of unique and responsive human relationships? The fight will determine the kinds of care available to people with problems in living. And you’d better pay attention. This fight concerns you; after all, who doesn’t have such problems at some point in life, or care about someone who does?

I call the people fighting to remove the human from the psychotherapy equation “algorithm warriors.” I’m on the other side, standing with those defending psychotherapy as a professional practice ever more sharply focused on the healing power of human relationships. I talked about this last weekend in a presentation at a conference organized by the Psychotherapy Action Network (PsiAN) titled “Advancing Psychotherapy for the Next Generation: Rehumanizing Mental Health Policy and Practice.” What follows comes from that talk, presented in San Francisco on 12/14/19.

I see two factions coalescing into an algorithmic alliance. The first is the technology entrepreneurs building, marketing, and envisioning products to replace, rather than extend and enrich, therapeutic relationships. In my PsiAN presentation I also noted that there is lots of really wonderful entrepreneurial energy in the mental health space. Ours is a huge (estimates put it north of $300 billion a year) and profoundly dysfunctional mental healthcare delivery system, and it needs all the help it can get. The need is great, and the potential profits and rewards are astronomical.

Unfortunately, some are chasing profit not by trying to improve quality of care or access to it. They are trying to disrupt and dehumanize the very definition of what psychotherapy is; who needs a therapist to be a person when you can have a cute chatbot trained to provide cognitive behavioral therapy (CBT)? Though often framed as a creative solution to the very real problems of access, solving those problems is not really their agenda. They are entrepreneurs, not altruistic policy-makers or healthcare providers trying to fix our badly broken mental healthcare delivery system. Quantity of profit, not quality of care, is the goal. At the extreme you have those trying to fulfill the dream of artificial intimacy, where a program provides a fully automated simulation of empathy and care: chatbots like yesterday’s ELIZA or today’s Woebot. When technology entrepreneurs cross this line and try to replace a human relationship of shared embodiment and consequence with an app, or act as though those relationships are replaceable, they join the alliance of algorithm warriors.

The second faction in the algorithmic alliance is inside the profession: those reducing and then limiting psychotherapy to a set of procedures encoded in instruction manuals, i.e., to an algorithm. The group supporting instruction-manual therapies includes psychotherapy researchers who design treatment procedures based on how easily they can be studied under controlled circumstances. Rather than taking seriously the unique features of specifically psychological treatments, this research is built on a serious case of medicine envy: if the drug companies do it, then so too should we! The goal is not to help but to be testable with a randomized controlled trial (RCT). Providing actual help to actual people under real-world conditions is at best secondary. And, as other research shows, when tested in real-world conditions these treatments often fail. The evidence for so-called “evidence-based treatments” does not travel well into the real world of real people with problems in living. It’s like a drug company testing a medication in a lab, for a short period of time, on paid volunteers in perfect health other than a pure form of the specific diagnosis being studied, and then marketing that drug without seeing how well it works, or doesn’t, for people living in the real world.

These researchers are not alone, nor really to blame. They’re playing by the rules set by powerful forces. Consider media support. Instruction-manual-based CBT is frequently called the “gold standard” in various media reports. But this exalted status is as commonly invoked as it is undeserved, as I’ve written about, and as real-world research, a pair of recent books (by Farhad Dalal and Enrico Gnaulati), and a special issue of the prestigious journal Psychotherapy have shown.

Insurance companies are also an important member of the algorithmic alliance. Instruction-manual therapies are a bean counter’s dream! They routinely twist, and as found in the class-action suit Wit v. UBH, distort research to limit reimbursement to specific acute symptoms. This lets them ignore underlying causes and chronic conditions. Imagine you go to your doctor with a high fever caused by a fulminating bacterial infection. If you were treated by the rules of managed mental healthcare, you would merely be given an aspirin to reduce the acute symptom of fever rather than a more expensive antibiotic to address the underlying cause. Pretty crazy, I know. But that’s what’s happening. By declaring time-limited instruction-manual treatments the generally accepted standard of care, insurance companies get around complying with parity legislation that mandates mental illnesses and injuries be treated on par with physical ones. But the more algorithmic and the less human the treatment, the more profit.

Professional organizations like the APA (American Psychological Association) are another powerful member of the algorithmic alliance leading us to a future of dehumanized care and automated psychotherapy. In a misguided effort to have a seat at the table with insurance companies and policy-makers, they are lending their imprimatur of respectability to instruction-manual treatments based solely on the number of RCTs done. As we’ll see in a moment, quality of care has been made irrelevant relative to quantity of RCTs.

For example, the APA crafted “clinical practice guidelines” for treating PTSD and depression based on research guidelines used for the evaluation of medications. In the process, they have changed what people can expect from what’s called “evidence-based treatment.” When used by physicians and nurses, that term can be interpreted to mean the best possible rather than the most traditional care. But in psychology’s hands, “evidence-based treatment” has been twisted and degraded to mean treatments with the most RCTs. While the APA tried to achieve what our medical cousins have achieved, they have instead, in my opinion, sold out psychology.

Another speaker at the PsiAN conference took this up directly. Dr. Jonathan Shedler, formerly Associate Professor of Psychiatry at the University of Colorado School of Medicine, took a deep dive into the research behind the APA guidelines for depression and PTSD. What he found is astonishing. About two-thirds of PTSD patients treated according to the algorithmic clinical guidelines publicized by the APA will still have PTSD at the conclusion of treatment. The results for depression are slightly worse: he noted that 70% of those treated according to those algorithmic practice guidelines either didn’t improve or quickly relapsed.

What this means is that if you have PTSD or are depressed, the chances are pretty good that you will not (that’s a NOT) be helped by a clinician who follows APA practice guidelines. And the only way to understand this is to realize that even well-intentioned professional psychologists can become entrapped by the seductions algorithms afford. Instead of embracing our messy, fleshy, all-too-human complexity, the APA has apparently thrown in with the alliance that views people as reliably programmable respondents to properly sequenced instructions: Who cares that the treatments don’t help most people! They are reliable and well-studied!

There is one more source of support for the algorithm warriors: you and me. That’s right. We just may be complicit in the way we’re sleepwalking toward a future of artificial intimacy. What I see as a dystopian future of fully dehumanized and automated psychotherapy will not come about solely because the algorithms and devices reach a level of sophistication so far only imagined in science fiction. It will happen because, step by sleepwalking step, we reach a point of numbly accepting relationships with and through machines as routinely good-enough replacements for actually being together. To be clear, the antidote is not to turn our backs on technology. We should expect more and more from the tools we make. But we also have to start expecting more and more from each other, and from ourselves. The answer is not less technology. The answer is more humanity: messy, fleshy, conflicted, complicated, horrendous, and wonderful other people. Only by learning to cherish each other more will we find ourselves in an anti-dystopian future.

The fight is on. And learning to cherish that which can only happen when we are together is the only way to go.
