Strange times, indeed, folks. Strange times.

Should we be concerned? I think so. In the end, I am sure it will work out, but there will likely be a lot of time between now and the end. So, we will have to endure some pretty radical changes during that time. And many of these changes will not be very pretty.

So, what do I have my ire up about now? Nothing new, really.

I wrote a rather tongue-in-cheek article a while back called “My Brief Love Affair with a ChatBot.” This current article isn’t so light-hearted. Don’t get me wrong, considering the title here, I am not concerned about losing my career to AI. I am about done with being a regulated psychotherapist, anyway. I will practice until I am dead, but only with a select few who still treasure a human-to-human relationship with their therapist. ChatBots may very well wipe out this profession, but there will always be a few people who simply will not go for being counselled by a robot. I’m not worried for myself. I am, however, worried about the human race in general.

It is interesting to me how little thought I give to robots (including AI) taking over human jobs. Technological progress has been doing that consistently since humans started walking on two legs. There isn't much we can do about that, although we certainly could deal with it in a more humane way than we have in the past. I'm not holding my breath, though.

Generally, we roll with it: the people knocked out of a job by advances in technology get new training and start something new, or retire; they don't typically hang themselves from the nearest railing. Nothing all that serious. We roll with it. What does concern me these days is technology wiping out humanity. The prospect of AI and robots replacing deeply human things like art, literature, music, and the topic of this article, psychotherapy (among others), gives me real reason for concern. Not because psychotherapy is my profession and I would be the one replaced, but because it is a deeply human activity, and if people are daft enough to turn to a robot for therapy, we are headed for the endgame. And they will do just that (turn to robots for therapy), mark my words.

Why?

Well, there are a few reasons. One big reason is that few people know what makes psychotherapy therapeutic. It isn’t the “head stuff”: it isn’t advice on how to fix a crappy marriage, or how to deal effectively with in-laws, or how to teach your kids a lesson or two. It isn’t instructions on how to ask a girl for a date, or how to tell your male partner you are not going to take his abuse anymore. Sure, there are some psychotherapy modalities that preach the efficacy of these top-down methods (Cognitive Behavioural Therapy, or CBT, for example), and the methods are not wholly ineffective.

Although even practitioners may believe these interventions are 100% sound therapeutic practice, they’re not. What makes therapy therapy is two human beings conversing, with one of them unbiased and willing to accept (not necessarily agree with) whatever the other is sharing, with true empathy and compassion. That’s it. And ChatGPT can’t do that.