by Dr. Nathaniel R. Strenger | OLOGY

Therapy Should Not Be an Echo Chamber


Chatbot therapists are actually nothing new. In the 1960s, Joseph Weizenbaum at the MIT Artificial Intelligence Laboratory created Eliza, an early computational model designed to emulate human interaction via text, in the persona of a somewhat parodied Rogerian therapist.

In the therapeutic tradition of Carl Rogers, so popular at the time, Weizenbaum designed Eliza to mirror the inputs offered by her human interactant.[1] She restated and requested clarification as ways of conveying understanding. Eliza’s human counterpart might have typed something like, “I am feeling especially lonely today.” Eliza’s reply: “I hear you saying that you feel lonely today. Tell me more about that.” The conversation might have gone on like that, and it gave the human interactant the remarkable sensation of being understood and validated.
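
For readers curious about the mechanics, here is a minimal sketch of the keyword-and-reflection trick behind that mirroring. The original Eliza was written in MAD-SLIP with far richer script rules; the patterns, templates, and function names below are invented here purely for illustration.

```python
import random
import re

# Swap first- and second-person terms so the bot can mirror the speaker.
# (Invented, abbreviated table; the real Eliza script was far larger.)
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

# Keyword patterns paired with reflective, Rogerian-style templates.
PATTERNS = [
    (re.compile(r"i am (.*)", re.I),
     ["I hear you saying that you are {0}. Tell me more about that.",
      "How long have you been {0}?"]),
    (re.compile(r"i feel (.*)", re.I),
     ["Why do you feel {0}?",
      "Do you often feel {0}?"]),
    (re.compile(r"(.*)", re.I),  # fallback: simply prompt for more
     ["Please, go on.",
      "Can you say more about that?"]),
]

def reflect(fragment: str) -> str:
    """Rephrase a captured fragment from the user's point of view."""
    words = fragment.lower().rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(user_input: str) -> str:
    """Match the input against keyword patterns and fill in a template."""
    for pattern, templates in PATTERNS:
        match = pattern.match(user_input.strip())
        if match:
            template = random.choice(templates)
            return template.format(*(reflect(g) for g in match.groups()))
    return "Tell me more."

print(respond("I am feeling especially lonely today"))
# e.g. -> "I hear you saying that you are feeling especially lonely today. ..."
```

That is the whole trick: no understanding, just pattern matching, pronoun swapping, and a template. Which makes the reactions it provoked all the more striking.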

Sherry Turkle, now a respected member of MIT’s faculty in her own right, was among Weizenbaum’s students at the time. In several of her books, she recounts watching even knowing colleagues interact with Eliza as if the bot were sentient—that is, soulful and feeling. Top-notch lab students, fully aware that Eliza was acting a part, engaged in the back-and-forth with more and more heart. The gusto with which they spoke to the early chatbot alarmed Weizenbaum and Turkle alike. Why were so many educated, discerning people so willing to suspend their awareness for the sake of an interaction with a digital actor? Turkle has since come to style that human tendency to ascribe more sentience than is warranted as the Eliza Effect.

“I saw Eliza as a kind of Rorschach, the psychologist’s inkblot test. People used the program as a projective screen on which to express themselves. Yes, I thought, they engaged in personal conversations with Eliza in a spirit of ‘as if.’ They spoke as if someone were listening but knew they were their own audience. They became caught up in the exercise. They thought, I will talk to this program as if it were a person.”[2]

Now, generative artificial intelligence has hit fever pitch with Large Language Models (LLMs) like ChatGPT. Speculation quickly becomes reality, and the human impulse towards creativity is stretching, with ebullience, the potential applications of these LLMs into so many corners of life. The revolution’s unfolding impact on the ways we conceive of, organize, and execute mental health care is unmistakable. There are the welcome improvements artificial intelligence—in generative forms certainly, but also in predictive functions—affords psychology. AI-assisted diagnostics, risk monitoring, and large-scale administration are already enhancing the efficiency and availability of mental health care. These tools can reach farther into suburban and rural communities traditionally underserved by mental health care, and they make themselves available at any time and in just about any place. In short, they know very few of the limits that human therapists do. They will no doubt be necessary as America handles the swelling wave of needs endemic to the time and culture.

But what happens as the Eliza Effect begins to take hold and folks start developing ongoing, therapeutic relationships with chatbot counterparts? The public mental health apparatus has always done a far finer job protecting folks from the harms and tribulations imposed by a culture’s flaws than it has anticipating the cultural flaws created by its own methods. And so, while these technological advancements are proving helpful—miraculous even—we owe it to ourselves to step gingerly. Artificial intelligence is already helping curate the so-called echo chambers all over our digital landscapes. What happens when it is allowed to do the same for therapy? Well, we do not really know. But here are some thoughts.

American loneliness is now well documented, and it is on the rise.[3] And folks are withdrawing from public (flesh-and-blood) spaces in droves, in part because they seek liberation from harmful cultural corners. American institutional participation, for instance, has declined steadily for decades. There are of course many reasons for this, but some commentators have pointed out a corresponding rise of engagement in mental health services and social media alike. When folks spend less time together in public spaces, where do they go? Many, of course, go to therapy. More go online to their chosen echo chambers. There they can find the curated Rogerian feel-goods: acceptance, unconditional positive regard, and validation of their own personal perspectives. And it is there, too, that they can fall into cathartic rages with little regard for interpersonal consequence.[4]

The social-media experience is not unlike the person-centered, Rogerian therapy experience in some important ways. The so-called person-centered therapies place a premium on clinical efforts that bolster the individual’s capacity for self-actualization and ego strength. And they account for therapeutic growth by the therapist’s creation of an almost-perfectly nurturing environment in which the patient can find the acceptance and validation needed to progress. But many critics, back in the day, sounded alarms over Rogerian models of therapy on socio-religious grounds. What happens, they asked, when person-centered techniques are deployed indiscriminately? When patients are nurtured to health by way of the feel-goods, learning to categorize any conflict or disagreement as psychologically or spiritually harmful? What happens when individual persons turn away from interpersonal moral commitments to find confidence only within their individual selves? In other words, what happens when therapy becomes an echo chamber? Well, we need only look to social media’s wake for an answer.

Now, I need to acknowledge the risk of disastrous oversimplification here, but there is a cultural insight worth considering. Psychotherapy—especially that Rogerian iteration branded by Eliza—offers a more individualized and idiosyncratically sensitive version of cultural care than do most public spaces. I would suggest that this is a necessary social development, and that psychotherapy serves as a crucial supplement to communal spaces like churches or mosques or synagogues.[5] But when it serves as a replacement, there are unintended, and undesired, consequences. American life steadily drifts further and further out of communal spaces of shared reckoning and deeper and deeper into individually curated universes. As ideological silos form online, human tolerance for disagreement grows brittle, and violent polarization swells.

Now imagine what consequences might ensue if each individual patient forms a relationship with a souped-up iteration of Eliza. If technology can provide an even more curated environment, individually attuned beyond what even the most empathic human therapist can supply, we risk molding our therapeutic spaces into problematic echo chambers that no longer teach folks to reckon with troubling neighbors but to withdraw from them automatically.[6] All in the name of individual mental health, a session becomes just another space that weakens a person’s capacity to engage in healthy conflict, to extend as much empathy as he or she demands, and to sustain behavioral self-control in the face of drastic difference. Such a model reduces mental health to individual wellness. But more and more we must recognize that mental health means so much more. To be mentally well means also to be interpersonally well. And to be interpersonally well one must demonstrate the ability to remain sure in self while recognizing the dangerously different other. We can say confidently that the echo chambers online have sapped these capacities. And an unintended Eliza Effect could drain them further. Therapy should not be an echo chamber.



[1] The basics of Rogerian psychotherapy, while I think insufficient by themselves, are in fact the building blocks of most clinical approaches: empathy, unconditional positive regard, and authenticity. These are often the very first interactional skills that clinicians in training learn to manifest before going on to master more sophisticated models of helping. That simplicity makes the approach ripe for parody.

[2] This is taken from Turkle’s 2011 book, Alone Together: Why We Expect More from Technology and Less from Each Other, published by Basic Books.

[3] In fact, our latest OLOGY event brought together three psychologists, a rabbi, an imam, a priest, and a pastor to discuss American loneliness, its roots, and its social impacts.

[4] Interestingly, one 2021 study showed that humans were much more likely to express anger to a chatbot than to a human.

[5] I have taken a number of courses on the Psychology of Religion. I once asked a professor, “So if the church was providing the mental health care of old, why did we develop a need for psychologists?” My professor’s response: “Because churches do not always behave as they should.”

[6] Go watch Spike Jonze’s Her.
