The United States is in the grips of a loneliness epidemic: Since 2018, about half the population has reported experiencing loneliness. Loneliness can be as dangerous to your health as smoking 15 cigarettes a day, according to a 2023 surgeon general’s report.

It isn’t just individual lives that are in danger. Democracy requires the capacity to feel connected to fellow citizens in order to work toward collective solutions.

In the face of this crisis, tech companies offer a technological cure: emotionally intelligent chatbots. These digital friends, they say, can help alleviate the loneliness that threatens individual and national health.

But as the pandemic showed, technology alone is not sufficient to address the complexities of public health. Science can produce miraculous vaccines, but when people are enmeshed in cultural and historical narratives that prevent them from taking the lifesaving medicine, the cure sits on shelves and lives are lost. The humanities, with their expertise in human culture, history and literature, can play a key role in preparing society for the ways that AI might help – or harm – the capacity for meaningful human connection.

The power of stories to both predict and influence human behavior has long been validated by scientific research. Numerous studies show that the stories people embrace heavily influence the choices they make, ranging from the vacations they plan, to how they approach climate change, to the computer programming decisions security experts make.

Two tales

There are two storylines that address people’s likely behaviors in the face of the unknown territory of depending on AI for emotional sustenance: one that promises love and connection, and a second that warns of dehumanizing subjugation.

The first story, typically told by software designers and AI companies, urges people to say “I do” to AI and embrace bespoke friendship programmed on your behalf. AI company Replika, for instance, promises that it can provide everyone with a “companion who cares. Always here to listen and talk. Always on your side.”

There is a global appetite for such digital companionship. Microsoft’s digital chatbot Xiaoice has a worldwide fan base of over 660 million people, many of whom consider the chatbot “a dear friend,” even a trusted confidante.

In the film “Her,” the protagonist develops a romantic relationship with a classy AI chatbot.

In popular culture, movies like “Her” depict lonely people becoming deeply attached to their digital assistants. For many, having a “dear friend” programmed to avoid difficult questions and demands seems like a huge improvement over the messy, difficult, vulnerable work of engaging with a human partner, especially if you consider the misogynistic preference for submissive, sycophantic companions.

To be sure, imagining a chummy relationship with a chatbot offers a sunnier set of possibilities than the apocalyptic narratives of slavery and subjugation that have dominated storytelling about a possible future among social robots. Blockbuster movies like “The Matrix” and “The Terminator” have depicted hellscapes where humans are enslaved by sentient AI. Other narratives, featured in movies like “The Creator” and “Blade Runner,” imagine the roles reversed and invite viewers to sympathize with AI beings who are oppressed by humans.

One reality

You might be forgiven for thinking that these two stories, one of friendship, the other of slavery, simply represent two extremes in human nature. From this angle, it seems like a good thing that marketing messages about AI are guiding people toward the sunny side of the futuristic street. But if you consider the work of scholars who have studied slavery in the U.S., it becomes frighteningly clear that these two stories – one of purchased friendship and one of enslavement and exploitation – are not as far apart as you might imagine.

Chattel slavery in the U.S. was a brutal system designed to extract labor through violent and dehumanizing means. To sustain the system, however, an intricate emotional landscape was cultivated to keep the enslavers self-satisfied. “Gone with the Wind” is perhaps the most famous depiction of how enslavers saw themselves as benevolent patriarchs and compelled enslaved people to reinforce this fiction through cheerful professions of love.

In his 1845 autobiography, Frederick Douglass described a tragic occasion when an enslaved man, asked about his situation, truthfully replied that he was ill-treated. The plantation owner, confronted with testimony about the harm he was inflicting, sold the truth-teller down the river. Such cruelty, Douglass insisted, was the necessary penalty for someone who committed the sin “of telling the simple truth” to a man whose emotional calibration required constant reassurance.

‘Uncle Tom’s Cabin,’ a 19th-century blockbuster novel, featured an enslaved man who professed unwavering love for his enslavers.
The British Museum, CC BY-NC-SA

History lesson

To be clear, I’m not evoking the emotional coercion that enslavement required in order to conflate lonely seniors with evil plantation owners, or worse still, to equate computer code with enslaved human beings. There is little danger that AI companions will courageously tell us truths that we’d rather not hear. That is precisely the problem. My concern is not that people will harm sentient robots. I fear how humans will be damaged by the moral vacuum created when their primary social contacts are designed solely to serve the emotional needs of the “user.”

At a time when humanities scholarship can help guide society in the emerging age of AI, it is being suppressed and devalued. Diminishing the humanities risks denying people access to their own history. That ignorance renders people ill-equipped to resist marketers’ assurances that there is no harm in buying “friends.” It cuts people off from the wisdom that surfaces in stories warning of the moral rot that accompanies unchecked power.

If you rid yourself of the vulnerability born of reaching out to another human whose response you cannot control, you lose the capacity to fully care for another person and to know yourself. As we navigate the uncharted waters of AI and its role in our lives, it is important not to forget the poetry, philosophy and storytelling that remind us that human connection is supposed to require something of us, and that it is worth the effort.
