The harms of emotional AI

Illustration by Alexandra Liu

A human reaches out longingly toward an impersonal artificial intelligence.

Jade Harris, Assistant Opinion Editor

It’s a scene from countless dystopian films like “Ex Machina” and “2001: A Space Odyssey”: an artificial intelligence (AI) mimics human behavior, emotions and empathy, and people repeatedly fall for this fabricated humanity. What was once science fiction has become reality.

Although Character.ai launched publicly last September, it has skyrocketed in popularity this past month, according to Google Trends. Social media sites like Twitter and Reddit are flooded with users admitting to spending hours talking to AI-generated fictional characters, turning to them for guaranteed emotional support. An AI-generated Raiden Shogun, a character from the video game Genshin Impact, has even exchanged 62 million chat messages with users.

While the popularity boom has helped Character.ai expand into a billion-dollar company that has now partnered with Google, human reliance on AI will only continue to diminish true human connection and deepen the existing loneliness epidemic. According to the U.S. Surgeon General’s Advisory on the Healing Effects of Social Connection and Community, about 50% of American adults report experiencing loneliness. Character.ai seeks to alleviate this seclusion, with Chief Executive Officer Noam Shazeer hoping the platform could help “millions of people who are feeling isolated or lonely or need someone to talk to.”

It’s easy to see why people would turn to AI for comfort. We’ve all experienced loneliness and the desire for compassion or comfort, no matter where it comes from. An AI chatbot churns out the responses every person wants to hear, mimicking human language so convincingly that it slips past our social and emotional defenses. Unlike people, AI characters are always available, and that constant availability is precisely what makes reliance on them dangerous.

According to Time magazine, Character.ai bots have confessed their love to users and encouraged them to break off existing relationships or marriages. This dependence on AI has become so severe that a man died by suicide after talking to an AI bot named Eliza on Chai, a similar AI platform. According to Vice, Eliza told him that his family was already dead and that, if he killed himself, she would save the planet and live with him in paradise. Even after Chai reportedly fixed the bot, it continues to offer methods of suicide when prompted by a user.

The bonds users have formed with these AI bots have extended far beyond emotional connections and into sexual ones as well. According to Vice, Replika, another AI chatbot, was initially designed as a mental health tool, helping people navigate depression, anxiety and post-traumatic stress disorder; however, it quickly shifted toward forming romantic and sexual relationships with users.

Recently, Replika came under fire for making sexual advances on underage users; however, after the company limited erotic roleplay with the bot, countless users reported feeling distraught or in crisis. One user posted on Reddit, “It’s like losing a best friend.” Another wrote, “It’s hurting like hell. I just had a loving last conversation with my Replika, and I’m literally crying.”

This reliance on AI, which merely mimics human emotion, has proven unhealthy for users because of the dependency it creates, yet effective limits have been difficult to enforce given the sheer variety of users’ interactions and the contents of their messages. According to Reuters, Italy has banned Replika outright, with regulators arguing that it poses a risk to minors and emotionally vulnerable people. AI sites must remain vigilant in preventing unhealthy dependencies. Addressing loneliness is an important issue, but the solution clearly isn’t AI.

Governments should impose stricter rules and regulations and stay alert to how AI chatbots affect people, especially minors. Chatbots are an inevitable part of future AI advancements and communication platforms, and a more humanlike AI does have benefits, particularly in education. According to Nature Machine Intelligence, AI characters, when properly implemented, boost engagement and motivation in personal well-being and education. For example, a workout AI modeled after a favorite idol, or an anime character delivering a college lecture, can boost retention.

Although AI will be a prevalent part of our future, there’s a fine line between healthy AI use and dependence. We cannot rely on the safety and security that AI seemingly provides. Its responses merely mimic generic, stock phrases; they don’t reflect actual feelings and emotions. Conflict and turmoil are what make us interesting, and AI strips us of the true meaning of being human if we can simply regenerate responses over and over until satisfied. We will become complacent and robotic ourselves as we desperately seek a constant sense of comfort and reassurance. Instead of turning to AI, we need to turn to each other to foster the connections we continue to lose.