Fake Friends

The school community discusses the impact of Snapchat’s new “My AI” chatbot on teenage social interactions and friendships.

Ella Yadegar, Assistant Features Editor

As Ellie Whang ’24 checked her Snapchat one morning in April, she was surprised to see a clothed, anthropomorphic chatbot named “My AI” pinned to the top of her chat feed. She had experimented with artificial intelligence (AI) before, trying programs like ChatGPT. But Whang said seeing AI integrated into Snapchat, an app she had viewed exclusively as a way to communicate with friends, made her worry about students’ social interactions.

“If people have access to all these different kinds of AI, they can ask it whatever they want,” Whang said. “That is kind of how AI is advertised. Ask it anything, and it will give you an answer. Then it will start limiting human interaction, and that is kind of just where we are going. People can rely more on AI instead of their friends or other people.”

After initially offering My AI only to Snapchat+ subscribers for $3.99 a month, Snapchat released the chatbot for free to all users in April. The chatbot appears as a customizable avatar and can access users’ locations if the app’s location services are turned on, according to Snapchat Support.

Josh Siegel, the Director of Consumer Product Management at Snapchat, said in an email that incorporating AI allowed Snapchat to use new technology to support the platform’s existing features.

“Messaging and communication is a core daily behavior on Snapchat, whether by text or visually,” Siegel said. “U.S. Snapchatters open the app nearly 40 times a day, and our global community creates over five billion Snaps daily. AI fits seamlessly into this core product value, and we are finding new and exciting ways for it to surface Snapchat content and power fun and useful interactions for our community.”

Snapchat is open about My AI’s faults, with Snapchat Support stating that because AI is an evolving technology, “it’s possible My AI’s responses may include biased, incorrect, harmful or misleading content.”

While experimenting with My AI, Ofek Levy ’23 said he noticed inconsistencies in the technology’s responses.

“I saw this thing on TikTok where I saw someone sending [My AI] pictures, and it kept saying, ‘I am glad you are having a fun day in the sun’ or ‘I love that shirt,’ and when you asked it if it can see, it would say, ‘no, Snapchat AI cannot see,’” Levy said. “The weirdest part of it is that it has these responses that shift dramatically depending on your responses to it. To some degree, that is the purpose of AI, but when it is lying about what it can and cannot do, it seems really sketchy.”

In one instance, Levy said he witnessed the chatbot exhibit bias regarding certain races and religions.

“I had a friend messaging it and talking about the Holocaust,” Levy said. “And [My AI] was like, ‘we do not talk about that here’ and ‘that is so inappropriate.’ My friend told it she was Jewish and the AI responded, ‘Oh, my God, I’m so sorry.’”

Although she believes Snapchat’s My AI will provide comfort to struggling teenagers, Addison Carson ’25 said the AI would not be able to empathize with people the same way another human could.

“These robots are made to have specific scripts and things to say, which is only good when someone is in need of basic sympathy,” Carson said. “It is a helpful tool for people who are just looking for someone to rant to or if they are feeling lonely. It can become detrimental when people begin to rely on the company of Snapchat’s AI because it is definitely not the same as human interaction.”

Carson said she is currently in training to respond to texts, emails and eventually phone calls from people reaching out to Teen Line, a hotline run by professionally trained teenage advisers who provide mental health support and resources to youth around the world.

“The main things we have focused on in training are active listening and empathy skills,” Carson said. “We also focus on learning about different mental health challenges and how to comfort and support people that may be struggling with different things in their lives. If you are looking for advice or more than only surface-level sympathy, then My AI would not be sufficient.”

Whang, who also volunteers at Teen Line, said after interacting with the chatbot herself, she realized it is not a reliable resource for people struggling with mental health issues.

“I started asking it a bunch of simple questions like ‘What’s your favorite song?’ or ‘What’s your favorite color?’” Whang said. “And then I wanted to really test its boundaries and see the extent of things that I could say to it. I remembered we were talking about AI at Teen Line with the adult supervisors, so I decided to ask it what I should do if I am depressed. It popped in the chat and then just left without giving me any answer at all.” 

Whang said although she recognized the chatbot is not designed to be an expert in mental health, it can still be useful in certain circumstances. 

“I text it for advice that I am embarrassed to ask my friends about,” Whang said. “I will explain a hypothetical situation that is actually going on in my life to try and gauge what suggestions it will give me.” 

Siegel said the company created its AI chatbot with the intention of humanizing it so it could better connect with users. 

“My AI is here to help you connect with friends, learn about the world and just have fun,” Siegel said. “It was trained to have a unique tone and personality that plays into Snapchat’s core values around friendship, learning and fun. We’ll learn a lot from the way our community engages with My AI to shape what comes next.”

Some students expressed concerns about the information that My AI collects from users, which Snapchat stores as data. Sophia Vourakis ’24 said she is wary of having AI on Snapchat, an app she previously reserved for communicating with friends.

“The first time you open the chat, it has a little pop-up that asks you to accept its [terms and conditions],” Vourakis said. “I know that every single app steals your data. I’m not super paranoid about things stealing my information because that is everything now, but I still think it is a little weird that you have to click accept. It is kind of odd in the first place that [Snapchat] thought it would be neat to have people talk to a chatbot. The purpose of Snapchat is to talk to other real people and not robots.”

Levy said it is important for teenagers who use Snapchat to be aware of what they are using My AI for.

“I understand why people would want to use it to have a good laugh,” Levy said. “And I think that’s ultimately what it will be used for. But there will be certain cases where it is used inappropriately. There will be people who will use it as more of a friend than anything else. The fact that Snapchat has tried to make it so that it seems to hold that space for a replacement friend is what makes it dangerous.”