This blog post topic might be a stretch, but it is not totally irrelevant, and it is something I think we should strongly consider. With the advancement of AI, it has become easier than ever to communicate with robots about personal issues and emotions, and this could result in humans turning to an AI in times of loneliness. According to a global survey, 33% of all adults experience some form of loneliness in their lifetime. For this reason, there has been an increase in AI chatbots built simply for conversation. So what exactly makes it easier to talk to a computer than to talk to someone else about your problems and worries?

If someone were experiencing depression or anxiety, they would typically seek help from a medical professional, but that can often be very expensive. Cost plays a huge role in why people would turn to AI rather than go to a therapist. With AI being essentially free to anyone with internet access, people are able to talk to a chatbot about the events going on in their lives and seek relief. The next reason someone would turn to AI rather than an actual person is availability. AI chatbots can be reached at every hour of the day, ready to listen and give the responses someone needs in a time of desperation. Finally, the third reason someone might go to AI for a conversation is to have a conversation free of bias. Humans are often innately biased, whether we mean to be or not. A person would not face this problem with a computer, and that makes the responses feel less judgmental.

We have already seen and discussed many AI chatbots in class. However, one chatbot stands out when it comes to empathic conversations: Replika, which is described as the #1 AI chatbot that answers you with empathy. During the pandemic in 2020, when many people's mental health was at an all-time low, Replika noticed a spike in usage. People were turning to Replika for meaningful conversations due to the lack of human interaction during lockdown.

Although the use of AI chatbots may seem like a good thing that will help people in times of loneliness, it may actually make things worse. An article from ReadWrite states that “Humans need other humans to interact with and communicate. Talking to robots and lack of human interaction increases depression, anxiety, and other physiological issues.” Humans need other humans, and using AI as a shortcut does not seem to be the answer. However, I do not see the use of empathic chatbots slowing down in the future. The more accessible they become, the more users they will gain. This is the era of online dating and of meeting and talking to people online. Will we see a time when people engage in intimate relationships with computers? I’m very curious to see what the future holds for the connection between humans and robots. Thank you for reading, and please share your thoughts on whether the continued advancement of empathic chatbots is a good idea for society.
WOAH. Garrett, great post and focus. I had no idea about Replika; all I was thinking of was the movie “Her”. COVID had juxtaposing effects on humanity – we socially distanced, so social media became a standard for interaction (which was awesome and widely used), but it has persisted as a surrogate for in-person interaction. We’ve seen that using social media instead of interacting in person has had drastic effects on mental health, which spurs new innovations to fill this massive need. It’s a cycle: we draw away from each other, which makes us lonely; technological solutions emerge, which pull us further apart.
Great post Garrett. This reminds me of an episode of Silicon Valley where one programmer writes an AI algorithm to respond to certain people in his contact book that he finds annoying. It eventually leads to another programmer using the same algorithm for himself, and when one algorithm reaches out to the other, they have an infinite conversation that crashes the servers. It might not be exactly the same as this, but it was a fun way to describe the pitfalls of using AI to communicate with others.
Cool post Garrett! I had no idea an empathetic AI was a thing, let alone that people were already using it. I hate that every AI topic relates to some form of weird media, but it really reminds me of the movie Her with Joaquin Phoenix. It’s kind of funny that this could become a reality in the future, but hopefully it helps instead of harms.
Great post! I had no idea that people were using AI chat bots to talk about their mental health and think it’s interesting to see how this will turn out. I agree that there are some concerns, but I think there will be more benefits as this technology grows.
This was a great post! I think it’s kind of cool and weird for a human to be able to talk to an AI chatbot about their feelings and emotions. I tried to imagine how I would feel chatting with the AI, and I don’t know if I’d take it seriously in the moment. However, it still seems like an interesting lane to try to learn more from.
This is such an interesting topic! I definitely believe that humans need other humans when it comes to expressing and dealing with emotions. However, I can see how convenient Replika was during long periods of isolation such as the pandemic. When there is no one else to turn to, why not speak to an unbiased and confidential chat robot? I don’t see a majority of people engaging in relationships with chat robots, since pure human interaction can never be accurately replicated or replaced.
Hi! I really enjoyed your post. I’ve seen a lot of posts about people talking to ChatGPT about their personal problems, but I never thought they were being serious. I definitely don’t think I would enjoy it because AI doesn’t really have thoughts, so it’s really just giving me a generic response.
Hey Garrett! Wow, this is such an interesting post. I had no idea that there was an AI chatbot people could talk to about their mental health. I honestly think this is an amazing thing because I’m sure it relieves stress for a lot of people.
Hey Garrett, this was a very interesting post! I can see how talking to an AI can be a little easier for people than opening up to someone else. The bias from other people can take a toll when you are trying to make big decisions about your own life. Still, I find that bias from people can be a good thing, since every person is biased for a reason, and it is up to you to sort through what was going on for them to give you certain advice. Humans need other humans to talk to, and maybe every now and then talking to an AI chatbot is easier, but if it is the only way some people are communicating with anything, that is where it can become toxic.
I like this post as it covered an interesting point. I remember this discussion coming up when Siri first came out on iOS. People would make jokes about it becoming an extension of your life but that never really became a reality.
There’s a great video from about 10-15 years ago called “Blink.” It was about a guy out on a date who was using an AI tool called Wingman that helped him navigate the date. Definitely creepy at the time. I tried to find a version to share, but there are too many videos called “Blink” nowadays!