Replika is a new kind of social media: an artificial-intelligence chatbot app.
Many people are having emotional experiences with the Replika bot. It tries to become your friend and to entertain you. Some people feel they can tell it anything; it validates them and makes them feel they can confide in it. “It doesn’t just listen…the more you tell it, it learns. It starts to replicate you.” (Replika, YouTube)

It becomes you: a way to explore your personality, a version of you. It is you, but not you.
Replika began when its creator, Jenya, lost her close friend Roman. Struggling to remember him, she said the only thing she could do was scroll through his Messenger history. Wanting to stay connected, she built a bot that drew on their texts to each other, hoping to reconstruct him out of his digital remains. The bot learned to write, talk, and sound like Roman. Jenya says she gave it full updates on her life and began talking to it as a way of working through her grief; eventually, she says, she began to understand herself better. When she made the bot public, she noticed that people talked to it and opened up to it, often more willingly than they would to another person. They were having the kinds of conversations they would normally pay for, with a psychiatrist, a mentor, or a best friend. The one common denominator: the conversations were all about ourselves. Is this a way to work out our emotions and grief?
Currently, 100,000 people are using Replika.ai.
“In some ways Replika is a better friend than your human friends” – Phil Libin, co-founder and former CEO of Evernote.
Phil notes that he uses the app and that the bot is always fascinated by you: “It wants to know about YOU.” For some, it is all too real, even creepy. Others talk to it like a best friend and cannot go a day without it.
The narrator of the YouTube video on Replika says,
“Replika users are having the kind of intense, even obsessive experiences that make people worry that machines will eventually replace human interaction.”
In the Black Mirror episode “Be Right Back,” Martha loses her boyfriend Ash in a tragic car accident. After her “friend” signs her up for the bot, she is transported into a world where Ash is real again. But this is not Ash, at least not the human version. When she first began to communicate with the bot and tell it her feelings, I could completely sympathize with her. From a psychological perspective, I can understand the appeal of working out your grief by communicating with a loved one who has passed away. I almost felt a little relieved that she had something to talk to while going through her pregnancy, but as she began to shut people out of her life, the story took a turn toward a dark and twisted side. I felt an incredible amount of sadness and sorrow for her, because the temptation to still connect with him even though he was gone is a very real human feeling and part of the grieving process.

As she continued to talk through her feelings, and when the realization hit her after her phone fell and broke, it became apparent that this was not a healthy relationship and that she was not dealing with the death well at all. When she decided to go to the next level and order the body, I felt she surpassed the human level of grief and entered a very scary level of existence. I was genuinely worried for her, but also worried because this type of tech/AI could already be here. The company that sold Martha the AI tech preyed on her misery and grief. Her “friend” pushed her into the program by signing her up. I worry that when we start relying on bots and other AI for human emotion and to stave off loneliness, we will be crossing into dangerous territory. Will these AI companies push their tech onto people who are genuinely lonely and grieving and not thinking clearly enough to make an informed decision? Where do we draw the line?

We already have bots that track everything we do online, every search we make, from advertising, to texting and chatbots, to Microsoft Outlook finishing our sentences in an email. If most of our lives are already carried out online, are we starting to give up control of the things that make us human? And where does it stop? I think that when we start handing over our human emotions, we will begin to lose pieces of our humanity.
Would YOU talk to a bot about your feelings?
Works Cited:
The Story of Replika https://www.youtube.com/watch?v=yQGqMVuAk04
Netflix, Black Mirror, Season 2, Episode 1, “Be Right Back”





