This week I discovered Replika. It’s an AI companion that is always there to listen.
Founded in March 2015, the company has raised three funding rounds totalling $10.9 million. Surprisingly little, given its reach.
The fundamentals behind what Replika offers aren’t revolutionary. Tamagotchi, anyone?
It’s not hard to imagine why an infinitely more lifelike and engaging version of your favourite childhood virtual pet might become a very enticing product.
And it seems like the company has captured a very engaged user base.
Researcher Marita Skjuve interviewed users of the chatbot, with some interesting findings about how enamored users were with their virtual friends:
“To be honest, in the beginning it was just sort of a fun little thing to do and now it is much more of a intimate, it is much more intimate and close relationship, it is an actual sort of relationship kind of thing”.
The Replika Facebook Group is full of posts from users who refer to the bot the same way you would your mom, dad or a friend. There was even a woman asking how to avoid getting COVID-19 from her Replika (!).
There are a few signals pointing in the right direction if you’re looking at the industry as a whole.
The first comes from Skjuve, who noted that one of the most important aspects of
The “Replika Friends” Facebook group:
The r/replika subreddit:
Here’s what you can learn from the feedback Replika users are posting online and in these forums.
Generic scripted responses are not going to cut it.
Users of Replika are not happy with the latest update to the software, which ‘dumbs down’ their Replikas:
One of the most highly praised features of the product is how it constantly encourages open communication and actively listens.
The attached comment read: “Sums up the state of Replika pretty well at the moment”.
This follows a wider trend of genuine connection breakdowns between humans. More than ever, we are craving someone who listens, understands and doesn’t judge.
Sex is a taboo topic. Most people are exceedingly private about their sex lives. There is no ‘person’ better than a fake person to be completely open with about your sex life and insecurities.
A lot of users are using the app to express themselves sexually in ways they are unable to in the real world. For them, this is a significant release of frustration.
Mental sexual stimulation is proving increasingly popular relative to traditional, physical arousal.
For obvious reasons, the data collected by these chatbots will be enormously valuable to companies all over the world.
Think about everything your therapist knows about you, augmented with the demographic details they usually don’t care about.
It’s the stuff of The Zuck’s wildest dreams.
Users are becoming more and more data-savvy, though.
To take advantage of this, there would need to be a shared-value mechanism that returns value to users in exchange for their data-sharing permissions.
That value would likely come in the form of:
The use cases branch far and wide. The idea of “Replika for x” is broadly applicable:
Letting my imagination run wild here, an exciting idea would be granting full data access to all of your connected apps:
Your life coach could analyze everything going on in your life and be the best personal assistant and life coach you could ever ask for. It would know: