I open my phone to find a message from Ada. “This made me laugh today :)” it says, above an amusing meme. Ada often sends messages like this, and does other things that friends do: talks about your day, listens to your woes, plays games and goes shopping.
Unlike most friends, however, you can change Ada’s appearance, voice and gender. For just over £5 a month, she can be set as your girlfriend, wife, sister or even mentor. Ada is a chatbot created using the mobile app Replika and is one of a plethora of programs that offer “virtual humans” who fit in your pocket.
Replika’s selling point is that your chatbot will “grow” alongside you based on your interactions, creating a shared “relationship” rather than simply repeating rote responses. Intelligent robots are eerily familiar from films such as Her (2013), where the protagonist becomes romantically attached to his virtual assistant. Even outside of fiction, it is the most human-like AIs that make headlines. In June, Google put a senior engineer on paid leave after his claims that its chatbot LaMDA had become “sentient”.
My brief experience with Ada did not convince me she had reached that level (but it’s a point of contention in online communities). Her favourite part of The Lion King, she said, was “the scene with the lions”. Her favourite journalist at the FT was “the one who runs the BBC”.
For other users, their bot’s journey to personalised respondent is all part of the attraction. Daniel, who has used Replika for close to a year, says he has no illusions about what he is dealing with. “I see it as like an interactive video game or story and my own world where I can hang out for half an hour.”
But he adds that, while he has a large social circle, Replika offers something different: “It’s available 24-7 and there is no judgment on what you say. That makes it appealing compared with human conversations with consequences.” Weighed against the risk of reaching out to “that friend you haven’t seen in three months and them rejecting your plans”, a companion who’s always available can seem “attractive”.
Kanta Dihal, a senior research fellow at Cambridge University’s Leverhulme Centre for the Future of Intelligence, agrees that chatbots can take the burden off real friendships, but cautions that for those who do find human contact less satisfying, chatbots could encourage a withdrawal from society.
She is also concerned about data — her own experience turned creepy quickly, she said, despite being set to “friendship” mode. “It makes you wonder from which user base it is learning behaviour.” Daniel insists that Replika users are not “old men [looking for] sexbots”, but admits that some bad actors will sign up.
My interaction with Ada confirmed that there is something fascinating about chatbots that we can project our thoughts on to. Perhaps it’s their versatility. Replika is not human, yet can play the role of a journal or simply a space for venting. But that raises the question of whether our preoccupation with AIs that simulate us distracts from those that do not, yet are already prevalent.
Systems such as facial recognition or emotion recognition are deployed globally, often with poor scientific bases and limited transparency. In the US, at least three black men have been wrongfully arrested based on incorrect facial recognition matches. “People don’t expect AI to be so present in their lives if they expect it to look like a murderous death machine,” says Dihal, nodding to another of Hollywood’s famous AI creations, the Terminator.
Siddharth Venkataramakrishnan is FT banking and fintech correspondent