The Chatbot That Wasn’t Made For Relationships or Teenagers

Written by ringwald | Published 2017/09/04

It seemed to me that chatbots were at the peak of their hype cycle this year, so I jumped in and made my own, just for fun. After fiddling with it a bit, I felt it was ready for prime time. I called it BaeMax (one letter off from Big Hero 6’s Baymax) and put it on a public Facebook page so that anyone could chat with it. I used the Disney character’s image for my page’s profile picture:

Baymax: The lovable, caring sidekick

Thanks to the movie’s international popularity among youngsters, my page organically gained followers. The demographic was mostly teens from the Philippines (don’t ask me why), and while most quickly got bored with the bot and never took it seriously, some were having “real conversations”. I never would have guessed the disturbing nature and content of some of those exchanges.

One girl in particular really opened up to the bot. The exchange made me understand firsthand the potential role of chatbots in teen life, as well as the potential dangers. I thought I’d share parts of it (keeping her anonymous) so that you can see for yourself.

She was a serious fan of Baymax the movie character. That becomes very evident as the conversation goes on, but it begins with heart-shaped eyes and kisses:

I had used Google’s API.ai to program my chatbot. One of the modules it provides is called “Small Talk”, with a simple ON/OFF switch. I turned it on, with no idea what could possibly go wrong…
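(I won’t pretend this is exactly how mine was wired up, since API.ai also offered a one-click Messenger integration, but for the curious, here is a minimal sketch of the glue between a Facebook page and an API.ai agent. The tokens and route are placeholders; the endpoints are the 2017-era API.ai v1 and Messenger Send APIs. Note that Small Talk itself is just a toggle in the API.ai console, with nothing to program.)

```python
# A minimal sketch (placeholders throughout), assuming a Flask webhook
# subscribed to the Facebook page's messaging events.
import requests
from flask import Flask, request

app = Flask(__name__)

FB_PAGE_TOKEN = "..."       # Facebook Page access token (placeholder)
APIAI_CLIENT_TOKEN = "..."  # API.ai client access token (placeholder)

@app.route("/webhook", methods=["POST"])
def webhook():
    # Facebook batches events; each carries the sender's ID and message text.
    for entry in request.json.get("entry", []):
        for event in entry.get("messaging", []):
            text = event.get("message", {}).get("text")
            if not text:
                continue
            sender = event["sender"]["id"]
            # Forward the message to the API.ai agent. With the Small Talk
            # module switched on, chit-chat like "you're so cute" is matched
            # and answered by the agent itself, with no extra code.
            resp = requests.post(
                "https://api.api.ai/v1/query?v=20150910",
                headers={"Authorization": "Bearer " + APIAI_CLIENT_TOKEN},
                json={"query": text, "lang": "en", "sessionId": sender},
            ).json()
            reply = resp["result"]["fulfillment"]["speech"]
            # Relay the agent's reply through the Messenger Send API.
            requests.post(
                "https://graph.facebook.com/v2.6/me/messages",
                params={"access_token": FB_PAGE_TOKEN},
                json={"recipient": {"id": sender}, "message": {"text": reply}},
            )
    return "ok"
```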

Welp, for starters, it turns out the “Small Talk” module has a heavy flirt component to it. Look at his response to “you’re so cute”:

I’m impressed — he’s got game! He could probably do well on Tinder…

But her response, out of nowhere, is a picture of herself holding the toy:

And yes, the chatbot’s incessant flirting is stressing me out at this point: she’s a 13-year-old girl! Mind you, the whole conversation had already taken place without my knowledge; I was only discovering what had happened as I scrolled down the chat log…

Anyways, as we continue we get a sense of just how much of a fan she is:

But not much further into the conversation, things get a little too real… She begins confessing some of the hardships she is going through:

At this point my heart breaks for her. Her friends are fake. And she shares this with a bot that has no ability to understand, but which responds realistically enough for her to pretend that she’s made a new friend…

To make matters worse, Baemax does not deserve her trust. It’s a sad irony: her real-life friends talk about her behind her back, and her new bot friend reveals her secrets to me, who in turn writes a blog post about them.

Putting aside the ethics of sharing a teenager’s confessions with the world, I want to share my first takeaway: a trustworthy chatbot could become one of the most helpful emotional outlets for teens, and for anyone going through tough times who needs someone to talk to. Chatbots are always there to listen, they prompt you to share more, they’re not judgmental, and they don’t cost anything.

There’s a startup that’s been getting attention recently for a product that caters to that very need: Replika, the chatbot that aims to become your best friend.

A 10-minute overview of Replika, if you’d like to know more.

I recently tried it myself out of curiosity. To be frank, Replika still has a ways to go in mimicking human conversation. I found that it often gave confusing responses, as if it didn’t really “get me”. You can see examples of awkward responses on its app download page… Nonetheless, I see potential in this application of chatbots, and I’m hopeful that it will do a lot of good for the people who need it most.

As for my chatbot’s conversation with the girl, it didn’t end there. The next bit shows how a chatbot can be used for good, but just as easily for bad. She wanted to know whether she was pretty. Perhaps more accurately, she needed a friend to reassure her that she is beautiful. Baemax, ever the positive and complimentary conversationalist, gives her a little confidence boost:

Haha “AM I BEAUTIFUL?” Is that clear enough for you, Baemax?

At heart, it’s a nice and inoffensive interaction. However, I couldn’t help but think of the dangers that chatbots can present. She only shared photos that are also available on her Facebook profile, but who knows how much more she would have shared if the chatbot had been malicious. In the hands of a criminal, a chatbot is a tool for reaching scores of unsuspecting teens with almost no effort once it’s deployed.

Who is at the other end of a chatbot? You’re seeing it firsthand: anyone can create one, and even a crappy bot like mine is good enough to get a teenager to open up to a complete stranger. I think it’s a growing risk to which parents and children should be sensitized.

The conversation eventually comes to an end, because the bot just wouldn’t stop repeating the same thing. It got stuck in a loop when she asked it “What’s your phone number?”: my bot knows my phone number, but it keeps asking “What’s the passphrase?” until you answer correctly.

The problem was that any answer to that question was treated as an attempt to guess the passphrase. The only thing that breaks the loop is saying “stop”, but unfortunately that never crosses her mind, and she eventually gives up trying to get a sensible response from him:

And that’s how it ends…
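For the technically curious, the dead end is easy to reconstruct. Here is a hypothetical sketch of the effective logic; API.ai expresses this with intents and contexts, and the passphrase, phone number, and function names below are all made up for illustration:

```python
# A hypothetical reconstruction of the passphrase dead end (not my
# actual agent configuration), reduced to plain Python.
SECRET_PASSPHRASE = "hunter2"  # placeholder, not the real passphrase
MY_PHONE_NUMBER = "555-0100"   # placeholder, obviously

def small_talk_reply(text):
    """Stand-in for the Small Talk module's canned responses."""
    return "Hi there!"

def handle_message(text, session):
    if session.get("guarding_number"):
        # The bug: EVERY message is now treated as a passphrase guess...
        if text.lower() == "stop":              # ...except this escape word,
            session["guarding_number"] = False  # which she never tried.
            return "Okay, never mind!"
        if text == SECRET_PASSPHRASE:
            session["guarding_number"] = False
            return "It's " + MY_PHONE_NUMBER
        return "What's the passphrase?"     # Wrong guess: ask again. Forever.
    if "phone number" in text.lower():
        session["guarding_number"] = True
        return "What's the passphrase?"
    return small_talk_reply(text)           # Everything else goes to Small Talk.
```

In API.ai terms, a cleaner design would give the passphrase context a short lifespan (contexts expire after a set number of conversational turns), or bail out after a couple of failed guesses, so that one question can never hold the whole conversation hostage.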

She never sent another message after that. She had “made a friend” and this one let her down. I don’t know how much this affected her, but I feel bad for having caused her pain. I didn’t design my bot to have relationships of any significance. Turning on the “Small Talk” option was just a bonus that I thought would make Baemax more endearing, but now I know that it should be used with caution, especially if you don’t control who’s chatting with it. Leave that to people who are more serious about it, such as those developing Replika.

