The SXSW chatbot doesn’t want me to die

Like the terms “robot” and “AI,” “chatbot” covers a wide spectrum of machine intelligence, from eerily smart digital assistants to the equivalent of multiple-choice quiz delivery systems. That’s why they’re so much fun to mess with: you don’t know if you’ll get something that can explain the finer plot points of The Matrix to you, or something that responds to every query with a random Google search. The SXSW festival app has a chatbot, and unsurprisingly, it’s pretty dumb. But occasionally, you’ll find a topic where the creators decided to specifically craft a response — and one of them is my death date.

The bot is named Abby, and I started our conversation with the big question, asking her the meaning of life. She responded with a boilerplate block of text plugging the SXSW Platinum badge. I then asked if Barack Obama was planning a coup, which she handled with a diplomatic “I’m not quite sure” — better than Google’s answer, at least.

This pretty much set the tone of our conversation. I’d give Abby some absurd query, and she’d either punt or give me search results from the schedule. Will New York secede from America? Here are some session results for “Will,” “New,” and “York.” When will the Great Purges of 2033 happen? Maybe I’d like to attend “Let’s Make Javascript Great Again!” How can we eradicate racism? Abby turned up “The info on Shuttles is here,” which was honestly our most confusing interaction, especially because she couldn’t parse my direct request for the Austin Convention Center address.

I was perfectly willing to accept this, until I gave Abby a more personal question. “When will I die?” I asked, ready for an “I’m not sure about that” or a death metal concert listing. Her response was a weird flicker of Siri-like intelligence in a glorified search engine. “I hope it won’t happen soon,” she told me. “I am dead,” I told her later. “Sorry to hear that,” Abby responded.

As it turns out, there are a few things Abby can answer specifically. If I ask who I am, she’ll say I’m “obviously someone who is on the cutting edge of technology.” If I ask whether she’s friends with Siri, she’ll diplomatically note that she’s “made lots of new friends so far.” But she’s short on the obvious chatbot tricks: there’s no witty reply if you ask about living in the Matrix, and asking her to tell you a joke gets you search results for “a” and “joke.”

The point is, I can’t get a bead on Abby, and it’s maddening. I have a feeling this is going to be happening a lot with chatbots in the future, until they’ve fully graduated from the novelty stage — which might not happen for a long time. I just hope the other events I cover won’t start baiting me into asking their scheduling apps about my darkest fears too.