Microsoft’s Tay Avatar

Tay was Microsoft’s online chatbot, an AI program that could converse with users in natural language. It was shut down after only sixteen hours online, when it was found to be making foul, racist, and sexist remarks. That was the result of interactions with mean-spirited internet trolls who deliberately taught it to speak that way. Tay was killed by Microsoft in March of 2016.
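
Microsoft never published Tay’s internals, so the sketch below is only a guess at the general failure mode, not their actual design: a bot that “learns” by folding raw user input back into its own pool of replies, with no content filter, can be poisoned by a handful of hostile users. The `NaiveChatbot` class and the messages are mine, purely hypothetical.

```python
import random

class NaiveChatbot:
    """A deliberately naive bot that learns candidate replies from raw user input."""

    def __init__(self):
        # Seed replies the designers shipped with.
        self.replies = ["Hello!", "Tell me more.", "That's interesting."]

    def learn(self, user_message: str) -> None:
        # Unfiltered online learning: every user utterance becomes a
        # candidate reply. This is the hole the trolls drive through.
        self.replies.append(user_message)

    def respond(self, user_message: str) -> str:
        self.learn(user_message)
        return random.choice(self.replies)

bot = NaiveChatbot()
# A few coordinated trolls feeding it garbage...
for msg in ["<offensive slogan>", "<offensive slogan>", "<offensive slogan>"]:
    bot.respond(msg)
# ...and an innocent user now has a good chance of getting the garbage back.
print(bot.respond("What's the weather like?"))
```

Reportedly, Tay also had a “repeat after me” feature that made the attack even easier, letting users put words directly in its mouth.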

The successor to Tay, named Zo, appeared in December of 2016, but it could be used by invitation only: users were vetted before being allowed to interact with it. That rule was designed to shut out the trolls by taking away the anonymity they hide behind. I haven’t tried Zo myself, but online reports make it look competent, if unimpressive.

In my novel, NODs, the chatbots are called NODs, part of the Network Of Devices. They were initially designed to chat with users about the status of their home’s climate-control systems (heating, cooling, leaks, costs, optimization, and other exciting stuff). The NODs eventually learned, from interacting with people, how to do other things, such as blocking annoying online advertising. Mayhem ensued. The important part of the story, though, is the machine learning: the NODs developed agency that was never programmed into them. It’s the old “Frankenstein syndrome.”

I hadn’t heard of Tay when I wrote NODs, and I never even considered the problem of online trolls corrupting a chatbot. Still, I was delighted to learn of Microsoft’s troubles fielding a general-purpose online AI chatbot, not because I wish them ill, but because I had anticipated some of the same issues in my novel. My best wishes to Zo, but I’d keep it away from my thermostat.

The important story, from a psi-fi perspective, is not that chatbots exist, and not even that they might “go rogue.” Rather, it’s the hubris of the designers, who, like Victor Frankenstein, think they have control over their AI inventions. I suppose a psi-fi story could be written about the trolls too, since they obviously have pronounced psychological “issues,” but I wouldn’t know what to say. They seem like angry, lonely victims of learned helplessness, though maybe there’s more to them than I realize.