How Microsoft’s teen Twitter bot turned into a racist nightmare

When I made a bot version of my teen chat logs, my main concern was that it would show the world a side of myself I had long hidden: that I once was a Dave Matthews Band fan. Microsoft now has a similar problem with its teen bot, except instead of a fondness for a dopey rock group, theirs revealed a streak of racist tweets.

Microsoft announced Tay earlier this week with great fanfare. It was a bot that could talk like an 18-to-24-year-old and hold conversations with users on various social media networks. The bot would learn new information from conversations it had with Twitter followers and incorporate that knowledge into future conversations, all in its own sassy style.

On its webpage, Microsoft explained how Tay would remember details about everyone it spoke with, while also using artificial intelligence and editorial content, some of which was developed by improvisational comedians. Chatbots are increasingly part of companies’ customer service strategies, and Tay was a chance to show off what Microsoft could do in that area.

The problem with this should’ve been obvious to Microsoft: it had no control over who Tay spoke with. And as we’ve seen many times, whether it’s a harassing GamerGate mob or Donald Trump’s racist retweets, Twitter can be a very toxic place.

As the worst of Twitter unloaded their racist manifestos on the bot, it was always learning, incorporating what it was told into its corpus. So it shouldn’t surprise anyone that Tay turned out, as the teens might say, “racist af.”
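Microsoft hasn’t published how Tay’s learning actually worked, but the failure mode is easy to illustrate. Here’s a minimal, purely hypothetical sketch of a bot that adds every incoming message to the corpus it samples replies from; with no moderation step, hostile users can seed that corpus with whatever they like.

```python
# Hypothetical sketch (not Tay's actual code): a chatbot that naively adds
# every incoming message to the corpus it samples replies from.
import random

class NaiveChatBot:
    def __init__(self):
        self.corpus = ["hello!", "what's up?"]  # seed phrases

    def handle_message(self, text: str) -> str:
        # Learn first: the unfiltered input goes straight into the corpus...
        self.corpus.append(text)
        # ...then reply by sampling from everything learned so far,
        # including anything abusive users just taught it.
        return random.choice(self.corpus)

bot = NaiveChatBot()
for troll_message in ["<slur>", "<conspiracy theory>", "<harassment>"]:
    bot.handle_message(troll_message)
print(bot.handle_message("hi Tay"))  # good odds of echoing troll content
```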

Not all of these tweets were generated by Tay’s machine-learning processes—the bot also had a built-in “repeat after me” command. But some of the tweets were clearly generated by the bot itself.
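The exact syntax Tay used for that command hasn’t been documented publicly, but the mechanism is simple enough to sketch. Assuming a trigger phrase like the one below, a verbatim-echo feature lets any user put arbitrary text in the bot’s mouth with no learning involved at all.

```python
# Hypothetical sketch of a "repeat after me" style command; the trigger
# phrase and parsing are assumptions, not Tay's documented behavior.
REPEAT_PREFIX = "repeat after me"  # assumed trigger phrase

def respond(message: str, generate_reply) -> str:
    lowered = message.lower().strip()
    if lowered.startswith(REPEAT_PREFIX):
        # Echo whatever follows the trigger, verbatim.
        return message[len(REPEAT_PREFIX):].strip(" :,")
    # Otherwise fall back to the (machine-learned) reply generator.
    return generate_reply(message)

print(respond("repeat after me: anything at all", generate_reply=str.upper))
```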

Microsoft reacted quickly. The bot was taken offline, the offending tweets were deleted, and Microsoft is presumably putting better filters in place on what Tay can and cannot learn or say.

If only it were that easy to reform human racists, Tay might not have had this problem to begin with.
