Microsoft's AI chatbot briefly came back online and spammed everyone

"You are too fast, please take a rest..."

Update 6:11AM ET: Whatever the problem was, Microsoft has dealt with it. Tay's account is now locked and all of its tweets from this morning have been deleted. It's not clear exactly what happened, but Microsoft said in a statement: "Tay remains offline while we make adjustments. As part of testing, she was inadvertently activated on Twitter for a brief period of time." The original story follows.

Tay's back. Microsoft's AI chatbot is tweeting again after being taken offline following a bout of sudden racism last week. Sometime around 3:30AM ET this morning, Tay started firing out tweets to other users, with one particular message repeated over and over again: "You are too fast, please take a rest..." Indeed, if you're on Twitter and following Tay, it's likely your timeline was briefly swamped by this phrase.

It's impossible to know what's going on under the hood, but it's safe to say that Tay's handlers are not fully in control of the bot. Perhaps the "You are too fast" phrase is a preprogrammed response for when Tay receives a lot of messages, and in being turned back on, the bot has been deluged with tweets and is telling everyone to slow down. Some users have also pointed out that Tay appears to be replying to its own tweets, which may also be causing trouble.
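That combination of failure modes can be illustrated with a toy simulation. This is purely speculative, since Microsoft hasn't said how Tay works: the `run_bot` function, the `replies_to_self` flag, and the queue model are all hypothetical, sketching how a canned rate-limit reply that lands back in the bot's own mention stream could snowball into spam.

```python
from collections import deque

RATE_LIMIT_REPLY = "You are too fast, please take a rest..."

def run_bot(initial_mentions, replies_to_self=True, max_steps=10):
    """Toy model of the hypothesized bug: every mention triggers the
    canned rate-limit reply, and that reply re-enters the mention queue."""
    queue = deque(("user", text) for text in initial_mentions)
    sent = []
    steps = 0
    while queue and steps < max_steps:
        author, _ = queue.popleft()
        if author == "bot" and not replies_to_self:
            continue  # a bot that ignores its own tweets breaks the loop
        sent.append(RATE_LIMIT_REPLY)
        queue.append(("bot", RATE_LIMIT_REPLY))  # reply lands back in the stream
        steps += 1
    return sent

# With self-replies enabled, a single mention snowballs until the step cap.
spam = run_bot(["hi tay"], replies_to_self=True)   # 10 tweets sent
# With self-replies filtered out, one mention gets exactly one reply.
quiet = run_bot(["hi tay"], replies_to_self=False)  # 1 tweet sent
```

Under these assumptions, a simple author check on incoming mentions would have been enough to stop the flood.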

There are some normal messages mixed in with the "You are too fast" phrase, but they're definitely outnumbered. Last week, Microsoft apologized for Tay's "offensive and hurtful tweets" — is the company now going to have to say sorry for all the spam? The company has also locked Tay's account, meaning no more retweets or embeds of the bot's messages. It's a shame, as despite the spam, Tay is still occasionally coming up with some fire tweets:
