Taylor Swift threatened Microsoft with legal action over its racist Tay chatbot

The AI bot lasted less than 24 hours

Taylor Swift’s lawyers threatened to sue Microsoft over the company’s Tay chatbot. The Guardian reports that a new book by Microsoft president Brad Smith reveals that lawyers for Taylor Swift objected to the company’s use of the name Tay for its chatbot. Microsoft’s chatbot was originally designed to hold conversations with teenagers over social media networks, but Twitter users turned it into a racist chatbot in less than a day.

Smith checked his email during a vacation and discovered that Taylor Swift’s team was demanding a name change for the Tay chatbot. “An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’” The lawyers argued that “the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws,” writes Smith in Tools and Weapons, a new book about how technology both empowers and threatens us.

It’s not clear exactly when Taylor Swift’s lawyers contacted Microsoft about the Tay name, but they probably weren’t happy about the misogynistic, racist, and Hitler-promoting junk the bot was publishing to Twitter. Microsoft quickly apologized for the offensive material posted by its AI bot and pulled the plug on Tay after less than 24 hours.