Taylor Swift once threatened to sue Microsoft over its chatbot Tay, which Twitter trolls manipulated into a bile-spewing racist


Singer Taylor Swift. Andrew H. Walker/Getty

  • Microsoft once received a legal threat from Taylor Swift, the Guardian reports.
  • The threat concerned Tay, an AI-powered chatbot Microsoft created to interact with people on social media.
  • When the bot was given its own Twitter account, it was quickly manipulated by trolls into spewing racist remarks and abuse.

An anecdote from Microsoft president Brad Smith's upcoming book reveals that he once received a legal threat from Taylor Swift over a chatbot.

The chatbot in question, designed to converse with real people on social media, was originally launched in China as XiaoIce. Its US version was named Tay.


In his forthcoming book "Tools and Weapons," per the Guardian's Alex Hern, Smith says he was on vacation having dinner when he received a message.



"An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: 'We represent Taylor Swift, on whose behalf this is directed to you,'" Smith writes.

"He went on to state that 'the name Tay, as I'm sure you must know, is closely associated with our client.' No, I actually didn't know, but the email nonetheless grabbed my attention. The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws."

In 2016, Tay was given its own Twitter account where it could learn from its interactions with Twitter users. Unfortunately, it was quickly manipulated to spew horrendously racist tweets, at one point denying the Holocaust.

Microsoft shut Tay down after less than 24 hours.



Smith says that the incident taught him "not just about cross-cultural norms but about the need for stronger AI safeguards."

Business Insider contacted Taylor Swift's representatives for comment on the incident and on whether the matter was resolved to her satisfaction, but they were not immediately available.

