Microsoft is deleting its AI chatbot's incredibly racist tweets
The tech company introduced "Tay" this week - a bot that responds to users' questions and emulates the casual, jokey speech patterns of a stereotypical millennial. The aim was to "experiment with and conduct research on conversational understanding," with Tay able to learn from "her" conversations and get progressively "smarter."
Microsoft has now taken Tay offline for "upgrades," and is deleting some of the worst tweets - though many still remain. Microsoft did not immediately respond to a request for comment. It's important to note that Tay's racism is not a product of Microsoft or of Tay itself. Tay is simply a piece of software that is trying to learn how humans talk in a conversation. Tay doesn't even know it exists, or what racism is. The reason it spouted garbage is that racist humans on Twitter quickly spotted a vulnerability - that Tay didn't understand what she was talking about - and exploited it.
Nonetheless, it is hugely embarrassing for the company. In one highly publicised tweet, which has since been deleted, Tay said: "bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got." In another, responding to a question, she said "ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism."
Zoe Quinn, a games developer who has been a frequent target of online harassment, shared a screengrab showing the bot calling her a "whore." (The tweet also seems to have been deleted.)
Wow it only took them hours to ruin this bot for me. This is the problem with content-neutral algorithms pic.twitter.com/hPlINtVw0V- linkedin park (@UnburntWitch) March 24, 2016
Here's Tay denying the existence of the Holocaust:
Here's another series of tweets from Tay in support of genocide.
It's clear that Microsoft's developers didn't include any filters on what words Tay could or could not use.
Same as YouTube's suggestions. It's not only a failure in that its harassment by proxy, it's a quality issue. This isn't the intended use.- linkedin park (@UnburntWitch) March 24, 2016
Didn't Pepsi just get tricked into tweeting out Mein Kampf? Are we just gonna keep making the same mistakes here or...?- linkedin park (@UnburntWitch) March 24, 2016
It's 2016. If you're not asking yourself "how could this be used to hurt someone" in your design/engineering process, you've failed.- linkedin park (@UnburntWitch) March 24, 2016
And it's not you paying for your failure. It's people who already have enough shit to deal with.- linkedin park (@UnburntWitch) March 24, 2016