Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge
Microsoft's Tay is an Example of Bad Design | by caroline sinders | Medium
Tay (bot) - Wikipedia
Microsoft Chatbot Snafu Shows Our Robot Overlords Aren't Ready Yet : All Tech Considered : NPR
Microsoft's Artificial Intelligence Tay Became a 'Racist Nazi' in less than 24 Hours
Microsoft's racist robot: "Chatbot" taken offline as Tweets turn off-colour - YouTube
Tay: Microsoft issues apology over racist chatbot fiasco - BBC News
Microsoft's AI Twitter Bot That Went Racist Returns ... for a Bit
Racist Twitter Bot Went Awry Due To “Coordinated Effort” By Users, Says Microsoft
Microsoft chatbot is taught to swear on Twitter - BBC News
In 2016, Microsoft's Racist Chatbot Revealed the Dangers of Online Conversation - IEEE Spectrum
Microsoft exec apologizes for Tay chatbot's racist tweets, says users 'exploited a vulnerability' | VentureBeat
Kotaku on Twitter: "Microsoft releases AI bot that immediately learns how to be racist and say horrible things https://t.co/onmBCysYGB https://t.co/0Py07nHhtQ" / Twitter
Microsoft's new AI chatbot Tay removed from Twitter due to racist tweets.