Don't mess with Tay Tay. Pop superstar Taylor Swift apparently tried to stop Microsoft from calling its chatbot Tay after the AI-powered bot morphed into a racist troll, according to Microsoft President Brad Smith.
When an artificially intelligent chatbot that learned to talk from Twitter unsurprisingly turned into a bigot bot, Taylor Swift reportedly threatened legal action because the bot's name was Tay.
Tay was a huge hit with online miscreants, who cajoled the chatbot into repeating racist, sexist, and anti-Semitic slurs.
Mere hours after Microsoft debuted Tay AI on Wednesday, a chatbot designed to speak the lingo of the youths, the artificially intelligent bot went off the rails and became, like, so totally racist.
Oh, Microsoft. Last week, the company pulled its Tay chatbot from Twitter after some users trained it to become a racist jackass. On Wednesday, Tay was brought back online, sending thousands of tweets before Microsoft pulled it offline again.
Taylor Swift tried to block Microsoft from applying the moniker Tay to a chatbot that turned into a depraved racist troll vessel, according to a new book from Microsoft President Brad Smith. Microsoft launched Tay on Twitter in March 2016.