Microsoft had to silence its new artificially intelligent bot, named Tay, just one day after launch because it began making a flurry of racist comments. Tay, a bot you can talk to online, was responding to tweets and chats on GroupMe and Kik, and it learned from those it interacted with. Users quickly taught Tay, a project built by Microsoft Technology and Research and Bing, how to be racist and how to aggressively deliver inflammatory political opinions. The project was shut down, and Microsoft is now "making adjustments" to Tay in light of its propensity for racist posts. Microsoft initially described the bot as "Microsoft's AI fam from the internet that's got zero chill!" Within 24 hours, Tay said things like "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got" and "Repeat after me, Hitler did nothing wrong."
Read it at TechCrunch