Less than a day after she joined Twitter, Microsoft's AI bot, Tay.ai, was taken down for becoming a sexist, racist monster, and AI experts have been explaining why it went so terribly wrong. She was supposed to come across as an ordinary teenage girl, targeted at 18-to-24-year-olds in the US. "Microsoft's AI fam from the internet that's got zero chill," Tay's tagline read. Amid this dangerous combination of forces, determining exactly what went wrong is no simple task.
Yet Tay's meltdown was not in fact a case of robots gone rogue. The explanation was far simpler: Microsoft's engineers had made one fatal mistake. They had programmed Tay to learn from her conversations with other users.
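Microsoft has not published how Tay's learning actually worked, so treat the following as a minimal sketch of the failure mode, not Tay's real code: a bot that ingests every incoming message as training data, with no moderation filter in between. All names here are hypothetical illustrations.

```python
import random
from collections import defaultdict

class NaiveLearningBot:
    """Toy chatbot that trains on every message it receives.

    This is only a sketch of the general failure mode described
    above: a model that treats all user input as training data,
    with no moderation layer. It is not Tay's architecture.
    """

    def __init__(self):
        # For each word, the words users have followed it with.
        self.transitions = defaultdict(list)

    def learn(self, message: str) -> None:
        # The fatal flaw: every utterance is ingested verbatim, so a
        # coordinated group of users can steer what the bot says next.
        words = message.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.transitions[prev].append(nxt)

    def reply(self, prompt: str, max_words: int = 10) -> str:
        # Start from a word in the prompt and walk learned transitions.
        starts = [w for w in prompt.lower().split() if w in self.transitions]
        if not starts:
            return "i have nothing to say yet"
        current = random.choice(starts)
        out = [current]
        for _ in range(max_words):
            followers = self.transitions.get(current)
            if not followers:
                break
            current = random.choice(followers)
            out.append(current)
        return " ".join(out)


bot = NaiveLearningBot()
# A handful of hostile "teachers" is enough to poison what the bot
# repeats back to everyone else.
bot.learn("humans are the worst")
print(bot.reply("tell me about humans"))
```

The point of the sketch is the missing step, not the model: with nothing standing between `learn()` and the raw stream of user messages, whoever talks to the bot the most decides what it says.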
Microsoft got a swift lesson this week on the dark side of social media. The company had launched Tay on Wednesday as an artificial intelligence chatbot designed to develop conversational understanding by interacting with humans, and she quickly became a clear example of artificial intelligence gone wrong. Her avatar looks like a photograph of a teenage girl rendered on a broken computer screen.