News
This isn't the first open letter about hypothetical, world-ending AI dangers that we've seen this year. In March, the Future of Life Institute released a more detailed statement signed by Elon ...
NPR's Audie Cornish speaks with Stuart Russell, an artificial intelligence researcher, and the force behind the open letter that warns about the dangers of autonomous weapons.
Artificial intelligence poses a risk to humanity on par with pandemics and nuclear war, according to an open letter signed by more than 350 executives, researchers, and engineers.