News
Girls and women are becoming the targets of AI-created fake pornography made by apps that remove clothing from photos, as high schools and even middle schools are upended by scandal and betrayal.
It was a slow Friday afternoon in July when a seemingly isolated problem appeared on the radar of Phillip Misner, head of Microsoft’s AI Incident Detection and Response team. Someone had stolen a ...
Getty Images is spending "millions and millions of dollars" on its legal case against Stability AI, the photo licensing company's CEO Craig Peters told CNBC.
In recent years, people ranging from Taylor Swift and Rep. Alexandria Ocasio-Cortez to high school girls around the country have been victims of non-consensual, explicit deepfakes — images where ...
The Take It Down Act is the first federal law to include criminal penalties for creating and posting AI-generated deepfakes, as well as for threatening to post intimate images without consent.