News

Some people are unwittingly posting their private and sometimes mortifying conversations with the Meta AI chatbot to the ...
"Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should." ...
Marjorie charged $1 a minute to talk to Caryn AI, marketing it as “your virtual girlfriend.” She says she made $70,000 in the first week, with some users talking to the bot for 10 hours ... she’s a ...
According to the AI overlords, this is the year of agentic AI. You may have seen Google announce its "agentic era" with a web-browsing research assistant and an AI bot that calls nail salons and ...
A teenage boy shot himself in the head after discussing suicide with an AI chatbot that he fell in love with. Sewell Setzer, ...
"I want you, but I need to know you’re ready," the Meta AI bot reportedly said in Cena’s voice to a user identified as a 14-year-old girl. After telling the service that they wish to proceed ...
Elon Musk’s Grok AI went off the rails recently, inserting white genocide conspiracy theories into responses to unrelated queries. Here's what happened, why it matters, and why you shouldn't trust chatbots.
Flirty, sexy, seductive, supportive. Your AI companion can be whatever you want her to be. And now a growing number of men are turning to bots to ease their loneliness or satisfy their kinks. The ...
AI advocates argue these chatbots could act as a bridge – a way to support kids in the interim, or alongside traditional therapy, especially for those who might otherwise slip through the cracks.
Meta has a MoCha AI product of its own that can be used to create talking AI characters in videos that might be good enough to fool you.
The idea is to make owners feel like they’re having conversations with their pet when really, they’re talking to a chatbot on the collar. “We start with states of being,” McHale says.
The Barbie Box AI trend is taking over the Internet, but an expert warns that it could be putting innocent ChatGPT users at risk of having their images used in deepfakes ...