Jennifer DiStefano, a mother of four, got a call one day from an unknown number. Two of her children were off snowboarding, so she picked up, worried that one of them might have been injured. It was her daughter Bree, screaming, crying and pleading for help. A man came on the line and told DiStefano that he had kidnapped her daughter and that if she didn’t pay up, he would kill her.
DiStefano was terrified, but her fear and horror were the only real things about that phone call. Bree had not been kidnapped; she was safe with her brother. Instead, scammers had used AI to replicate Bree’s voice so accurately that her own mother could not tell the difference – and they were using it to try to extort money from DiStefano.
Oliver Devane, a senior researcher at the computer security company McAfee, says AI scams such as this are on the rise. Michael Safi hears how criminals are exploiting artificial intelligence to trick their victims, and how we can protect ourselves from falling for it.