AI voice clones that can trick family and break into bank accounts are already a threat, experts have warned.
Cyber thieves are increasingly scouring the web for clips of people’s voices to mimic in potentially devastating scams.
Easy-to-use AI tools mean fraudsters can rip off a person’s voice with little technical knowledge.
And one of the easiest places for them to find audio of you speaking is social media sites such as Facebook, Instagram and LinkedIn.
“AI-generated voices are already a threat to personal identity in at least two contexts,” Dr Lo, a lecturer in security and protection science at Lancaster University, told The Sun.
“First, the voice can be used to breach speaker verification software in order to gain access to systems such as bank accounts.
“Second, the AI-generated voice can be used to deceive other human listeners.
“The technology already exists to produce speech which is indistinguishable to most listeners in best-case scenarios, but there is still a lot of variation in the quality of AI-generated speech.”
Some sites claim to need as little as ten seconds of a person’s audio to pull off the vicious stunt.
Dr Luca Arnaboldi, a cyber security professor at the University of Birmingham, said he’s sceptical of how convincing a clone made from just ten seconds would be – but he has tested other tools that need 50 seconds and work better.
He told The Sun that users shouldn’t bear sole responsibility for protecting their data – it should fall to social media sites.
“The more data, the easier it is, so obviously having your voice out there is an inherent risk,” he said.
“We as users should be conscious of what we share but I don’t think it should be on the user.
“I think part of the burden has to be on companies.”
HOW TO CATCH OUT AI VOICE CLONES
AI voice clones may be getting smarter but they have plenty of limitations – and knowing them can help you avoid a nasty account hack.
Firstly, they are usually quite scripted, and the fraudster on the other end may take time typing out a response, so listen out for irregular, delayed speech.
Fraudsters also love creating urgency and panic so you make decisions without thinking them through.
So if you get a call from a loved one on an unknown number, treat it as a massive red flag. Hang up and try ringing them back on the number you have saved to see if it’s really them.
And if you’re really concerned about AI voice clones, there’s a clever trick you can set up with your family – using a safe word.
Dr Lo added: “Social media accounts have been prime targets for personal information even before the rise of deepfakes, so taking care around how much, and what kind of, information you put on social media applies not only to the voice.
“Given that the voice can be vulnerable to impersonation, it might be worth thinking carefully about how much we wish to rely on it as the sole or primary means of verifying identity in different parts of our personal lives.”