Digital Security
AI-driven voice cloning can make things far too easy for scammers – I know because I’ve tested it so that you don’t have to learn about the risks the hard way.
22 Nov 2023 • 6 min. read
The recent theft of my voice brought me to a new fork in the road in terms of how AI already has the potential to cause social disruption. I was so taken aback by the quality of the cloned voice (created, in an extremely clever yet comedic style, by one of my colleagues) that I decided to use the same software for “nefarious” purposes and see how far I could go in stealing from a small business – with permission, of course! Spoiler alert: it was surprisingly easy to carry out and took hardly any time at all.
“AI is likely to be either the best or worst thing to happen to humanity.” – Stephen Hawking
Indeed, ever since AI became a mainstream concept in fictional films such as Blade Runner and The Terminator, people have questioned the boundless possibilities of what the technology could go on to produce. However, only now – with powerful databases, increasing computing power, and media attention – have we seen AI reach a global audience in ways that are terrifying and exciting in equal measure. With technology such as AI prowling among us, we are very likely to see creative and rather sophisticated attacks take place, with damaging results.
Voice cloning escapade
My previous roles in the police force instilled in me the mindset of thinking like a criminal. This approach has some very tangible yet underappreciated benefits: the more one thinks and even acts like a criminal (without actually becoming one), the better protected one can be. This is absolutely vital both in keeping up to date with the latest threats and in foreseeing the trends to come.
So, to test some of AI’s current abilities, I have once again had to take on the mindset of a digital criminal and ethically attack a business!
I recently asked a contact of mine – let’s call him Harry – if I could clone his voice and use it to attack his company. Harry agreed and allowed me to start the experiment by creating a clone of his voice with readily available software. Luckily for me, getting hold of Harry’s voice was relatively simple: he often records short videos promoting his business on his YouTube channel, so I was able to stitch a few of these together into a good audio test bed. Within a few minutes, I had generated a clone of Harry’s voice that sounded just like him to me, and I could then write anything and have it played back in his voice.
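To give a sense of how little effort the preparation takes, here is a minimal sketch of the audio-stitching step, assuming the promotional clips have already been saved locally as MP3 files. The filenames and the one-minute target below are hypothetical illustrations; the cloning itself was done in off-the-shelf consumer software, not custom code:

```python
# Minimal sketch: stitch a few public clips into one voice sample.
# Assumes pydub (pip install pydub) and ffmpeg are installed, and that
# the clips were downloaded beforehand; the filenames are hypothetical.
from pydub import AudioSegment

clips = ["promo_clip1.mp3", "promo_clip2.mp3", "promo_clip3.mp3"]

# Concatenate the clips into one continuous recording.
sample = AudioSegment.empty()
for path in clips:
    sample += AudioSegment.from_file(path)

# A minute or so of clean speech tends to be plenty for consumer
# cloning tools; pydub slices are measured in milliseconds.
sample = sample[:60 * 1000]
sample.export("voice_sample.wav", format="wav")
print(f"Exported a {len(sample) / 1000:.0f}-second sample")
```

The takeaway is not the tooling but the raw material: a handful of public clips is all it takes to seed a convincing clone.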
To up the ante, I also decided to add authenticity to the attack by stealing Harry’s WhatsApp account with the help of a SIM swap attack – that is, by persuading his mobile carrier to transfer his phone number to a SIM card in my possession – again, with permission. I then sent a voice message from his WhatsApp account to the financial director of his company – let’s call her Sally – requesting a £250 payment to a “new contractor”. At the time of the attack, I knew he was on a nearby island having a business lunch, which gave me the perfect story and the perfect opportunity to strike.
The voice message mentioned where he was, said that he needed the “floor plan guy” paid, and promised that he would send the bank details separately straight after. The sound of his voice, on top of the message landing in Sally’s existing WhatsApp thread with Harry, was enough to convince her that the request was genuine. Within 16 minutes of the initial message, £250 had been sent to my personal account.