We recently wrote about the danger of impersonation scams and highlighted how generative AI is making them more convincing, in particular ‘vishing’, which centres on telephone-based scams.
A more recent development, particularly worrying for professional and financial services firms, is voice cloning. This might sound like something from the Terminator, but with the latest AI tools it is in fact remarkably easy to do, and there have already been well-documented cases where it has been used to defraud businesses of very significant sums of money.
Are people really fooled by AI voices?
A 2023 study by security company McAfee found that AI voice-cloning tools can replicate a person’s voice with up to 95% accuracy, and the evidence suggests that telling the difference is getting harder and harder as these tools improve.
If you are not convinced, try listening to this recording of the author’s voice alongside this AI-cloned version made using free tools on the internet. Paid tools are even more accurate and need only a couple of minutes of sample audio of the voice being cloned.
To reinforce how accurate this technology has become, the BBC successfully fooled online banking security systems using AI-cloned voices as part of its Scam Safe Week.
So how do AI voice scams work?
Imagine getting a call from a financial advisor about investing more money in a scheme they are running for you. Time is limited and you are being given a last opportunity. You recognise the caller – it’s someone you have spoken to several times before from a firm you trust. In this case, however, the person on the other end of the line is actually a criminal. They obtained your details through a successful cyber attack on your financial advisor and used a video your advisor posted on LinkedIn to clone their voice.
This unfortunately is no longer science fiction; there are publicly available tools on the internet which can clone voices from recordings with worrying accuracy. Whilst there are many legitimate uses for this technology, the potential for criminal use is clear, and it is already happening at scale. The McAfee study found that 25% of respondents had experienced, or knew someone who had experienced, a voice-cloning scam, with nearly three quarters of those targeted being successfully duped and suffering some kind of loss.
What can you do about it?
AI voice-cloning scams target both businesses and individuals. From a personal perspective, agree ‘code words’ with family members or trusted friends who might ask you to transfer money or deal with other confidential matters.
From a business perspective, code words can also help with internal company authorisations. More important still, however, is having robust multi-step authorisation processes for financial transactions. Where possible, authorise through more than one channel, or call the person back on a number you already know rather than the one they called from.
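To make the idea concrete, here is a minimal Python sketch of a dual-control check: a large payment is released only once two different people have approved it over two different channels. The names, threshold and channel labels are illustrative assumptions, not any particular firm’s system.

```python
from dataclasses import dataclass, field

@dataclass
class Approval:
    approver: str  # who signed off
    channel: str   # how they confirmed, e.g. "portal", "callback", "in_person"

@dataclass
class PaymentRequest:
    payee: str
    amount: float
    approvals: list[Approval] = field(default_factory=list)

def can_release(request: PaymentRequest, threshold: float = 10_000.0) -> bool:
    """Small payments need one approval; anything over the threshold needs
    two distinct approvers confirming over two distinct channels."""
    if request.amount <= threshold:
        return len(request.approvals) >= 1
    approvers = {a.approver for a in request.approvals}
    channels = {a.channel for a in request.approvals}
    return len(approvers) >= 2 and len(channels) >= 2

# A phone call alone is never enough to move a large sum.
request = PaymentRequest(payee="Acme Ltd", amount=25_000.0)
request.approvals.append(Approval("j.smith", "callback"))
print(can_release(request))  # False - needs a second person on another channel
request.approvals.append(Approval("a.jones", "portal"))
print(can_release(request))  # True
```

The point of the design is that no single voice on a phone line, however convincing, can authorise a transfer on its own.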
To protect your clients, make it clear, both at onboarding and throughout your relationship, that you will not accept instructions over the phone. Use a secure client portal and robust authorisation methods. Software solutions can provide secure authorisation via mobile phones, although manual methods can achieve the same result.
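As an illustration of how such a software solution can work under the hood, the sketch below generates a standard time-based one-time code (TOTP, RFC 6238, the same scheme used by common authenticator apps) from a secret shared with the client’s phone; the client reads the current code back before an instruction is accepted. This is a simplified sketch using only the Python standard library, not a recommendation of any particular product.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive an RFC 6238 time-based one-time code from a shared secret.
    The code changes every `interval` seconds, so a criminal who overhears
    one code cannot reuse it later."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(time.time()) // interval)
    mac = hmac.new(key, counter, "sha1").digest()
    offset = mac[-1] & 0x0F  # dynamic truncation, as specified in the RFC
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# The same secret is provisioned to the client's authenticator app,
# so firm and client independently compute matching codes.
secret = base64.b32encode(b"example shared secret").decode()
print("Ask the client to read back:", totp(secret))
```

Because the code is derived from the clock and a secret that never travels over the phone line, a cloned voice alone cannot produce it.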
Above all, be vigilant: if something feels even slightly suspicious, treat it as a scam, hang up, and call the person back on a known number. Remember that these scams often target people whose voice recordings criminals can find online, and these are often senior people in organisations. Don’t let their seniority stop you from questioning the legitimacy of the call.
For a comprehensive review of where you are at risk from cyber scams, including your vulnerability to voice-cloning attacks, get in touch with our team on 0330 124 3599, where you will speak to a real person who can help you.