One of the strangest applications of deepfakes, the artificial intelligence technology used to manipulate audiovisual content, is the audio deepfake scam. Hackers use machine learning to clone someone’s voice, then combine that voice clone with social engineering techniques to convince people to move money where it shouldn’t go. Such scams have succeeded in the past, but how good are the voice clones used in these attacks? We had never actually heard audio from a deepfake scam, until now.
Security consulting firm NISOS has released a report analyzing one of these fraud attempts and has shared the audio with Motherboard. The following clip is part of a voicemail sent to an employee of an unnamed technology company, in which a voice that sounds like the company’s CEO asks the employee for “immediate assistance to finish an urgent business deal.”
The quality is certainly not excellent. Even under cover of a bad phone signal, the voice is a bit robotic. But it is passable. And if you were a junior employee, worried after receiving a supposedly urgent message from your boss, you might not think too hard about the audio quality. “It definitely sounds human. They checked that box as far as: does it sound more robotic or more human? I would say more human,” Rob Volkert, a researcher at NISOS, told Motherboard. “But it doesn’t sound enough like the CEO.”
The attack ultimately failed, as the employee who received the voice message “immediately thought it was suspicious” and flagged it to the firm’s legal department. But such attacks will become more common as deepfake tools become increasingly accessible.
All you need to create a voice clone is access to many recordings of your target. The more data you have and the better the audio quality, the better the resulting voice clone will be. And for many executives at large companies, such recordings can easily be gathered from earnings calls, interviews, and speeches. With enough time and data, the highest-quality audio deepfakes are far more convincing than the example above.
The best-known and first-reported example of an audio deepfake scam took place in 2019, when the CEO of a UK energy company was tricked into sending €220,000 ($240,000) to a Hungarian supplier after receiving a phone call purportedly from the CEO of the firm’s German parent company. The executive was told that the transfer was urgent and that the funds had to be sent within the hour. He did it. The attackers were never caught.
Earlier this year, the FTC warned of the rise of such scams, but experts say there is an easy way to beat them. As Patrick Traynor of the Herbert Wertheim College of Engineering told The Verge in January, all you need to do is hang up the phone and call the person back. In many scams, including the one reported by NISOS, the attackers use a VoIP account to contact their targets.
“Hang up and call back,” says Traynor. “Unless it’s a state actor who can redirect phone calls or a very sophisticated hacking group, that’s most likely the best way to find out if you were talking to who you thought you were.”