It’s a nightmare scenario: You receive a phone call from an unknown number and, when you pick it up, it’s the voice of your loved one frantically telling you they’ve been kidnapped. Although your first instinct is to call the police, they tell you that their kidnappers will kill them soon unless you immediately wire a large sum of money to a bank account. There’s no time to think. What do you do?
Scary as that is, the person on the other end of that call might not be your loved one at all. In fact, it might not even be a person, but an artificial intelligence clone of your loved one's voice, created to trick you into sending money.
In the past year, these voice scams have exploded as deepfake technology has become more powerful and widespread. They're staggeringly simple to pull off, too: All a scammer needs is a short clip of someone's voice. From there, the AI can replicate it to say practically anything.
Such technology has already been used to scam innocent people out of their livelihoods. On at least one occasion, it was even used to steal $243,000 from an energy company. As AI advances at a breakneck pace, these tools will only become more powerful, and more dangerous.
That’s why researchers at Washington University in St. Louis have developed a tool called AntiFake that they say can prevent even the most common AI deepfake software from replicating your voice. A paper detailing their findings was presented at the Association for Computing Machinery’s Conference on Computer and Communications Security in Denmark on Nov. 27.
“Our motivation was to take a proactive approach to [voice deepfakes],” Zhiyuan Yu, a speech and AI researcher at Washington University and co-author of the paper, told The Daily Beast. “We saw a lot of scammers pretending to be people’s parents or friends to conduct fraud. Or people trying to use AI to pretend to be presidents and declare war.”
AntiFake works by encoding a layer of noise over your voice, making it harder for AI deepfake tools to process it properly. So instead of cloning your voice, the software produces a distorted version of you speaking, making it much easier for other people to tell that it’s not you.
“We add noises to the original samples in such a way that it will not ruin the original usability of the samples,” Yu explained. “When the processed samples are used for AI synthesis, the synthesized speech will be unlike the original person.”
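In machine-learning terms, this is an adversarial perturbation: noise tuned so that humans barely notice it, but voice-cloning models latch onto it and produce the wrong voice. As a rough illustration only, here is a minimal sketch of that general idea in PyTorch, assuming a hypothetical differentiable `encoder` that maps audio to a speaker "fingerprint." The function names, loss, and parameters below are illustrative assumptions, not AntiFake's actual implementation.

```python
import torch
import torch.nn.functional as F

def protect(waveform: torch.Tensor, encoder: torch.nn.Module,
            steps: int = 100, epsilon: float = 0.002,
            lr: float = 1e-3) -> torch.Tensor:
    """Return a lightly perturbed copy of `waveform` whose speaker
    embedding no longer matches the real voice, while keeping the
    change small enough (bounded by epsilon) to sound natural.
    Illustrative sketch only; not AntiFake's actual pipeline."""
    original_emb = encoder(waveform).detach()   # the real voice's "fingerprint"
    delta = torch.zeros_like(waveform, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        perturbed_emb = encoder(waveform + delta)
        # Minimize similarity, pushing the perturbed fingerprint
        # away from the real one.
        loss = F.cosine_similarity(perturbed_emb, original_emb, dim=-1).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)     # keep the noise near-inaudible
    return (waveform + delta).detach()
```

A cloning tool fed the protected audio would then extract a fingerprint that doesn't belong to the real speaker, so its synthesized speech comes out sounding like someone else.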
The team tested the tool on more than 60,000 voice samples, which were run through different deepfake programs and three different speaker verification systems. They found that AntiFake offered a protection rate of more than 95 percent, meaning it stopped the deepfake software from effectively cloning a person’s voice in the vast majority of cases.
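In plain terms, the protection rate is just the fraction of protected samples whose clones fail to pass as the real speaker. A minimal illustration, where the function name and the example numbers are assumptions for demonstration rather than figures from the paper:

```python
def protection_rate(clone_rejected: list[bool]) -> float:
    """clone_rejected[i] is True if the AI clone of protected sample i
    was rejected by speaker verification (i.e., the attack failed)."""
    return sum(clone_rejected) / len(clone_rejected)

# Hypothetical run: 9,600 of 10,000 cloning attempts rejected -> 0.96
print(protection_rate([True] * 9_600 + [False] * 400))
```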
Before you head to Apple’s App Store or Google Play to download AntiFake, you should know that it’s not available as an app quite yet. Instead, curious and more tech-savvy users can download the source code for free.
Yu said that there are plans to eventually build and release an app version of AntiFake that users will be able to put on their mobile devices and computers. Every time you go to speak on the phone or record your voice, you would simply open the app, press a few buttons, and voila: Your voice is more protected from AI.
There are a number of limitations to the tool, Yu explained. For one, it’s not “future proof,” meaning that cybercriminals may one day develop tools and workarounds that get past AntiFake.
Additionally, the noise AntiFake adds to your voice can actually be removed using AI. Doing so is no doubt more tedious for scammers and makes the process more complicated, but it can be done. “That reflects that our defense isn’t bulletproof,” Yu said. “However, we aim to further improve upon it for sustained protection.”
The fight against cybercrime is not getting easier, especially at a time when it seems like there are groundbreaking developments in AI with each passing day. But tools like AntiFake get us one step closer to a future where we can silence the scammers for good.