Deepfake is a mashup of the words deep learning and fake. The technology uses artificial intelligence and deep-learning algorithms to create convincing representations of people. That can be harmless fun for special effects or silly videos, but fake videos and images can also be dangerous.
Malicious deepfakes can spread false information, defame people, or scam them. That last one is what we want to talk about in more detail here: deepfake voice scams.
What is a deepfake voice scam?
This type of scam uses synthesized speech to convince you someone is saying something they never actually said. This increasingly common scam tricks you into providing sensitive information or sending money.
Criminals first collect a voice sample of the person they want to imitate. They might use speeches, TikTok or YouTube videos, podcasts, or phone conversations. Then, they turn to a tool such as ElevenLabs, Resemble, Overdub, ReadSpeaker, or Voice.ai. These platforms analyze speech patterns and create a voice mimicking the original. The bad actors can then generate new speech that sounds as if the original speaker said it. They write the script, and the AI voice reads it.
Examples of deepfake scams include creating a voice that mimics a family member, then scripting a plea for help in an emergency. Or you might get a call from a "lawyer" claiming to need payment to defend a family member.
You might also hear from a celebrity who wants you to donate to their charity. The fake voice might also ask for sensitive information such as banking details. After all, who wouldn’t trust Liam Neeson if he called personally?
A tech support scam is another common variant. The scammer creates a voice for a customer support rep from a prominent company and requests remote access to your computer to "fix" a non-existent problem. Instead, they steal sensitive information such as login credentials, or install malware.
How can you defend against deepfake scams?
Voice-cloning technology has become remarkably accurate, and these scams can be very convincing. Be cautious of unexpected phone requests for personal information or money. Be especially suspicious if the request makes an emotional appeal and pressures you to act now.
Verify before you share sensitive data or transfer money. For example, if you're asked to pay a lawyer to help your grandson after an accident, check in with your grandson first. Or, if someone calls claiming to be from your internet service provider, hang up and call back on a trusted phone number to confirm the request is genuine.
We can help you combat deepfake scams. We can install email and web filtering, multi-factor authentication (MFA), and endpoint protection. Our IT experts can also monitor networks for signs of attack and respond to limit potential damage. Call us today at (888) 234-WDIT(9348).