
PSA: AI voice cloning and call spoofing create scarily convincing scams – here’s how to protect yourself

As technology advances, scams are becoming increasingly sophisticated. One of the latest threats is AI voice cloning, which a malicious party can use to make it seem like they have a loved one held hostage. Mix that with caller ID spoofing, and you get a very scary and convincing scam that can impact users on iPhone, Android, and really any phone. Read on for more details and how to protect against AI voice clone and caller ID spoofing scams.

Caller ID spoofing

Over the years, caller ID spoofing or number spoofing has become more of an issue. This is when an attacker is able to place a call and fake the incoming phone number, making it appear that someone you know or trust is calling.

For example, it can look like your sister is calling, complete with her contact image appearing on your phone. Scammers using this technique may claim they have your loved one hostage, demand that you don’t call the police, and insist that you send money immediately. These attacks may include background sounds like muffled cries to make the situation feel very real and threatening.

Another tactic is saying your loved one was in an accident and you need to send money right away.

How to protect against caller ID spoofing scams

  1. If you’ve already answered, a fast way to know if a caller has spoofed a loved one’s number is to hang up and call that person’s number from your phone – it’s very unlikely the attackers can intercept your outgoing call to the actual number
  2. Ask a challenge question – something only your loved one would be able to answer (don’t ask a question to which the answer could be found on social media, online, etc.)
  3. The FCC recommends never answering calls from unknown numbers – let them go to voicemail – but of course, the power of spoofed calls is that they appear to come from someone you know
  4. Don’t share your phone number on social media, online, etc. if possible, and encourage loved ones not to share their numbers publicly

Caller ID spoofing is a common enough problem that the FCC has a support document with more tips, including never giving out personal information if you answer a call like this.

AI voice clone scams

Even though caller ID spoofing can be very convincing and scary, the next scam is more terrifying. Attackers are using AI voice clone attacks in the wild which are incredibly realistic.


Just this month, an Arizona mother received a call from an unknown number, and the voice on the line sounded exactly like her daughter crying for help. The scammer then got on the line and threatened to hurt her daughter if the mother didn’t hand over ransom money.

Fortunately, she had friends around who were able to confirm her daughter was safe within four minutes, which made her realize it wasn’t actually her daughter on the phone. But the accuracy of the AI voice clone really shook her. “It was completely her voice. It was her inflection. It was the way she would have cried,” she said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

How to protect against AI voice clone scams

  1. Ask a challenge question or even two – something only your loved one would be able to answer (e.g. name your childhood stuffed animal, etc.)
    • Remember, don’t ask a question to which the answer could be found on social media, online, etc.
  2. If possible, have someone else call or text the person the scammer claims needs help
  3. Letting unknown numbers go to voicemail may help, but if the attackers are able to leave a voicemail with your loved one’s voice, it could sound real
  4. Set your social media profiles to private – many attackers look for voice samples from public social media profiles to generate the convincing AI voice clone
    • It’s believed that as little as three seconds of someone’s voice is needed to create a realistic clone
  5. Don’t share your phone number on social media if possible

Unfortunately, AI voice cloning is becoming common enough that the FTC has shared a warning about it.

AI voice cloning plus caller ID spoofing

Sadly, it’s probably just a matter of time before advanced attackers use both of these tactics simultaneously to create an even more convincing threat. That would let them call you from what appears to be a loved one’s phone number while making it sound exactly like that person’s voice on the other end.

Another terrifying evolution of this could include AI deepfake video of a loved one. The same steps shared above will help protect you and your loved ones.

Thanks for reading our guide on how to protect against AI voice clone and caller ID spoofing scams, stay safe out there!


