In 2024, a finance employee at a multinational company was tricked into transferring US$25 million after a video call with deepfake versions of the company's CFO and other executives. This is no longer science fiction: AI voice and video cloning fraud is here, and it's targeting ordinary families.
How AI Voice Cloning Works
Modern AI tools can clone a person's voice from as little as 3 seconds of audio. Scammers harvest voice samples from social media videos, voicemail greetings, or public recordings. Then they call a target — often a parent or grandparent — impersonating a family member in crisis.
The "Grandchild in Trouble" Call — Evolved
The traditional grandparent scam relied on emotional manipulation. Now, scammers use AI to make the call sound exactly like your grandchild's voice, panicking victims into wiring money immediately.
Red Flags of AI Voice Fraud
- An urgent request for money via wire transfer, gift cards, or cryptocurrency.
- A request to keep the call secret from other family members.
- The caller insists on staying on the phone while you send money.
- The voice sounds slightly "off" or robotic, especially under pressure.
- The call comes from an unknown number, even though the voice sounds familiar.
How to Protect Yourself
- Create a family code word — A secret word only family members know. Anyone claiming to be family in an emergency must say the code word.
- Always hang up and call back — Call the family member directly on a number you already have saved.
- Never send money based on a phone call alone — No matter how real the voice sounds.
- Limit public voice samples — Be mindful of posting videos or audio recordings that include family members' voices on social media, especially those of elderly relatives.
- Report to the FTC at reportfraud.ftc.gov.
Sources: FTC; FBI IC3; Reuters (Hong Kong deepfake scam report, 2024).