YOUR VOICE IS NOW A WEAPON: HOW DEEPFAKE VOICE SCAMS ARE TARGETING FAMILIES
Scammers can clone your voice from just 3 seconds of audio. One in four adults has already been targeted. Here's the one rule that can protect your family.
In July 2025, Sharon Brightwell got a call from her daughter.
Crying. Panicked. Said she'd been in a car accident and lost her unborn baby.
She needed $15,000 for legal fees. Immediately.
Sharon wired the money within hours.
It wasn't her daughter. It was an AI clone of her daughter's voice, generated from clips scraped off social media.
This isn't science fiction. It's Monday.
The terrifying math
Scammers now need just three seconds of your voice to create an 85% accurate clone. A voicemail. A TikTok. A work webinar.
That's all it takes.
One in four adults has already encountered an AI voice scam. And 77% of those hit by one lost money.
The problem? Our brains are wired to trust familiar voices. When it sounds like your kid, your boss, or your CFO, rational thinking shuts down.
The callback rule
So here's what I call the callback rule:
If anyone calls asking for money or sensitive info, hang up. Call them back on a number you already have saved.
Not the number they called from. Not the number they give you. A number you trust.
Your one takeaway
Create a family safe word for emergencies that only real family members would know. Agree on it today. Ask for it any time someone calls with an urgent request.
The scammer has your voice. They don't have your safe word.
Share this with your parents. They're the most targeted, and the least prepared.