AI-generated voice scams use voice-cloning technology to create realistic imitations of a specific person’s voice, often from only a short sample of recorded audio. Scammers use these voice replicas to impersonate relatives, friends, or financial institutions during phone calls and deceive unsuspecting victims.
The goal is often to trick older adults into revealing personal information like Social Security numbers, bank account details, or passwords, or to persuade them to transfer money or make fraudulent purchases.
Detecting these scams can be difficult because synthesized voices can sound remarkably similar to real ones. However, there are several red flags to watch for:
- Unsolicited Calls: Be wary of unexpected calls, especially if the caller claims to be a relative or friend in distress or a representative from a known organization asking for sensitive information or urgent action.
- Pressure Tactics: Scammers often use high-pressure tactics to elicit a quick response, claiming immediate action is necessary to resolve an urgent issue or prevent a disaster.
- Inconsistencies in Story: Pay attention to discrepancies in the caller’s story, such as changes in details or unusual requests that seem out of character.
- Verification: When in doubt, verify the caller’s identity by asking a question only the real person could answer, or by hanging up and contacting the individual or organization directly using a known, trusted phone number. (A minimal sketch of this checklist appears after this list.)
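For families or caregivers who want to write the verification step down, the sketch below models it as a short decision checklist. It is only an illustration: the `verify_caller` function, the contact names, and the phone numbers are all hypothetical, and the real safeguard is the habit (challenge question, then call back on a trusted number), not the code.

```python
# A minimal sketch of a caller-verification checklist.
# TRUSTED_NUMBERS, verify_caller, and the sample entries are
# hypothetical illustrations, not part of any real anti-fraud tool.

TRUSTED_NUMBERS = {
    "Alice (daughter)": "+1-555-0101",
    "First Example Bank": "+1-555-0199",
}

def verify_caller(claimed_identity: str, passed_challenge: bool) -> str:
    """Decide how to handle a caller who claims a known identity.

    passed_challenge: True only if the caller answered a question
    the real person would know (e.g., a pre-agreed family code word).
    """
    if not passed_challenge:
        # Never act on the live call; call back on a number you already trust.
        callback = TRUSTED_NUMBERS.get(claimed_identity)
        if callback:
            return f"Hang up and call back at {callback} before acting."
        return "Hang up; you have no trusted number for this caller."
    return "Identity plausibly verified; still avoid sharing sensitive data."

# Example: a caller claims to be a daughter but fails the code word.
print(verify_caller("Alice (daughter)", passed_challenge=False))
```

The key design choice is that failing the challenge never leads to acting on the live call; the only safe paths are hanging up or calling back on a number obtained independently of the caller.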
To protect against AI-generated voice scams, consider these precautions:
- Establish Verification Procedures: Agree in advance on clear procedures for verifying the identity of callers claiming to be relatives or representatives, such as a family code word that a cloned voice alone could not supply.
- Use Caller ID: Use caller ID features to screen incoming calls and flag unknown numbers. If a call seems suspicious, let it go to voicemail and review the message before responding; a simple version of this screening rule is sketched after this list.
- Stay Vigilant: Trust your instincts. If something feels off or too good to be true, proceed with caution and ask a trusted family member, friend, or your bank before acting.
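To make the caller ID advice concrete, here is a minimal sketch of that screening rule: ring only for numbers already in the contact list and send everything else to voicemail for later review. `KNOWN_CONTACTS` and `screen_call` are hypothetical names; most phones provide this behavior through a built-in "silence unknown callers" setting rather than code you would write.

```python
# A minimal sketch of the "screen unknown numbers" rule described above.
# KNOWN_CONTACTS and screen_call are hypothetical illustrations.

KNOWN_CONTACTS = {"+1-555-0101", "+1-555-0199"}

def screen_call(incoming_number: str) -> str:
    """Ring for known contacts; route everyone else to voicemail."""
    if incoming_number in KNOWN_CONTACTS:
        return "ring"       # a number you recognize
    return "voicemail"      # review the message before calling back

# Example: an unrecognized number goes to voicemail, not answered live.
print(screen_call("+1-555-0142"))  # -> voicemail
```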
AI-generated voice scams are a growing threat. Always verify a caller’s identity before sharing personal or financial information or sending money.