By now, everyone has heard of deepfake scams. With the rise of AI, the ability of scammers to pose as a trusted partner, a legitimate business, or even a family member is a real concern.
Generative AI lets scammers create a fake phone call, or even a video, impersonating someone in immediate trouble. Other times, the fake call might not be from someone claiming to be in trouble, but it uses an AI-generated voice that sounds just like someone you know and trust, asking you to make a certain purchase or move some money around.
AI can capture someone’s voice, speech patterns, and even mannerisms with striking accuracy. Likewise, fraudsters can craft deepfake emails so convincing that you drop all defenses, respond to a fabricated question or crisis, and turn over your information.
Regions Bank is working to raise awareness of these scams. The more people recognize the threat, the better they are prepared to keep their money safe.
“Deepfakes will continue to grow as criminals master the technology, but there is some good news,” explained Jeff Taylor, head of Commercial Fraud Forensics for Regions.
“The good news is the same rules of protecting your money still apply. That is, never give out sensitive information to an unsolicited caller. If someone is pretending to be your bank, your relative, or your friend and saying they suddenly need money because they’re in trouble or they need your password or other information, we recommend you stay silent. Contact them on your own at their verified phone number, not just some number you see on Caller ID or a search engine. And tell them what’s happening.”
But what if a scammer has managed to forward, say, a relative’s phone number to their own phone as part of their scheme?
Taylor added that many families, aware of the rise in deepfakes, have adopted a hard-to-guess “family password” known only among themselves. He pointed to a colleague whose high school-aged son came home one day talking about deepfakes. The son suggested that if the family settled on a unique “password” (and certainly not a password any of them use on their accounts), they could spot a deepfake call.
“The idea was, if the mom got an AI call with a voice sounding like her son, and he claimed he was in trouble and needed cash, she could ask him what the family password is,” Taylor said. “Because the family had taken the time to discuss deepfakes – and come up with a unique password no one would be likely to guess – they’d be more likely to spot a deepfake call.”
More Ways to Spot a Deepfake
“Deepfakes are becoming increasingly sophisticated and harder to detect,” said Sam Kunjukunju, vice president of consumer education for the American Bankers Association Foundation. “The FBI and the American Bankers Association provide practical tips to help consumers recognize red flags and protect themselves from these deceptive schemes.”
Look for inconsistencies that don’t quite add up. Ask yourself:
- Are any of the facial features blurry or distorted?
- Does the person blink too much or too little?
- Do the hair and teeth look real?
- Are the audio and video out of sync?
- Is the voice tone flat or unnatural?
- Do you see odd shadows or lighting that don’t match the setting?
Scams Are Surging
Imposter scams are exploding in the AI era. Since 2020, the FBI has received more than 4.2 million reports of fraud, representing $50.5 billion in losses.
Still, there are common red flags to watch for in a deepfake scam:
- Emotional manipulation involving fear or urgency
- Unexpected requests for money, passwords, personal information, or secrecy
- Messages or calls that sound “off” or uncharacteristic from someone you know
Tips to Stay Safe
- Stop and think. Is someone trying to scare or pressure you into sending money or personal information?
- Verify legitimacy. Use trusted numbers, official websites and online search tools before responding.
- Create codewords. Establish phrases with family or friends to confirm identity.
- Limit your digital footprint. Every photo, voice clip or video you post can be used to train deepfake models.
- Don’t repost blindly. Verify the source before sharing videos or images.
Reporting Scams
If you suspect you’ve been targeted, take the following steps:
- File a report with the FBI at IC3.gov
- Notify your bank immediately if you sent money
- Contact your local police
“Implementing the right controls can help protect you from becoming a victim,” added Regions’ Taylor. “One of the simplest but most effective safeguards is Regions Bank’s Stop, Call, Confirm method. Pause before reacting; call a trusted number; ask the right questions; and confirm the request is real. Taking that extra step can make all the difference in stopping scammers in their tracks.”
The information presented is general in nature and should not be considered legal, accounting, or tax advice. Regions reminds its customers that they should be vigilant about fraud and security and that they are responsible for taking action to protect their computer systems. Fraud prevention requires a continuous review of your policies and practices, as the threat evolves daily. There is no guarantee that all fraudulent transactions will be prevented or that related financial losses will not occur. Visit regions.com/STOPFRAUD or speak with your banker for further information on how you can help prevent fraud.