As soon as any new artificial intelligence (AI) tech comes out, scammers scramble to take advantage of it. Now, with AI-enabled cloning of people’s voices, likenesses, and handwriting, it’s harder and harder to tell truth from fiction. Despite these new challenges, vigilance and staying on top of the latest AI scams put you in control.
As each new egg of artificial intelligence (AI) tech hatches, cybercriminals are right there, ready to raid the henhouse.
It’s a neck-and-neck race between innovating scientists and opportunistic bad actors ready to use the tech to their advantage.
Given that this ever-evolving technology is becoming part of our daily lives, the best defense is to be aware of the latest crop of AI mimicry scams, from kidnapping schemes to forged signatures.
AI kidnapping schemes
Nothing is off-limits for scammers, and the universal fear of kidnapping is among the latest emotions to be exploited.
Preying on emotion isn’t new for scammers; it’s what social engineering scams are based on. Where love and romance scams take advantage of the human need for companionship, this new scam is solely focused on getting victims to act out of fear.
How this cruel scam typically plays out: You get a call from a number that might appear local, or it might be completely unrecognizable. You answer and hear what sounds like your child — or another loved one — asking for help, saying kidnappers have them.
Another voice then takes over and lists the terms of release, the amount of the ransom, and directions for payout and pick-up. They may even issue threats and ultimatums.
When it happened to one Arizona mother in early 2023, she kept the “kidnapper” on the line while a friend in the same room called and texted her former spouse. As it turned out, her teen daughter was safe and sound.
The perpetrators had used mere seconds of the teen’s voice to make an AI-generated voice clone good enough to trick her mom.
Unfortunately, the threat of "deepfakes" like the one used in the Arizona kidnapping scam isn't limited to voice calls, either. Deepfakes can take the form of a video, voice, handwriting, signature, and more, and can mimic a real person, brand, or company.
In fact, one bank employee in Hong Kong walked into such a trap last year. The unsuspecting employee transferred $25.6 million to various accounts at the urging of their “boss” in a video conference call. The robbers made off with the spoils and, as of March 2024, have yet to be caught.
Protect yourself from AI kidnapping scams
Before a call like this happens, set yourself — and your family — up for success.
Create a safe word or phrase your family can use to verify identities in an emergency.
Block spam calls and don’t answer numbers you don’t recognize.
Avoid posting videos of your family online publicly. Make your social media accounts private and only allow known family and friends to follow.
Ask others not to post videos of you or your children publicly on social media.
If you do get a suspicious call, take your time before reacting.
Ask to speak to the “victim” and get them to share the family safe word.
Confirm the safety and location of the person reportedly in danger. If you can’t reach them directly, call or text people most likely with them.
Report the threat to your local police and the Federal Bureau of Investigation (FBI).
AI-based forgery
For the past decade, programmers have tried to invent AI that replicates a person’s individual handwriting. In the fall of 2023, computer scientists in Dubai shared news of their success.
To test their AI, they conducted a study in which participants were asked to spot the original sample of handwritten text from a generated version. Salman Khan, a researcher on the team, said participants could not distinguish the mimicked handwriting from the actual handwriting.
While this new tech may benefit some, like those with injuries that prevent them from physically signing, it could also signal trouble.
Sign an agreement and your signature becomes the proof of your commitment to abide by the contract. The same goes for anyone who countersigns it.
AI handwriting deepfakes could spell devastating trouble for wills, property records, loans, checks, birth certificates, prescriptions, and more.
Protect yourself from AI forgery scams
Safeguard the precious commodity that is your signature and handwriting.
Refrain from posting anything with your handwriting on it publicly on social media or online.
Pay online rather than mailing signed checks.
Use a VPN and encryption to email documents with your signature on them.
Scan sensitive signed documents only on trusted, private machines.
Avoid signing petitions, especially those of new-to-you organizations.
AI phishing attacks
You’ve likely received a bogus email in which the sender tries to convince you to give them access to your personal information. The goal is typically to get your information and sell it on the dark web — or use it to pull off an identity theft crime.
Before text-generating AI, like ChatGPT, such phishing emails were easy to spot. Awkward language, misspellings, and illogical sentences usually gave the scam away.
With this technology, however, phishing has gotten a multilingual makeover, making it harder than ever to discern fraudulent emails and texts from legitimate ones.
Phishers can use AI-driven writing tools to make their messages sound more professional or omit keywords normally flagged as spam by your email provider. Also, they could scour your social media accounts and feed a few telling details into an AI chatbot, then direct it to spit out a deeply personal (and motivating) email.
Protect yourself from AI phishing attacks
Luckily, the same defenses against old-school phishing can come in handy against AI-enhanced phishing attacks.
Question any message that asks for personal data or other sensitive information.
Don't click on any suspicious links or open unsolicited attachments.
Check the sender's email address carefully, looking for any misspellings or inconsistencies.
We’re here to help you combat AI scams
AI is a powerful tool that presents endless opportunities for innovation and progress. But, given all these emerging scam trends, it's important to be proactive when it comes to protecting your identity and information.
As an Allstate Identity Protection member, you can contact our Customer Care team if you ever have any doubts about the legitimacy of a communication.
Whether AI-powered or not, we're here to help protect you from cyber scams.