Imposter Scams


Artificial Intelligence (AI) is growing by leaps and bounds, and so are the risks associated with it.

In 2022, impostor scams were the most commonly reported type of fraud in America, with over 36,000 reports of people being swindled by scammers pretending to be friends and family (Federal Trade Commission, Feb. 23, 2023).

In these scams, bad actors use technology to mimic the voices of family members or friends, often convincing people that their loved ones are in danger or distress in order to con them out of potentially thousands of dollars.

"The man calling Ruth Card sounded just like her grandson Brandon. So when he said he was in jail, with no wallet or cellphone, and needed cash for bail, Card scrambled to do whatever she could to help."
(Washington Post, 3/30/23: "They thought loved ones were calling for help. It was an AI scam.")

In the story above, shared by the Washington Post, Ruth rushed to the bank to withdraw thousands of dollars, only to later learn she had been duped by someone who had cloned the voice of her grandson, Brandon. Ruth told the Washington Post, “We were convinced that we were talking to Brandon.”

Advances in technology allow voices to be replicated from an audio sample taken from almost anywhere, including Instagram, YouTube, Facebook, or TikTok. A few recorded sentences are all that's needed to make a cloned voice "speak" whatever the bad actor would like!

So how can you protect yourself from this scam? The FTC shares the following tips:

  • Be constantly vigilant.
  • If a loved one tells you they need money, put that call on hold and try calling your family member separately.
  • If a suspicious call appears to come from a family member’s number, understand that caller ID, too, can be spoofed.
  • Never pay people in gift cards, because those are hard to trace, and be wary of any requests for cash.