Most of us have been, or know someone who has been, the target of an email scammer posing as a friend in distress who needs money wired out of town or out of the country. Now scammers are using the telephone to tell you that a loved one is in distress, and the caller may sound "just like" your friend or relative. In that moment, your instinct is to do anything to help them escape danger, including wiring money. My father was targeted by such a scam, but he called me first for advice. His "friend in trouble" was not in Scotland with a stolen wallet, a missing passport, and a lump on his head; he was at his vacation home in Florida. A quick call to that residence, and a conversation with the friend himself, foiled the scam.
Stop, think, and confirm before you do or commit to doing anything.
A recent report from The Washington Post featured an elderly couple, Ruth and Greg Card, who fell victim to an impersonation phone scam. Ruth, 73, got a call from a person she thought was her grandson. He told her he was in jail, with no wallet or cell phone, and needed cash fast. As any concerned grandparent would, Ruth and her husband, 75, rushed to the bank to withdraw the money. It was only at the second bank that the manager warned them: the bank had seen a similar case before that turned out to be a scam, and this one was likely a scam, too.
Scams like this are no longer isolated incidents. The report indicates that in 2022, impostor scams were the second most common racket in America, with over 36,000 people falling victim to calls impersonating their friends and family. Of those, 5,100 happened over the phone, robbing victims of over $11 million, according to FTC officials.
Generative AI has been in the news thanks to the growing popularity of programs such as OpenAI's ChatGPT and DALL-E, which are mostly associated with advanced capabilities that boost user productivity. But the same techniques used to train those helpful language models can be used to train more harmful programs, such as AI voice generators.
These programs analyze a person's voice for the patterns that make up their unique sound, such as pitch and accent, and then recreate it. Many of these tools work within seconds, producing audio virtually indistinguishable from the original source.
What can you do to avoid falling for the scam? The first step is knowing that this type of call is possible. As above: stop, think, and confirm before doing anything.
If you get a call for help from one of your loved ones, remember that it could be a synthetic voice talking instead. To make sure it is actually a loved one, verify the source; I would hang up the phone immediately and call the person back directly. If you stay on the line and are concerned, ask the caller a personal question that only your loved one could answer. This can be as simple as the name of your pet, a family member, or another personal fact.
You can also check your loved one's location to see if it matches where they say they are. Sharing your location with friends and family is common today, and in this scenario it can come in extra handy.
You can also try calling or texting your loved one from another phone to verify the caller's identity. You have your answer if your loved one picks up or texts back and does not know what you are talking about.
Red Sky Alliance is a Cyber Threat Analysis and Intelligence Service organization. For questions, comments, or assistance, please get in touch with the office directly at 1-844-492-7225, or email@example.com
Weekly Cyber Intelligence Briefings:
- Reporting: https://www.redskyalliance.org/
- Website: https://www.wapacklabs.com/
- LinkedIn: https://www.linkedin.com/company/64265941
REDSHORTS - Weekly Cyber Intelligence Briefings