Hear a Loved One’s Voice Calling for Help? It Might Be a Scam
Imagine taking a phone call from a loved one who pleads for your help. It is likely you would be moved to do just about anything to get that person out of danger.
But thanks to artificial intelligence, that call might not be what it seems.
The Washington Post reports that rapidly advancing technology is making it possible for fraudsters to better mimic voices. The crooks use the technology to dupe people — often seniors — into believing that loved ones are in harm’s way.
According to the Federal Trade Commission, these so-called “imposter scams” were the second-largest reported type of fraud in 2022, with losses totaling $2.6 billion.
The Post reports that the fraudsters who perpetrate imposter scams are becoming increasingly convincing in their attempts to mislead:
“Advancements in artificial intelligence have added a terrifying new layer, allowing bad actors to replicate a voice with just an audio sample of a few sentences. Powered by AI, a slew of cheap online tools can translate an audio file into a replica of a voice, allowing a swindler to make it ‘speak’ whatever they type.”
The software used in these scams analyzes many aspects of a person’s voice, including pitch and timbre, to create a convincing audio forgery, the Post reports. Fraudsters can find the audio samples they need to create their scams on social media sites such as YouTube, in podcasts and in videos people post on TikTok, Instagram or Facebook.
Remaining vigilant is the best way to keep yourself from falling prey to such a scam. According to the Post story, some of the things you can do are:
- Put the call on hold if it sounds like you are talking to a family member or friend. Then, place a separate call to the person you are supposedly speaking with.
- Don’t give assistance in the form of gift cards, which can be difficult to trace.
- Refuse to send cash.