In a world where technology is booming, people are reaping many benefits. However, we cannot ignore the dark side of these innovations, such as deepfakes, which can be used for fraud and exploitation.
A deepfake is media created with artificial intelligence (AI) to show fake or falsified content. Specifically, deepfakes allow people to create surprisingly realistic imitations of a person’s appearance or voice.
This makes it even easier for cybercriminals to manipulate users into handing over confidential data in a variety of situations, including on social media platforms. Deepfakes could even be used to create convincing messages that appear to come from a loved one.
Scammers can also harvest information from social media, including photos, addresses, and contact numbers, and then use it to approach their targets. They will mention something personal or related to you to convince you that they know you personally.
This article discusses how deepfakes are used for financial scams, often against older adults, while highlighting common scams and protection measures.
Why Are Deepfakes Significant?
According to a 2022 FBI report, 88,262 cases of fraud were reported in which the victims were over 60 years of age, with total losses of approximately $3.1 billion. According to the New York Times, victims of this fraud are mostly aged 70 or older, losing an average of almost $42,000.
These stats show that such fraud is already a huge issue. Increased effectiveness and availability of deepfake technology may lead to even more financial fraud and perhaps higher losses for seniors.
The newness of the technology is a notable concern. Most people are accustomed to trusting what they see and hear, but deepfakes mean that anything could potentially be falsified. Seniors may find this particularly hard to grasp and guard against, especially if they are not very tech savvy.
Stay Alert for These Common Scams
AI-Generated Voice Messages
In this modern era, it has become quite simple to replicate someone’s voice. This is most commonly done through a pre-recorded message. However, some technology does allow for real-time voice replication that can be surprisingly convincing.
Such practices are deeply concerning, given that many scams can be conducted over the phone.
Robocall Scams
Robocalls are pre-recorded messages commonly used by political campaigns, telemarketing companies, and other organizations. They can also be switched over to live agents if someone needs to talk. However, such calls can also be used for fraudulent activities, and tracing them is difficult because scammers use spoofing tools to disguise a call’s origin.
A recent example involved robocalls imitating President Joe Biden, which aimed to discourage people from voting by telling them to “save your vote for the November election”. Two Texas companies have been implicated, and an investigation is underway to trace the calls.
Don’t be fooled. This same technique can be used to get political campaign donations from the public.
Fake Emails or Texts
When attackers send fake emails or texts, they often assume the identities of reputable company owners to appear trustworthy.
Then, they contact the company’s employees and other individuals, requesting money transfers to the fake accounts they typically use for such purposes. If anyone asks about the company’s usual accounts, the scammers claim those accounts are blocked or malfunctioning.
Deepfake technology provides extra opportunities for such scams, potentially allowing the scammer to create video or audio in which they seem to be the company owner. The scammer could use such technology to address the target directly, making their claims much more convincing.
Face-Swapping During Video Calls
Scammers may use face-swapping technology to impersonate individuals during Zoom meetings, gaining the victim’s trust and requesting money transfers to a designated account.
This approach is a little different from other forms of deepfake, as it doesn’t require pre-training a model. Because of this, the scammer can be more responsive to their target.
The technology for live face swapping isn’t well-developed yet, and the scam is often easy to spot. Still, it may be enough to fool some seniors, especially if their eyesight is poor or they don’t know that face-swapping technology exists.
Romance Scams
Scammers create fake profiles on dating websites or social media, targeting older adults seeking companionship and luring them into romantic entanglements. Once trust has been established, the scammer asks for money, either for personal expenses or to travel to meet the victim.
Romance scams often rely on text-based or phone conversations so that the scammer doesn’t give themselves away. However, the rise of deepfake technology is also allowing scammers to create fake videos that make them more convincing than ever before.
This is deeply concerning, especially as lonely seniors are easy targets for romance scammers. It’s surprising how many people fall for these scams, even when there are clear signs that things aren’t what they seem. Part of the reason may be that lonely people want the romance to be real, so they ignore the warning signs.
Deceptive Customer Support Scheme
Posing as representatives of a specific company or organization, scammers request sensitive information such as user IDs or passwords under the guise of offering giveaways or special deals.
One common tactic is to appear as if they are protecting your bank account from potential scams when, in truth, they are the scammers. Legitimate organizations typically do not request this information. This is another approach that can be conducted with or without the use of deepfake technology.
Grandparent Scam
This is often regarded as an emotional scam: attackers call an older person posing as a grandchild who hasn’t been in touch for weeks or months and urgently needs money due to a distressing situation, such as a medical emergency, being in jail, or being kidnapped.
Only later does the victim learn that it was not their grandchild at all, just a scam.
Such scams can be very convincing, as they tug on people’s heartstrings, making them less likely to be rational. Victims may give money immediately, especially if there’s a sense of urgency, without verifying the situation first.
Deepfakes make this issue so much worse, as scammers are able to make it sound like a family member truly is in trouble.
A recent example of this approach involved a woman from Arizona named Jennifer DeStefano. She got a call, and the voice was identical to that of her teenage daughter, Briana, crying and begging for ransom money to free her from kidnappers. Thankfully, the real Briana contacted her mother before any ransom was paid, but the situation could have ended very differently.
Protection and Awareness
Financial scams are becoming more common, making it crucial to raise awareness about them and protect against them. Older people, who are often targeted due to their trusting nature, should take precautionary measures to avoid losing their savings.
- Recognizing AI-generated voices can be tricky, as they closely resemble human voices. However, there are some potential giveaways: they may sound flat and emotionless, butcher or elongate some words, or there may be static in the background. Issues may be particularly noticeable in emotional responses.
- Con artists try to obtain sensitive or personal information by claiming to be a relative, a friend, or a representative of an organization you deal with. Keep this information to yourself.
- Scammers often pressure the target to act quickly, claiming that immediate action is required or you will miss out on an opportunity or suffer significant losses.
- If you have any doubts, ask specific questions that only the real person would know the answers to.
- You can also try contacting the individual or organization using contact information you already know to be genuine.
Some Valuable Tips
Stay Informed
Education and knowledge about these financial scams are crucial in protecting oneself and loved ones. Remind older adults to avoid sharing personal information with strangers, and advise them to discuss any concerns with a trusted family member before acting.
Use Safe Words
It is essential to have a safe word known only to your family members. If you get a call from someone claiming to be your grandchild, ask for the safe word. Be alert if the caller doesn’t know it; the call is likely a scam.
Verify Before Responding
Avoid quickly responding to voice messages from unknown numbers. Listen to them carefully, verify their authenticity, and only then consider responding.
You might try looking up their phone number online to see if it’s already been flagged as spam by others. If they claim to be from a known service provider about your account, call that company’s known customer service line to verify.
Avoid Clicking on Unnecessary Links
As a general rule, avoid clicking on links you receive through emails or text messages. Many people click on links instinctively, and this habit needs to change. First make sure the email is legitimate rather than fabricated to look legitimate.
So, if you get an email that seems to come from your bank with a link, don’t click on the link. Instead, go to your bank’s website and log in as you normally would (or call them).
If, for any reason, you do click on a link – make sure you don’t enter any personal details.
Use Multi-Factor Verification
For enhanced account security, it’s best to set up two-step verification and a strong password. This makes it much harder for anyone else to access your accounts. You should also regularly monitor your bank accounts and statements for suspicious activity.
When seniors can’t do this themselves, a family member may need to keep an eye on their accounts instead.
Look For Fishy-Looking Email Addresses
Email addresses with long strings of numbers, random lowercase letters, or numbers where letters should be are red flags. The subject line may not make sense, or the sender may use a public email provider, such as Gmail or Hotmail, where a more authoritative address is expected.
In particular, government departments and most businesses won’t use gmail.com or hotmail.com email addresses. So, an email claiming to come from a government agency but sent from a Gmail address is almost certainly a scam.
Conclusion
In a world inundated with technological advancements, the rise of deepfake technology poses a significant threat, as explored in this article. From financial scams targeting vulnerable elders to sophisticated impersonation tactics using AI-generated voices and video calls, the methods employed by scammers continue to evolve. Their misuse of deepfakes cannot be overlooked, and protective measures against such scams are imperative.
Feeling Overwhelmed?
Check out our Caregiving Consulting service for personalized support and guidance.