Fraudsters, scammers and AI tricksters… oh my! While these things aren’t lions, tigers and bears, they are pretty scary. AI scams have become the latest tactic used to deceive people, relying on voice clones, realistic videos and personalized phishing messages. These tools make malicious intent extremely hard to detect, and receiving one of these messages can cause significant concern. Let’s take a moment to discuss the types of AI-powered scams happening:
AI-Enhanced Phishing: Generative AI can help scammers create more convincing phishing emails, messages and fake websites that appear to be from legitimate organizations.
AI Chatbot Impersonation: Scammers use AI-driven chatbots to pose as customer service representatives or company officials, often creating real-time conversations to gain trust and obtain sensitive information.
Deepfakes: Scammers use AI to generate realistic videos or images of real people. These deepfakes can be used in video calls to impersonate executives or celebrities to spread disinformation.
Voice Cloning: AI can replicate a person’s voice from a short audio sample to trick victims into believing a loved one is in distress and needs urgent financial help.
While these are the more common scams, AI is still evolving, and when deceptive individuals get their hands on these tools, they use them for personal gain. Here are some ways to protect yourself:
- Verify urgent requests – It never hurts to confirm that a request is legitimate. Hang up and call the person back on a trusted number, or ask a question that only the real person would know the answer to.
- Watch for inconsistencies – On voice calls, listen for unusual inflections, odd word choices or strange background noises. On video calls, look for unnatural movements, strange blinking or inconsistent lighting. With texts and emails, be cautious of strange phrasing or requests for unusual payment methods such as gift cards or cryptocurrency.
- Limit online sharing – Reduce the amount of personal information you share on social media. Scammers often use this data to create more personalized and believable scams.
- Enable multi-factor authentication – This will add an extra layer of security to your online accounts, making it harder for scammers to gain access.
- Trust your instincts – If it sounds too good to be true, it probably is. Use your best judgment, and if an interaction feels off, hang up.
While there are many ways to protect yourself, anyone can still fall victim to these tactics. If you have been the victim of an AI-powered scam, act immediately with the following steps:
- Call your bank if you believe the scammer has access to your bank information or online banking logins, and change all compromised passwords immediately.
- Place fraud alerts with the three major credit bureaus (Equifax, Experian and TransUnion).
- Report the scam by filing a report with the Federal Trade Commission at reportfraud.ftc.gov.