Florida's attorney general says AI is making scam calls worse
Florida Attorney General Ashley Moody is warning that artificial intelligence (AI) is being used to carry out imposter scams, which often target grandparents.
She says AI voice-cloning technology makes these scams even more convincing.
“Never rely solely on what you’re hearing over the phone or seeing on caller ID. Ask the caller for answers only the person they are mimicking would know,” Moody says in a video on her office’s website. “Call the loved one back on their personal phone line, and immediately contact local law enforcement if you believe a loved one is in danger.”
She says fraudsters are using AI to deceive victims into believing a call is from someone they trust in order to steal personal information or money.
Here’s the press release:
TALLAHASSEE, Fla.—Attorney General Ashley Moody is launching a new Summer Scams Series, called Tech Traps, to focus on ways scammers may use technology to target Floridians. The first installment in the series covers artificial intelligence voice scams, in which scammers use sophisticated algorithms and voice-synthesis technology to mimic human voices and deceive victims into believing a phone call is from a known individual.
The Federal Trade Commission recently reported that scammers are using AI to enhance a version of the grandparent scam. Using the new technology, the scammer pretends to have kidnapped a family member in an effort to extort money from relatives.
A recent article tells the story of a mother receiving a scary phone call where a criminal claimed to have her daughter and impersonated the daughter's voice using AI technology.
Attorney General Ashley Moody said, “As a mother, I cannot imagine the fear that would come from hearing your child’s voice begging for help over the phone and a scammer threatening to kill your child. AI voice cloning poses a grave new threat, as it allows scammers to exploit our deepest fears to deceive and manipulate us. These scams can result in the loss of large amounts of money from people hoping to save a loved one.”
AI voice scams use voice technology to mimic human voices and deceive targets into believing a call is from a trusted individual or organization. Fraudsters can also use this technology to impersonate government agencies, financial institutions, customer support services and victims’ loved ones. Scammers utilize this method to steal personal information or money.
Floridians who stay cautious and withhold information while on suspicious phone calls may avoid falling prey.
Attorney General Moody offers the following tips to avoid falling for AI voice scams:
- Ask personal questions: If a caller purports to be a loved one, ask the person a question only the supposed friend or relative knows;
- Verify a caller’s identity: Never rely solely on caller ID and call a loved one back on a personal line to verify a caller’s identity;
- Be skeptical of urgent requests: Scammers often create a sense of urgency to pressure victims into making hasty decisions. Take time to assess the situation and verify the authenticity of any urgent request; and
- Contact law enforcement: Let authorities know if someone claims to have kidnapped a relative.
Report AI voice scams to local law enforcement, as well as to the FTC.
To view other Consumer Alerts, visit the Consumer Alert page at MyFloridaLegal.com.
Copyright 2023 WFSU.