AI is exacerbating the problem of robocalling for Americans.
- AI technology can imitate the tone of a familiar voice and engage in real-time conversation with you.
- According to McAfee, roughly 52% of Americans share their voice online, giving scammers the raw material they need to replicate it.
- Scam calls that use a cloned or AI-generated voice are a form of spam known as voice phishing, or "vishing."
The Federal Communications Commission's nearly $300 million fine against a massive transnational robocalling operation underscores how widespread the problem has become; such operations often pressure victims into paying supposed fines and fees with prepaid cards.
If your CEO, spouse, or grandkid is on the phone urgently requesting money to get out of a difficult situation, what should you do?
When AI can imitate a familiar voice and converse in real time, you can no longer trust that the caller is who they claim to be.
Jonathan Nelson, director of product management at telephony analytics and software company Hiya Inc., said the phone system was built on a foundation of trust. "We used to be able to assume that if your phone rang, there was a physical copper wire that could be traced all the way between those two points, and that has vanished," Nelson said. "But the trust that it implied didn't vanish with it."
With roughly a quarter of all calls from numbers outside your contacts reported as spam, people now have to verify a lot of calls before they can trust them.
McAfee's digital security report finds that 52% of Americans share their voice online, handing scammers the key ingredient needed to create a digitally generated version of your voice for "vishing" attacks. A spear-phishing-style attack that was once time-consuming and expensive to mount can now be reproduced cheaply with generative AI, making it far more commonplace.
Steve Grobman, McAfee's chief technology officer, says these calls make up a smaller share of spam than other types, but they put victims in a far more precarious position and make them more likely to act, which is why it is crucial to be prepared.
Spotting AI scams
Preparing for that future depends on a combination of consumer education and the ongoing battle between white-hat and black-hat AI.
Companies such as McAfee and Hiya are detecting AI scam patterns and looking for ways to block them, using historical call patterns that function like a credit history for phone numbers.
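As a rough illustration of the "credit history for phone numbers" idea, here is a minimal, hypothetical sketch of a reputation score built from a number's aggregate call history. The fields, weights, and thresholds are assumptions chosen for illustration only and do not reflect Hiya's or McAfee's actual models.

```python
from dataclasses import dataclass

@dataclass
class CallHistory:
    """Hypothetical aggregate stats for one phone number (illustrative only)."""
    total_calls: int            # calls placed in the lookback window
    answered_calls: int         # calls that were picked up
    spam_reports: int           # user-submitted spam/fraud reports
    avg_call_seconds: float     # average duration of answered calls
    days_since_first_seen: int  # how long the number has been active

def reputation_score(h: CallHistory) -> float:
    """Return a 0-1 score; higher means more trustworthy.

    Toy heuristic: long-lived numbers whose calls get answered and last a
    while look like legitimate callers, while heavy volume, very short
    calls, and spam reports drag the score down.
    """
    if h.total_calls == 0:
        return 0.5  # no history: treat as neutral

    answer_rate = h.answered_calls / h.total_calls
    report_rate = h.spam_reports / h.total_calls
    tenure = min(h.days_since_first_seen / 365, 1.0)   # cap at one year
    duration = min(h.avg_call_seconds / 120, 1.0)      # cap at two minutes

    score = (0.3 * answer_rate + 0.3 * tenure
             + 0.2 * duration + 0.2 * (1 - report_rate))
    # Heavy reporting is a strong negative signal regardless of other factors.
    if report_rate > 0.1:
        score *= 0.5
    return max(0.0, min(1.0, score))

# Example: a brand-new number blasting out short, widely reported calls scores low.
burst_caller = CallHistory(total_calls=5000, answered_calls=400,
                           spam_reports=900, avg_call_seconds=8,
                           days_since_first_seen=3)
print(f"{reputation_score(burst_caller):.2f}")  # prints 0.10
```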
The U.S. federal government led the investigation into the IRS scam (as covered in the 2023 Chameleon podcast), but according to one expert, how it will address the use of AI in robocalling remains unclear.
Kristofor Healey, a former special agent for the Department of Homeland Security, now works as CEO of Black Bear Security Consultants in the private sector. He previously investigated large-scale money laundering organizations and led the team that dismantled the IRS scam, the largest telefraud case in U.S. history.
Healey expects that AI tools used in businesses, including call centers, will drive up the number of cases that reactive systems such as government and law enforcement have to handle.
Educating people about deepfake audio spam calls
Cybercriminals are always taking things to the next level, which makes it difficult for technology to be proactive. Experts say the only truly proactive approach available is business and consumer education: teaching people how to protect themselves and those around them.
To guard against deepfake audio spam calls, businesses should fold the topic into their required employee cybersecurity training, and individuals can protect themselves by being more selective about what they share online. Grobman emphasized that risky sharing can have a greater impact on the people around us than on ourselves: criminals can mine social media posts to build an AI voice clone and then exploit that relationship to target other victims.
As the technology advances, policies around how employees handle calls from unknown numbers and how much personal data they share online may become more common. In the meantime, identity protection and data cleanup services will remain valuable for consumers.
Grobman advises families to establish a secret word or code to verify that the person on the phone really is a loved one, rather than relying on easily guessed details such as a pet's or child's name.
If a caller claims to be from a company, look up the company's contact information yourself and call back directly; that way you know you are dealing with the legitimate organization rather than a scammer. As Grobman emphasized, the crucial step is to validate independently through a reliable source.
Healey acts as a sort of telefraud vigilante, picking up the phone whenever a spam number shows up on his screen. He gives the caller no confirming information, never says who he is, and shares nothing about himself. Instead, he keeps them on the line as long as possible, costing them money while their voice-over-IP lines are tied up with him.
"Preventing harm to others is effectively achieved by keeping them on the phone," stated Healey.
The IRS scam that Healey investigated, and that the podcast Chameleon: Scam Likely chronicled, had tangible consequences for victims: shame, financial insecurity, broken relationships, and even death. Spam calls may sound absurd to the trained ear, but vulnerable people, including the elderly and those in fragile mental states, have been and continue to be taken in.
AI that can mimic the voices of acquaintances, friends, or loved ones makes the game burrow even deeper into the psyche. And at some point, Chameleon notes, it stops being about the money and becomes about the achievement, the adrenaline, and the power. Even so, education about this ever-evolving threat continues, and technology is helping to fight back.