AI-Powered Voice Scams Are On The Rise: Here Is How You Can Stay Safe!

McAfee Report Suggests That India Tops The List For AI-Powered Voice Scams At 83%

With the skyrocketing popularity of Artificial Intelligence, manipulating the images, videos and voices of family and friends has become a cakewalk. Cybercriminals are using AI-generated voices to target unsuspecting individuals, and India tops the list of victims, with 83% of affected respondents reporting that they lost money to AI voice scams. Approximately 48% of these victims lost more than Rs 50,000.

Scammers deploy AI tools to mimic familiar voices, such as those of friends and family members in distress, to coax people into lending money. A McAfee report suggests that over 69% of Indians find it challenging to distinguish between AI-generated and authentic voices.

Additionally, the report, titled ‘The Artificial Imposter’, revealed that the number of Indian adults who know someone who has faced a voice scam stands at an alarming 47%, almost double the global average of 25%.

The report highlighted that online voice scams are being accelerated by the rise of AI tools that can clone a voice from less than three seconds of audio. The survey had 7,054 participants from seven countries, including India.

“Artificial intelligence unlocks countless opportunities, yet there is also an underlying threat of it being misused by the wrong people. The accessibility and ease of using AI technology have aided cybercriminals in duping people out of large sums of money in several unimaginable ways,” said McAfee CTO Steve Grobman.

The Dangers Of AI Voice Cloning

Given that everyone’s voice is distinct, it would not be wrong to say that voices are the equivalent of biometric fingerprints. However, 86% of Indians share voice recordings on social media platforms, making voice cloning easy and handing cybercriminals a potent tool.

The McAfee report also revealed that around 66% of Indian respondents said they would reply to voice notes from acquaintances needing money. Breaking this number down, 46% would respond to a voice sounding like a parent, 34% to a partner or spouse, and 12% to a child.


The voice notes most likely to elicit a response were those in which the sender claimed to have been robbed (70%), suffered a car accident (69%), lost their phone or wallet (65%) or needed financial help while travelling abroad.

The spread of deepfakes and misinformation has made people wary of the authenticity of online content. The report revealed that 25% of Indians no longer trust social media platforms, and 43% were apprehensive about the surge in disinformation and misinformation.

About The Author

Kumkum Pattnaik
Kumkum's unparalleled love for gadgets is what drives her to research, scrutinize and pen down tech-related content from every corner of the world. Whether it is getting her hands on the latest electronic devices or reading voraciously to find what tech mammoths are up to, she makes sure that her inventory is up-to-date.