A disturbing new AI-powered scam is sweeping the US, targeting grandparents and families. Scammers use voice cloning technology to imitate a loved one, stage an emergency, and pressure victims into sending money. Criminals need as little as three seconds of original audio to clone a voice, and reports confirm that these AI deepfake calls are alarmingly convincing. AI voice generation is spreading fast and making life easier in many ways, from smart assistants to customer service calls. What fewer people realise, however, is that voice cloning has a dark side that is not only rising sharply but also alarmingly convincing and dangerous.
How AI voice generator scams work
Cybersecurity experts say scammers can clone your voice from just a few seconds of audio left on social media, in public videos, or even in voicemail greetings. According to a McAfee report, a convincing clone can be created from only three seconds of audio. Because so many people, adults and young alike, now share their voice online, the risk is everywhere.
The Modus Operandi
- Collecting voice samples: Scammers search for your voice on TikTok, Instagram, YouTube, or any other platform where you speak.
- Cloning the voice: With free or cheap AI tools, which are widely available and easy to use, they clone your voice in minutes.
- Making the call: The scammer calls your family or friends pretending to be you or someone you know, typically posing as a person in an emergency who urgently needs money or help.
- Exploiting trust: The cloned voice sounds so real that even cautious people can be fooled, especially in stressful situations.
Why are AI voice generator scams so dangerous?
The danger with AI voice scams is that they can fool anyone, not just the elderly; even tech-savvy people, businesses, and young adults are at risk. This is a global problem, with cases reported in the US, India, and many other countries.

The financial impact is huge: in 2025, AI voice scams are expected to cost victims worldwide around 7.76 billion dollars. The figures on losses come from a McAfee survey titled 'The Artificial Imposter', which in April 2023 polled 7,054 people across seven countries, including 1,010 in India. It found that 77 percent of those victimised by AI voice scams lost money, with more than a third losing over 1,000 dollars. In India in particular, 83 percent of victims suffered monetary losses.
The tools needed to create fake voices are cheap, easy to find, and require no special skills, which makes AI voice scams very hard to stop. Scammers may also be located anywhere in the world, making it difficult for authorities to trace them and take legal action. Furthermore, AI technology is advancing rapidly: it can now generate entirely fake voices complete with crying or sounds of panic, realistic enough that many people never doubt them.
Real-life examples include family emergency scams, where the fraudster impersonates a member of the victim's family in distress and pressures the victim to send money urgently. Some pose as police, claiming a relative has been arrested, and use cloned voices that sound convincing. Businesses are targeted as well: scammers use AI-generated voices to masquerade as executives or clients and trick employees into transferring funds or sharing sensitive data.
How to protect yourself from AI voice generator scams?
To protect yourself from AI voice scams, treat any urgent call from a loved one with caution. Do not act on the number that called you; instead, hang up and contact the person on their usual number or through another family member. Limit the voice recordings you share publicly online, because every video or recording you post puts your voice out into the world where it can be copied. Educate your friends and family, especially older relatives who may trust these calls more readily. And stay informed, because AI scams keep evolving and keeping up with them is the best way to safeguard yourself.
Voice AI is a powerful tool, but it is also opening the door to new types of scams that most people are not prepared for. As this technology spreads, awareness and caution are more important than ever to keep yourself and your loved ones from becoming victims of these convincing and costly frauds.