‘Save Me, Mom’: How To Spot AI Voice Cloning Scams


An elderly woman from Delhi lost Rs 2 lakh after receiving a distressing call in a voice that sounded like her daughter’s, claiming she had been kidnapped

News18

One evening, 65-year-old Radha Devi, a resident of Delhi, received a distressing phone call that left her shaken. The voice on the other end sounded exactly like her daughter’s. Through sobs, the caller cried, “Mom, I’m in trouble… I’ve been kidnapped… Tell Papa to send money, otherwise…”

Panicked and overwhelmed, Radha did not pause to verify. Believing her daughter’s life was at risk, she immediately transferred Rs 2 lakh through UPI.

Hours later, she discovered her daughter was safe at home. The call had been a scam.

Fraudsters had used artificial intelligence (AI) to clone her daughter’s voice, reportedly using just a few seconds of audio available on social media. What felt like a personal nightmare is becoming a disturbing reality across India, as AI voice cloning scams rapidly spread and criminals impersonate family members to emotionally blackmail victims.

AI voice cloning is a technology that can replicate a person’s voice with startling accuracy. Scammers often extract audio clips from videos posted on platforms like Facebook, Instagram or WhatsApp. With advanced tools, a voice can be cloned from as little as three to five seconds of audio.

Using unknown numbers, fraudsters then call targets and spin urgent stories of accidents, kidnappings or arrests, pushing families to send money immediately. Cybercrime experts warn that such AI-enabled fraud could rise by nearly 40% in 2026.

Women and the elderly are among the most frequent targets, as scammers exploit emotional vulnerability. International data reflects the growing scale of the threat. In the United States, people aged 60 and above reportedly lost $4.9 billion to similar scams in 2024.

Incidents involving elderly women have also been reported globally, including a case in Florida where a mother lost $15,000 after hearing what she believed was her daughter’s voice. In another case from Arizona, a woman was convinced her daughter had been kidnapped, only to later learn it was an AI-generated imitation.

Authorities, including the FBI, have warned that the “believability” of AI-driven scams has increased significantly, making them harder to detect. As a result, cyber experts stress the need for greater caution and awareness, particularly among women and senior citizens who may be more likely to respond emotionally in moments of panic.

Safety guide for women and the elderly

1. Create a family code word: A secret word known only to close relatives. If you receive a distress call, ask for the code word. If the caller fails to answer correctly, treat it as a potential scam.

2. Always verify the caller: Disconnect the call and try contacting your loved one directly on their personal number. Never send money based on an emergency call from an unknown number.

3. Try to detect AI responses: Ask unexpected personal questions like, “What did we have for dinner last night?” or “Where did we last meet?” Hesitation or incorrect answers can signal a fake call.

4. Protect your social media privacy: Avoid posting clear voice clips publicly and keep your profiles restricted to prevent misuse of audio content.

5. Never share bank details over the phone: Do not disclose your UPI PIN, OTP or CVV to anyone. If you suspect fraud, immediately report it to the cybercrime helpline at 1930 or inform the police. Caller identification and security apps like Truecaller or McAfee can also help flag suspicious calls.

Experts say such scams not only cause financial loss but also take a serious emotional toll. Many women and elderly people live alone for long hours, making family awareness and regular communication essential. As artificial intelligence becomes smarter and more convincing, staying alert and informed remains the strongest defence.
