7 out of 10 Indians are unable to identify AI voice call scams, and half fall for scams with monetary losses

  • 69% of Indians cannot distinguish between fake and real voices.
  • Scammers send fake voice messages impersonating friends and family members who appear to be in distress.
  • Read on for tips on identifying AI voice calls and protecting yourself from AI voice call scams.
McAfee has released a report called ‘The Artificial Imposter,’ highlighting how the rise of artificial intelligence (AI) technology is facilitating an increase in online voice scams. The study was conducted in seven countries, including India, and involved 7,054 participants.

The survey shows that a majority of Indians (69%) cannot distinguish between an AI voice and a real voice. Nearly half (47%) of Indian adults have either experienced or know someone who has experienced some sort of AI voice scam, almost twice the global average of 25%. Additionally, 83% of Indian victims reported a monetary loss, with 48% losing over ₹50,000.

Voice cloning scams on the rise in India


According to McAfee, voice cloning scams are rising in India, with cybercriminals using AI technology to manipulate the voices of friends and family members. The report states that 86% of Indian adults share their voice data online at least once a week, making voice cloning an increasingly powerful tool in the hands of scammers. The study found that 69% of Indian adults cannot distinguish between a cloned voice and the real one, making it easier for scammers to trick people into falling for their schemes.

The most common type of voice cloning scam involves scammers sending fake voicemails or voice notes pretending to be someone in distress, or even calling victims' contacts directly. According to the report, two-thirds (66%) of Indian respondents said they would reply to a voicemail or voice note purporting to be from a friend or loved one in need of money, particularly if they believed it came from a parent (46%), partner or spouse (34%), or child (12%). Messages claiming that the sender had been robbed, involved in a car accident, lost their phone or wallet, or needed help while traveling abroad were the most likely to elicit a response.

AI voice scams can be difficult to detect, but there are some signs to look out for:
  1. Urgency: Scammers often create a sense of urgency, claiming they need help immediately.
  2. Request for money: Scammers may ask for money or request that you transfer funds to a particular account.
  3. Unusual behavior: The scammer may claim to be in distress or in an unusual situation.
  4. Changes in the tone of voice: The voice may sound different from the person you know, or the tone may be inconsistent with the situation.
  5. Unusual requests: The scammer may ask for personal information or login credentials.
It is important to stay alert and cautious when receiving unexpected calls or messages, especially if they involve requests for money or personal information. If in doubt, verifying the person's identity by calling them back on a trusted phone number is always a good idea.


Here are some steps you can take to protect yourself from AI voice cloning scams:
  • Be wary of unexpected calls or messages from friends or family members. If you receive a request for money or personal information, take a moment to confirm the caller's identity through another channel of communication, such as texting or video calling.
  • Keep your personal information private. Avoid sharing sensitive details about your life on social media or other public forums that cybercriminals could use to clone your voice.
  • Use strong and unique passwords for all your online accounts, including social media and email. Enable two-factor authentication whenever possible.
  • Keep your antivirus software up to date and apply the latest security patches to your operating system.
  • Be cautious when clicking on links or downloading attachments from unknown sources. These could contain malware that could give cybercriminals access to your device and personal information.
  • Educate yourself on the latest cyber threats and stay up-to-date on recognizing and avoiding them.