Scammers cloned billionaire Sunil Mittal’s voice with AI to con his executive: Report


Mittal recounted how the executive, stationed in Dubai, received a fraudulent call that mimicked his voice and tone and directed a large fund transfer [File]
| Photo Credit: REUTERS

Scammers often use AI voice clones to con ordinary people into transferring money or disclosing financial information, but it is rare for them to impersonate a billionaire. That is exactly what fraudsters attempted when they cloned telecom czar Sunil Bharti Mittal’s voice to trick one of his executives in Dubai into transferring money.

Fortunately, the executive realised Mittal would not ask for such a large transfer, and the scam was stopped in its tracks.

Speaking at the NDTV World Summit on Monday, Mittal cited the incident to caution people about the risks posed by the misuse of emerging technologies such as AI.

Mittal recounted how the executive, stationed in Dubai, received a fraudulent call that mimicked his voice and tone and directed a large fund transfer.

The official, who was vigilant and “sensible”, immediately realised it was a scam, Mittal said, admitting that when he heard the voice recording himself he was “stunned”, as “it was perfectly articulated, just as I would speak”.

“And anyone who would not have been vigilant may have done something about it,” Mittal said, warning that in future the misuse of technology would enable fraudsters to go a step further, misusing digital signatures and even replicating faces on Zoom calls to perpetrate such acts.

“We’ll have to protect our societies from the evils of AI, and yet we have to use the goodness of AI, because those companies, and nations that will not adopt AI will be left behind. So this is a conundrum for every time you get a new technology into place, there are pluses and minuses. I remain very optimistic about the benefit of AI that the human race will achieve and be able to do jobs which are otherwise very difficult to perform,” Mittal said.

There are multiple instances of fraudsters nudging victims to click on malicious links, or using AI deepfakes and voice cloning for scams.

The sophisticated web of AI-powered deceit makes online scams harder to spot, and fraudsters have been known to clone and mimic a person’s voice from even short audio clips scraped from videos that person may have uploaded online.

Scammers then leverage the AI-cloned voice to pose as the person and demand money from friends and family.

At the same time, scammers are also using the ‘digital arrest’ modus operandi, in which they place audio or video calls, falsely pose as law enforcement officers, and use online intimidation to confine victims to their homes for extortion.

These elaborate and sophisticated scams involve cybercriminals using fake documents and replicating virtual courtrooms or police stations as a backdrop to place victims under ‘digital arrest’.

Recently, SP Oswal, chairman and managing director of Vardhman Group, was defrauded of ₹7 crore by a gang that posed as officials from various government agencies.
