Criminals are taking advantage of artificial intelligence in an impersonation scam

AI capabilities allow crooks to impersonate anyone with just a few seconds of audio.

New technologies offer great possibilities to users who want to make good use of them. Tools like ChatGPT, for example, are a good starting point for people who want to write essays, write code, or learn about different topics. However, the rise of artificial intelligence has also led to an increase in scams that exploit the technology's strengths to fake the voices of loved ones and extract money from victims.

According to the report, a Canadian couple in their 70s received a call a few days ago from "their grandson." After telling them he was in jail, he asked them to post bail of 6,000 Canadian dollars (a little more than €4,100) so he could get out. The couple withdrew $3,000 from one bank and went to do the same at another, but luckily the manager there told them it was an increasingly common scam and that the bank had already had to stop similar attempts against other customers.

Artificial intelligence speeds up the fraud process

As the same newspaper reports, more than 36,000 people a year report being scammed by someone posing as a friend or family member. Of these, more than 5,100 say it happened over the phone, a figure that could grow given what artificial intelligence now makes possible. And, as we told you a few months ago, tools like Microsoft's new AI can reproduce any human voice from a three-second sample, something that has troubled users because of the identity-theft problems the practice can lead to.

Unfortunately, AI has helped scammers speed up the process of carrying out these cons. Years ago, they needed several hours and multiple audio tracks to get a convincing result. Today, with artificial intelligence, they can convincingly impersonate whoever they want in a few minutes with a sample of just a few seconds. And so they have a new scam method to add to already popular ones such as WhatsApp scams and fake messages impersonating big companies.



Main image by Alexander Mils (Unsplash)
