Do not fall for deepfake scams. Learn how to identify them here!
Throughout history, technological advances have had a global impact on society. Artificial intelligence in particular offers increasingly sophisticated and accessible tools to users who are ever more informed, connected and demanding.
As a result, the line between reality and deepfakes is blurring at a dizzying pace, making cybersecurity an increasingly important concern both for individuals and for organizations.
Deepfakes make it ever easier to produce near-convincing fake audio and video, which can be used for extortion, fraud, and to bypass the security systems of individuals, public institutions and companies of all kinds around the world.
Audio used for malicious purposes
The criminal potential of deepfakes became clear in 2017, when they were used to damage individuals' reputations through the publication of fake pornographic material.
Today, the popularity of the audio format also poses a threat to the security of citizens and organizations, because cybercriminals can use fake audio to mislead its recipients.
For example, an audio deepfake of a company executive can be used to defraud the company financially: deepfake technology can create highly convincing audio from just small fragments of recorded speech by the person being impersonated.
Likewise, obtaining explicit audiovisual material remains very attractive for this type of crime. Another modus operandi is to record real sexual videos of victims under the pretense of a private video call, and then use the footage to extort them.
How to identify them
As mentioned above, advances in artificial intelligence make it increasingly difficult to tell genuine material from a deepfake. Even so, it helps to keep the following in mind:
• Develop critical thinking.
This means being aware of our own cognitive biases when information reaches us as audio or video. Apply common sense and check the information against original sources before acting on it or drawing conclusions.
• Check the veracity of the sources.
Many audio-deepfake scams happen in real time. Under these circumstances, build the habit of following up on calls that seem suspicious, whether by calling back or by questioning the caller.
• Look for flaws.
This applies specifically to video deepfakes. One of the easiest warning signs is whether the person on screen blinks as often and as naturally as a real human being. So far, many fakes have struggled to get this detail right.
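The blink check above can even be automated. Deepfake-detection research commonly uses the eye aspect ratio (EAR): a ratio of vertical to horizontal eye-landmark distances that drops sharply when the eye closes, so a long stretch of video with no EAR dips suggests the face never blinks. The sketch below is a minimal illustration, not a production detector: the landmark coordinates and the 0.2 threshold are assumptions for demonstration, and a real pipeline would extract the six eye landmarks per frame with a face-landmark library.

```python
import math

def eye_aspect_ratio(eye):
    """Compute the eye aspect ratio (EAR) from six (x, y) eye landmarks.

    Assumes the common 6-point layout from facial-landmark research:
    eye[0] and eye[3] are the horizontal corners, eye[1]/eye[2] the upper
    lid, eye[5]/eye[4] the lower lid. EAR falls sharply when the eye closes.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(ear_per_frame, threshold=0.2):
    """Count blinks as runs of consecutive frames where EAR dips below threshold."""
    blinks, closed = 0, False
    for ear in ear_per_frame:
        if ear < threshold and not closed:
            blinks += 1
            closed = True
        elif ear >= threshold:
            closed = False
    return blinks

# Illustrative (made-up) landmarks for an open and a nearly closed eye.
open_eye   = [(0, 2), (2, 0), (4, 0), (6, 2), (4, 4), (2, 4)]
closed_eye = [(0, 2), (2, 1.8), (4, 1.8), (6, 2), (4, 2.2), (2, 2.2)]

open_ear = eye_aspect_ratio(open_eye)      # well above the 0.2 threshold
closed_ear = eye_aspect_ratio(closed_eye)  # well below the 0.2 threshold
print(count_blinks([open_ear, open_ear, closed_ear, open_ear]))  # prints 1
```

A suspiciously low blink count over, say, a minute of footage (real people blink roughly 15-20 times per minute) would be one signal among several; no single check is conclusive on its own.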