Ransomware

54% of cyberattacks reported globally in 2020 involved ransomware: malware or malicious messages that reach our devices and, once a computer has been compromised, hijack its information and demand payment of a ransom to restore the data and avoid further collateral damage.

90% of these attacks begin with a user clicking on a link that contains malicious code. According to an EY study, more than 90% of cybersecurity incidents originate in human error, largely because cybercriminals find it easy to exploit human vulnerabilities by understanding how cognitive biases, a largely unconscious psychological phenomenon, work.

According to various studies, people make an average of about 35,000 decisions a day, of which only 91 are conscious. Our brain handles the rest using mental shortcuts, or cognitive biases, including the decision of whether or not to click on a malicious link in an email we receive.

Cognitive biases are part of human nature, so we cannot eliminate them from our teams and organizations, but we can try to keep them in check. As Anthony Fernandes, hacker and cybersecurity communicator, puts it: “when a criminal group selects a target, its cyber intelligence investigation of the company has already profiled its employees and studied their working methods to determine, among other things, how to maximize the chances of success”.

The biases that condition our brain and open the door to ransomware

Knowing how our brain works and which biases it is most vulnerable to opens a new dimension in understanding and shaping behavior. In that sense, the first Study on Cognitive Biases and Ransomware, published by Aiwin, has identified more than 30 specific cognitive biases, showing that, as part of an organization’s cybersecurity culture, “thinking before you click” is not as simple as reminding employees over and over again.

Here are some of them:

Illusory truth effect: Our brain finds it easier to process information it has encountered before. That familiarity can lead us to misinterpret a message as true content. Cybercriminals exploit it in phishing campaigns that play on the principles of cooperation, reciprocity and trust.

Selective perception bias. It occurs when a person receives information and, depending on their expectations, automatically focuses on one element and ignores the rest so as not to become overloaded. Once it is triggered, the victim can fall for practically any social engineering technique.

Bandwagon effect. It occurs when the brain makes decisions based on group emotions and impulses. For example, it is activated when we follow what our colleagues are doing, assuming it is safe or useful. When someone sends a link to a work chat and several people respond to it, the fear of missing out and being left out can push us to click the link.

Automation bias. It arises when our brain trusts information provided by an automated system more than information from a non-automated source, such as information gathered by a person, even when the latter is correct. With it, we can fall prey to almost any social engineering technique, but especially those that exploit the principle of urgency.

Optimism bias, or the illusion of invulnerability. The human brain is wired to be generally optimistic and often underestimates the likelihood of adverse events. An example of this bias at work is an employee thinking, “The company will never be harmed by a click I make on an email.”

To help organizations address issues such as cognitive biases, Aiwin has developed Aiwin Firewall, a platform that automates the building of a cybersecurity culture.
