"Human systems" are eminently vulnerable in comparison to computer systems. It has been observed that many cybersecurity issues are due to cognitive bias. Cognitive bias is an error in the thinking process that affects decision-making.
Studies on the subject show that attackers target cognitive biases in combination with technical flaws to exploit a system successfully. According to the Willis Towers Watson Cyber Risk Culture Survey, two-thirds of security breaches are down to human behavior.
Below are a few cognitive biases that can impact an organization's cybersecurity.
Optimism Bias (Illusion of Invulnerability)
The cybersecurity domain demands pessimists, not optimists. By default, the human brain is wired to be optimistic, and we often underestimate the likelihood of adverse events. Optimism bias leads people to assume that bad outcomes are more likely to happen to others than to themselves.
While this outlook is great for everyday life, the opposite is required when dealing with a system, application, incident, email, link, or attachment, to name a few. One misstep is potentially disastrous and can compromise the entire security of an organization.
Overconfidence Effect
The overconfidence effect is observed when people's subjective confidence in their own ability is greater than their objective (actual) performance.
There are reasons to believe that the sinking of the Titanic was largely due to this bias. The overconfidence effect can trick people into making bad decisions and create a false sense of security. Overconfidence in an organization's security posture because of firewalls, antivirus, honeypots, or IDPS, among other defense measures, could ultimately result in severe security breaches.
Ostrich Effect
The ostrich effect is an individual's natural tendency to avoid situations they perceive as negative. The bias is named after the myth that ostriches bury their heads in the sand when they sense danger; this is not factually correct, but it remains very relevant to human behavior. The bias could be catastrophic if it causes information security management to avoid solving problems simply because they do not want to deal with them. Ignoring security recommendations and best practices because of the cost and effort required for implementation is a typical example.
Confirmation Bias
Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or strengthens one's prior beliefs or hypotheses. It can trick the mind into looking only for specific issues in an IT infrastructure, based on one's previous experience and understanding, instead of considering security as a whole. The unwritten rule of InfoSec, "never assume," is the best antidote for this bias.
Hyperbolic Discounting
Hyperbolic discounting is the tendency to prefer smaller, immediate rewards over larger rewards later. Engineers often consider security a roadblock. When programmers, system engineers, or database administrators know the security best practices but choose to ignore them for the sake of meeting timelines, that is hyperbolic discounting at work.
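The trade-off can be made concrete with a toy example. The sketch below (using Python's built-in sqlite3; the function names are hypothetical) contrasts the "faster today" shortcut of string-built SQL with the "safer later" parameterized query that deadline pressure tempts engineers to skip:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # The "quick win": string concatenation ships faster today,
    # but is vulnerable to SQL injection.
    return conn.execute(
        "SELECT role FROM users WHERE name = '" + name + "'"
    ).fetchall()

def find_user_safe(name):
    # The deferred reward: parameterized queries cost a little care now
    # and prevent the injection entirely.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "x' OR '1'='1"
print(find_user_unsafe(payload))  # injection leaks every row
print(find_user_safe(payload))    # no user literally named the payload
```

The discounted "cost" here is one tuple argument; the discounted "reward" is not leaking the whole table.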
Placebo Effect
In medicine, placebos are inert tablets or injections that have no therapeutic value but can create a sense of improved health or even result in the actual cure of the disease. In cybersecurity, a placebo-effect mindset is dangerous because it creates a false sense of security.
Here a placebo could be:
• A firewall without any defined rules
• A vulnerability scanner that cannot identify advanced vulnerabilities
• A security analyst who does not have proper information security skill sets
• Outsourcing enterprise security to the wrong security firm
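The first item on the list can be sketched in code. Below is an illustrative toy packet filter (the class and method names are hypothetical, not a real firewall API): deployed with no rules and a default-allow policy, it is pure placebo, blocking nothing while appearing to be a control.

```python
# Toy packet filter illustrating the "placebo firewall".
class PacketFilter:
    def __init__(self, default_allow=True):
        self.rules = []               # list of (predicate, allow) pairs
        self.default_allow = default_allow

    def add_rule(self, predicate, allow):
        self.rules.append((predicate, allow))

    def permits(self, packet):
        for predicate, allow in self.rules:
            if predicate(packet):
                return allow
        return self.default_allow     # no rule matched: fall through

placebo = PacketFilter()                      # deployed, but empty
print(placebo.permits({"dst_port": 23}))      # telnet sails through

hardened = PacketFilter(default_allow=False)  # deny by default
hardened.add_rule(lambda p: p["dst_port"] == 443, allow=True)
print(hardened.permits({"dst_port": 23}))     # blocked
```

The difference is not the presence of the firewall but the default-deny posture plus explicit rules.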
Parkinson's Law of Triviality (Bikeshedding)
Parkinson's Law of Triviality describes how crucial matters go unattended because more time is spent on unimportant ones. Security monitoring tools with a high rate of false-positive alerts waste analysts' time and can cause the bikeshedding effect. Other examples are devoting more time and effort to less critical assets and wasting time on unnecessary discussions and meetings.
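Some back-of-the-envelope arithmetic shows how quickly noisy tooling turns into bikeshedding. The numbers below are illustrative assumptions, not benchmarks:

```python
# Triage math for a noisy monitoring tool (all figures are assumed values).
alerts_per_day = 500
false_positive_rate = 0.95     # assumed: 95% of alerts are noise
minutes_per_triage = 5         # assumed: average time to dismiss one alert

noise = alerts_per_day * false_positive_rate
real_incidents = alerts_per_day - noise
wasted_hours = noise * minutes_per_triage / 60

print(f"{real_incidents:.0f} real incidents buried in {noise:.0f} false alarms")
print(f"{wasted_hours:.1f} analyst-hours per day spent on noise")
```

Under these assumptions, roughly a week of analyst time per day goes to triaging alarms that matter far less than the handful of real incidents hidden among them.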
Availability Heuristic
This bias is about making decisions based on recent trends or the information that comes to mind first. In the case of an attack, if focus is given only to a specific issue rather than to security as a whole, the availability heuristic is at work. For example, during incidents like Heartbleed, POODLE, and WannaCry, a lot of attention was given to those specific issues, while in reality there could have been other issues that deserved more attention than the trending ones. Though it is crucial to take the necessary steps to prevent trending attacks, it is equally important to continue with other security considerations.
Dunning-Kruger Effect
The Dunning-Kruger effect is a cognitive bias in which individuals believe their ability to be greater than it actually is. Most people think they will never become victims of security incidents, which is far from reality. This overestimation can open the door to phishing, malware infection, spoofing, and vishing, among other attacks. During a phishing simulation exercise, you can see that many people, including security professionals, fall for the bait.
Ambiguity Effect
The ambiguity effect is a cognitive bias in which decision-making is affected by a lack of information: individuals tend to avoid acting on a subject or area they find ambiguous or do not fully understand. The reluctance of system owners to upgrade systems or apply the latest patches, the unwillingness of end users to configure security features, and the reluctance of developers to add new security features to an existing application are all examples of this bias.
How to Overcome the Cognitive Biases That Impact Cybersecurity
The first step in overcoming a cognitive bias is to understand and acknowledge that you have it. Employees need continuous training in information security, and social engineering simulation exercises should be conducted at least annually. It is important to note that security is not a product but a combination of process and technology that ultimately depends on human behavior.
Ensure that your organization has proper plans, procedures, tools, guidance, and training to identify, analyze, monitor, and manage advanced cyber threats. The InfoSec rule of thumb, "never assume," can be applied as an antidote to cognitive bias. The monitoring tools used for cybersecurity management must account for human weakness and provide analysis based on user behavior. Adding User Behavior Analytics (UBA) to your security management will further strengthen your organization's cybersecurity.
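The core idea behind UBA can be sketched in a few lines: baseline a user's normal behavior, then flag sharp deviations. The toy below baselines typical login hours with a mean and standard deviation; real UBA products model far richer signals, and the function names here are hypothetical.

```python
# Minimal sketch of the idea behind User Behavior Analytics (illustrative toy).
from statistics import mean, stdev

def build_baseline(login_hours):
    # Summarize a user's historical login hours as (mean, std deviation).
    return mean(login_hours), stdev(login_hours)

def is_anomalous(hour, baseline, threshold=3.0):
    # Flag a login that falls more than `threshold` deviations from the mean.
    mu, sigma = baseline
    return abs(hour - mu) > threshold * sigma

history = [9, 9, 10, 8, 9, 10, 9, 8]   # usual office-hours logins
baseline = build_baseline(history)

print(is_anomalous(9, baseline))   # a 9 a.m. login looks normal
print(is_anomalous(3, baseline))   # a 3 a.m. login gets flagged
```

A flagged login is not proof of compromise, only a prompt for human review, which is exactly where it counters the biases above: the tool surfaces what the analyst's intuition would discount.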
Paladion's Managed UBA, part of its Managed Detection & Response (MDR) service, identifies risky behavior users may be about to engage in accidentally, as well as unauthorized use of legitimate user credentials that have been stolen or inappropriately shared.
If you are interested in learning how to bring these defenses to your technology organization, reach out to Paladion today.