More Bad News About Cyber

Author: Jack Freund, Ph.D., CISA, CISM, CRISC, CGEIT, CDPSE, Chief Risk Officer, Kovrr
Date Published: 8 December 2021

If you clicked on this column because its title conveyed a negative tone, you are not alone. It is well known that negativity drives user interaction in online environments. Recent reports show that at one time, Facebook’s news feed ranking algorithm weighted angry reactions five times as heavily as regular “like” reactions. Not all news aggregator services use this method of prioritization, but a recent episode of the Freakonomics podcast emphasized the role that negativity plays in the success of news outlets generally (spoiler: It is a big one). It would be naive to think that this bias in reporting does not affect cybersecurity news.

Negativity drives us to be more aware and ready for action. For those on the frontlines of cybersecurity, this is a natural state of being. It can be emotionally exhausting to be alert so frequently (renowned public-interest technologist Bruce Schneier has written about this phenomenon, known as “Code Yellow,” and I have authored a response). Being aware is a necessary first step, and consuming negative news provides the opportunity to better understand what bad things could happen so that one can prepare accordingly. In many ways, this is why the month of October is dedicated to cybersecurity awareness.

However, what is lost in this constant state of awareness is the understanding that, statistically, few enterprises are breached because most days the security team does its job well enough to block attacks. The Cyentia Institute's Information Risk Insights Study (IRIS) 20/20 report shows that approximately 25% of Fortune 1000 organizations have had a breach in the past several years. If one widens the scope to account for all organizations, including those that are not a part of the Fortune 1000, this number drops precipitously to as low as 2%. Admittedly, using the terms “few” and “most” to describe the 75% of organizations that are breach-free could be considered overly optimistic. Twenty-five percent is still a fairly large number (1 in 4 enterprises will have a breach!), but when working within the realm of cyberrisk quantification (CRQ) it is important to understand these numbers and be able to cast them appropriately for the audience to whom information is being presented.
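One way to cast a base rate for different audiences is to compound it over a planning horizon. The sketch below is illustrative only: it treats 25% and 2% as assumed annual breach probabilities (the IRIS figures cover a multi-year window, so these inputs are simplifying assumptions) and assumes years are independent.

```python
def prob_breach_over_horizon(annual_prob: float, years: int) -> float:
    """Probability of at least one breach over `years` years,
    assuming a constant annual probability and independent years."""
    return 1 - (1 - annual_prob) ** years

# Casting the same numbers for a board-level audience:
# a high annual rate compounds quickly over a planning horizon,
# while a low one stays modest.
print(round(prob_breach_over_horizon(0.25, 5), 3))  # 0.763
print(round(prob_breach_over_horizon(0.02, 5), 3))  # 0.096
```

The same arithmetic explains why “1 in 4 enterprises” can sound alarming or reassuring depending on the time frame over which it is presented.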

CRQ requires data about frequency and losses. For some, this means polling the news media to get information about what is happening, how often and with how much loss. Sometimes these data are gleaned from industry associations, past experiences or relationships with people employed by other organizations. When accumulating data in this way for the purpose of building quantitative inputs, it is important to be aware of 2 biases that may come into play:

  1. Recency bias—This is the tendency to give preference to more recent events over those with a longer historical record. It is critical to pay attention to recency bias when building quantitative risk estimates, as a focus on recent events can lead one to overemphasize something that is historically rare. However, the trouble with applying this to cybersecurity is the relative newness of the field and of the types of events that will be experienced. For instance, if cyberevent frequency were cast over the last 100 years, such events would seem very rare. Obviously, the computing industry did not exist 100 years ago, so this framing is unhelpful. Conversely, recency bias can be useful, especially in light of the sharp increase in ransomware events since 2020 as the COVID-19 pandemic has played out.
  2. Availability bias—Here the inverse problem can be observed: Just because there are news stories about the data breaches that are reported does not mean that other events are not also occurring. I often talk about the data availability problems with cyberincidents. Mandatory disclosure laws have greatly improved the visibility of data breach events. But what about events that affect availability, integrity or fraud? The lack of mandatory disclosure around these kinds of events makes it difficult to accurately forecast input ranges for CRQ analyses. Uncovering better data requires organizational agreements with insurance carriers and brokers to reveal the base rates for these types of events. As a result, it is increasingly difficult for individual organizations to build defensible CRQ results based on individually curated industry data.
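The frequency-and-loss framing above can be sketched as a small Monte Carlo simulation, a common structure in CRQ work: sample how many loss events occur in a year, then sample a severity for each. The Poisson rate and lognormal severity parameters here are illustrative assumptions, not figures from this column or from IRIS.

```python
import math
import random

def sample_poisson(rng: random.Random, lam: float) -> int:
    """Knuth's Poisson sampler; adequate for small rates like these."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_annual_losses(freq_lambda: float, sev_mu: float, sev_sigma: float,
                           trials: int = 10_000, seed: int = 1) -> list[float]:
    """Annual loss = (Poisson event count) x (lognormal severity per event)."""
    rng = random.Random(seed)
    return [
        sum(rng.lognormvariate(sev_mu, sev_sigma)
            for _ in range(sample_poisson(rng, freq_lambda)))
        for _ in range(trials)
    ]

# Illustrative inputs: ~0.3 events/year, median severity ~e^12 (~$163K).
losses = simulate_annual_losses(0.3, 12.0, 1.5)
ale = sum(losses) / len(losses)                      # average annual loss
p_any = sum(l > 0 for l in losses) / len(losses)     # chance of >=1 event
```

The forecast is only as defensible as the frequency and severity inputs, which is precisely where the recency and availability biases above distort individually curated estimates.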

In the end, negative cybersecurity news is neither good nor bad. It simply must be assessed through a well-informed lens of bias filtering. Society will never get to a place where news articles will commend security organizations for their 100th consecutive day without a data breach. Indeed, as the statistics show, this happens far too often to be considered newsworthy. Those wins should certainly be celebrated internally. However, to gain better insight into the quantitative values that truly apply to an organization, it is necessary to partner with organizations that are able to provide data sets that are inaccessible to those simply scanning the headlines.

Jack Freund, Ph.D., CISA, CISM, CRISC, CGEIT, CDPSE, is vice president and head of cyberrisk methodology for BitSight, coauthor of Measuring and Managing Information Risk and a 2016 inductee into the Cybersecurity Canon. He is also an ISSA Distinguished Fellow, a FAIR Institute Fellow and an IAPP Fellow of Information Privacy. Freund was the recipient of an (ISC)2 2020 Global Achievement Award and ISACA’s 2018 John W. Lainhart IV Common Body of Knowledge Award.