The Assault on Truth

Author: Diana L. Burley, Ph.D.
Date Published: 1 March 2021

Truth. Fact—a statement in accordance with reality.1 For years, there has been an assault on truth, whether through the promotion of “alternative facts,” the normalization of seemingly insignificant lies, the sharing of half-truths that mischaracterize reality or the confounding of fact and opinion. The slow, methodical assault on truth poses a clear and present threat to society. It destabilizes systems, encumbers action, erodes trust and hinders our ability to adjudicate markers of reality.

These vulnerabilities can manifest in physical and virtual spaces, and while attention is often focused primarily on the security measures necessary to prevent damage to physical property, the progressive vulnerability of our digital systems is also of great concern. To be sure, cybersecurity professionals will continue working to assure the confidentiality, integrity and availability of systems and the data they house. The essence or quality of cybersecurity practice is not the subject of this narrative. Rather, the focus here is on the impact that the assault on truth has on perceptions of system integrity (as a component of cybersecurity) and, thus, on the practice of cybersecurity professionals.

Perception frames mindset and forms the basis of an individual’s interpretation of people, places, things and events. It shapes the lens through which we experience reality and, while perception is not reality, perceptions do seem real and motivate action. The assault on truth is designed to alter perceptions of reality and, when successful, disinformation campaigns widen the gap between perception and reality. 

What Does This Mean for Assuring System Integrity?

Assuring system integrity assumes shared belief in a singular reality—the state where data are complete and free from unauthorized modifications. This basic assumption is weakened in the face of massive disinformation campaigns. Instead of working from a singular reality, lenses narrow and angle individuals toward multiple realities. Over time and within echo chambers on and offline, targeted communications hone the skewed perceptions of reality and weaken the ability of individuals to adjudicate objective truth. Trust is eroded, data integrity is questioned and belief in system integrity grows increasingly unstable as the systems are perceived to be insecure regardless of the actual state of cybersecurity. This scenario is very clearly playing out in the public space as questions surrounding the integrity of election systems continue to swirl despite assurances from agencies such as the US Cybersecurity and Infrastructure Security Agency. The agency is tasked with securing these systems and has asserted that “[t]here is no evidence that any voting system deleted or lost votes, changed votes or was in any way compromised.”2 

In this context, we must consider the following implications for cybersecurity professionals:

  • What will be the impact on cybersecurity practice if the gap between reality and perception continues to widen? 
  • How do we adjudicate markers of reality and how do we aid the adjudication process, particularly in light of technological advances that exacerbate challenges to integrity (e.g., deep fakes)?
  • Should categories of adversaries include purveyors of disinformation and, if so, how might this expanded adversarial context alter approaches to cybersecurity? 
  • What, if any, responsibility do cybersecurity professionals have for shaping or reshaping perceptions of system integrity and in reducing the gap between reality and perception?

A single, verifiable reality is foundational to assuring the integrity of systems and the data they house. The assault on truth challenges this assumption and forces attention on both actual and perceived system security. In the current climate, the question is not if this assault will continue. Rather, it is how cybersecurity professionals will respond.

Endnotes 

1 Merriam-Webster, “Truth”
2 US Cybersecurity and Infrastructure Security Agency, “Joint Statement From Elections Infrastructure Government Coordinating Council & the Election Infrastructure Sector Coordinating Executive Committees,” 12 November 2020, USA

Diana L. Burley, Ph.D.

Is vice provost for research at American University (AU) (Washington DC, USA), where she is also professor of public administration and policy and professor of IT and analytics. Named one of SC Magazine’s 8 Women in IT Security to Watch in 2017 and a Woman of Influence by the Executive Women’s Forum, she regularly conducts cybersecurity training for executives across Asia, Europe, the Middle East and North America. She is a member of the US National Academies of Sciences, Engineering, and Medicine Board on Human-Systems Integration and an affiliated researcher with the Johns Hopkins University Applied Physics Laboratory (Baltimore, Maryland, USA). Burley previously directed the Institute for Information Infrastructure Protection (I3P) at the George Washington University (Washington DC, USA) and led the CyberCorps program for the US federal government.