Designing Ethical Systems By Auditing Ethics

Author: Josh P. Scarpino, D.SC., CISM
Date Published: 3 July 2023

Ensuring that any emerging technology implemented within an organization honors inalienable human rights is nonnegotiable. The question remains: How can ethics be audited when they involve so many perspectives, each shaped by different cultural viewpoints? Ethics are generally defined as sets of beliefs about what behaviors are considered acceptable and unacceptable within a society.1 But the opinions of individuals in one region of the world may differ vastly from those in another. Furthermore, asking organizations to audit their technology implementations through an ethical lens seems contrary to what drives an organization’s bottom line and stakeholder value, especially when organizations are asked to document the results. When organizations fail to embed ethics as a foundational requirement within their development processes and are found to be violating foundational ethical beliefs ingrained in society, they can face reputational or financial harm, potential litigation, or regulatory consequences.2 This can directly erode customer and employee trust in the organization and may offset any benefits gained by circumventing processes. It is critical that organizations understand and embed ethics when developing and implementing emerging technologies, validating their responsible technology processes throughout the life cycle. Furthermore, they must audit the implemented processes to ensure that ethical values are embedded throughout.

The Ethical Dilemma

Although there is general consensus on what norms are accepted in society, no single ethical perspective serves as the standard for how organizations should act or define their ethical or responsible technology programs. These foundational perspectives inform and determine the ethical lens used when developing technologies. Opinions about what is ethical can shift significantly between communities within society.3 Ethics can also evolve, making it critical to establish processes that address these changing perspectives.

It has been noted that “Culture has a significant impact when determining the foundational values to be leveraged when technology systems are deployed.”4 In addition, business drivers impact ethical decisions and require organizations to make tradeoffs between accuracy, fairness and ethics as perceived by society.5 Unless an organization is firmly grounded in its ethics program, ethics becomes a priority only when the organization is at risk of becoming a case study or facing regulatory impacts.6 Organizations must do better to ensure that technology honors inalienable human rights and does not cause an unnecessary disparate impact on members of society.


The Institute of Electrical and Electronics Engineers (IEEE) Global Initiative on Ethics of Autonomous and Intelligent Systems states:

Whether our ethical practices are Western (Aristotelian, Kantian), Eastern (Shinto, Confucian), African (Ubuntu), or from a different tradition, by creating autonomous and intelligent systems that explicitly honor inalienable human rights and the beneficial values of their users, we can prioritize the increase of human well-being as our metric for progress in the algorithmic age.7

Varied Approaches to Addressing Ethical Concerns

When it comes to emerging technology, there is no clear alignment or standard approach to embedding ethics. Organizations are focused on delivering products to market and may deprioritize conversations that could delay time to value. Instead, organizations must align with social and moral norms to guide their approach to ethical technology development.8 There are, however, several approaches to managing ethics for emerging technologies. One is to ignore ethics to avoid stifling innovation, in the hope that the benefits ultimately outweigh any negative consequences. Another is to address ethical concerns as they are realized. The last is to attempt to predict ethical challenges and understand how they can affect emerging technology.

One author’s proposed approach aimed at predicting ethical challenges is anticipatory technology ethics.9 This approach considers three levels of ethical analysis: the technology, the artifact and the application. At the technology level, the review focuses on the technology itself and considers the ethical issues associated with its components. At the artifact level, the analysis focuses on the products that result, or could result, from the technology and the moral issues they present; these issues attach to the artifact itself, whether because of its potential applications, its inherent risk or its need for ethical justification. At the application level, the focus is on how the artifacts are used, the procedures followed and their potential configurations. Although defining a shared view of ethics and how it applies to emerging technologies is difficult, it is necessary to find a path forward and embed ethics into all development processes.10 One way to ensure that ethical values continuously evolve and remain embedded is to ensure that developed processes align with the organization’s code of conduct and ethics.
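
To make these three levels concrete, the following is a minimal, illustrative sketch in Python of how a review team might record findings at each level. The class and field names are assumptions introduced here for illustration; they are not part of the anticipatory technology ethics framework itself.

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import Optional

    class ReviewLevel(Enum):
        """The three levels of analysis described above."""
        TECHNOLOGY = "technology"    # the technology and its components
        ARTIFACT = "artifact"        # products that result, or could result, from the technology
        APPLICATION = "application"  # how artifacts are used, the procedures followed, their configuration

    @dataclass
    class EthicalFinding:
        """A single issue identified during the review (hypothetical structure)."""
        level: ReviewLevel
        description: str
        affected_groups: list[str]
        mitigation: Optional[str] = None

    @dataclass
    class AnticipatoryEthicsReview:
        """Collects findings for one system across all three levels."""
        system_name: str
        findings: list[EthicalFinding] = field(default_factory=list)

        def unresolved(self, level: ReviewLevel) -> list[EthicalFinding]:
            """Return findings at the given level that still lack a mitigation."""
            return [f for f in self.findings if f.level == level and f.mitigation is None]

An audit could then confirm, for example, that no artifact-level finding remains unresolved before a deployment proceeds.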

Auditing the Process, Not the Application

Ethics audits ensure that an organization’s behaviors align with its code of conduct and ethics.11 A code of ethics is a set of rules dictating what is considered acceptable and unacceptable behavior within an organization. The US Securities and Exchange Commission (SEC) defines a code of conduct as a means to deter wrongdoing and to promote, among other things, honest and ethical conduct.12

An ethics audit begins with analyzing the organization’s mission, vision and value statements, code of ethics, and supporting documentation. Interviews with, or questionnaires completed by, board members, staff and volunteers help identify any gaps. The audit concludes with a report, and follow-up activities may include developing an educational program and monitoring to address the gaps identified.13, 14
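
A minimal sketch of that sequence, with phase names assumed here for illustration rather than drawn from the cited guidance, might look like the following.

    from enum import Enum, auto
    from typing import Optional

    class EthicsAuditPhase(Enum):
        """Illustrative phases of an ethics audit, in the order described above."""
        DOCUMENT_ANALYSIS = auto()  # mission, vision, value statements, code of ethics
        INTERVIEWS = auto()         # board members, staff and volunteers (or questionnaires)
        GAP_REPORT = auto()         # documented findings and gaps
        FOLLOW_UP = auto()          # education program and ongoing monitoring

    def next_phase(completed: set) -> Optional[EthicsAuditPhase]:
        """Return the earliest phase not yet completed, or None when the audit is done."""
        for phase in EthicsAuditPhase:
            if phase not in completed:
                return phase
        return None

    # Example: after document analysis and interviews, the gap report is the next step.
    print(next_phase({EthicsAuditPhase.DOCUMENT_ANALYSIS, EthicsAuditPhase.INTERVIEWS}))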

However, auditing ethics is challenging because views can change based on the individual beliefs and societal backgrounds of those involved in the process. In addition, organizations have different moral frameworks, which are sets of reasonable and coherent moral beliefs and principles that distinguish a group of people or a culture, or common and accepted values in society.15 An organization’s moral framework results from how it positions itself within its industry, the core values it defines and the unique perspectives of its employees.

Because there is societal disagreement on many common ethical challenges, organizations cannot leverage just one moral framework to audit or ensure the compliance of emerging technologies. Instead, they must understand the process that addresses ethics and ensure that appropriate controls have been implemented. It is critical to ensure that ethical evaluations and checkpoints are a fundamental component of the technology development process.


Understanding and examining the impact of emerging technologies, the cultural values of those developing a system and the values of those for whom the system is deployed are core components of ensuring ethical technology deployments.16 Considering why the system was designed and whom it will impact should be at the forefront for all organizations reviewing their deployments. In addition, as these systems are deployed, ethical reviews should be performed and inalienable human rights explicitly honored.17 Deploying technologies that have a disparate impact on individuals or portions of society should cause organizations to pause and evaluate the true purpose of and need for the technology. To evaluate the technology appropriately, organizations must maintain diverse and varied viewpoints when implementing artificial intelligence (AI) and machine learning (ML) and ensure that all potential risk factors are analyzed throughout the deployment life cycle.18 Including stakeholders from different backgrounds who offer varied cultural perspectives in the design, testing and monitoring processes is critical to ensuring responsible technology. Individuals must be responsible for confirming that emerging technology deployments follow an ethical development life cycle, which includes ensuring that ethical implications are assessed before implementation and validated after deployment.19

Ethical Audit Considerations for Emerging Technologies

As a foundational requirement and to promote transparency and trust in results, organizations should have their programs assessed by an independent auditor after adoption. This helps remove the potential for bias and increases public confidence in the process. An ethical process audit should include the following (a minimal sketch of how these checks might be tracked follows the list):

  • Ensuring that ethical audits are part of the organization’s standard process and follow a standard audit format, starting with an analysis of artifacts within the organization. Analyzing these documents facilitates an understanding of how the organization approaches ethical decisions, its practices for managing ethics and the expected outcomes.
  • Assessing the individuals involved in the ethical review process to determine whether they come from varied backgrounds and education levels and offer unique perspectives. Does the review team include individuals with varied perspectives spanning different ethical lenses, such as the rights lens, the justice lens, the utilitarian lens, the common good lens, the virtue lens and the care ethics lens?20 The rights lens holds that the ethical action is the one that best protects the rights of those affected. The justice lens takes the perspective that each individual should be afforded fair or equal treatment. The utilitarian lens weighs the consequences of an action and how it will affect everyone involved. The common good lens considers that life in community is itself a good and that actions should contribute to it, highlighting concern for all members of a community. The virtue lens asks whether ethical actions align with the ideal virtues of humanity. The care ethics lens focuses on the needs of and impact on each individual and their specific circumstances.
  • Ensuring that the individuals who are part of the ethical process understand the technology deployment they are analyzing, including the purpose and intended outcomes, and who is impacted by the system. Understanding who will be impacted by the system can change the ethical perspective. For example, deployment of a system that targets adults has a completely different set of requirements than a system that targets minors.
  • Determining if the ethical process includes documented artifacts for the ethical review, decisions and outcomes. Documentation and tracking of the ethical review processes are critical to ensuring accountability and promoting transparency in the process.
  • Monitoring the outcomes of the systems analyzed to ensure that they do not deviate from the anticipated outcomes. Failure to validate and monitor for compliance with expected outcomes can allow impacts to go undetected.
  • Ensuring that a documented feedback loop is present so that lessons learned and external feedback are reingested into the ethical process.
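
As referenced above, the following is a minimal, illustrative sketch of how an auditor might track whether each of these elements is present in an organization’s ethical review process. The control names and the simple evidence map are assumptions introduced here, not a published control catalog.

    # Illustrative control list mirroring the items above; names are assumptions, not a standard.
    REQUIRED_CONTROLS = [
        "ethical_audits_in_standard_process",
        "organizational_artifacts_analyzed",
        "review_team_has_varied_backgrounds_and_lenses",
        "reviewers_understand_purpose_and_impacted_groups",
        "review_decisions_and_outcomes_documented",
        "deployed_outcomes_monitored_against_expectations",
        "feedback_loop_documented",
    ]

    def find_control_gaps(evidence: dict) -> list:
        """Return the controls for which no supporting evidence was recorded."""
        return [control for control in REQUIRED_CONTROLS if not evidence.get(control, False)]

    # Example: evidence gathered during an audit; monitoring and the feedback loop are missing.
    gaps = find_control_gaps({
        "ethical_audits_in_standard_process": True,
        "organizational_artifacts_analyzed": True,
        "review_team_has_varied_backgrounds_and_lenses": True,
        "reviewers_understand_purpose_and_impacted_groups": True,
        "review_decisions_and_outcomes_documented": True,
    })
    print(gaps)  # ['deployed_outcomes_monitored_against_expectations', 'feedback_loop_documented']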

Conclusion

Ensuring that technologies are designed and deployed ethically can bring value to organizations, drive profit and prioritize inalienable human rights. It is critical that organizations remember that “there are trends in the industry that need to be addressed, favoring operational goals and efficiency over ethics will no longer be acceptable.”21 Although audits can provide only a snapshot of an organization’s current approach to its ethics process, they offer external validation of a well-defined and implemented ethics or responsible technology program. A firm understanding of the ethical issues present can help stakeholders reflect on potential technology outcomes and any deviation from the expected trajectory. Organizations must understand the context of a system’s use and how ethical issues could evolve so that society can trust how these systems are deployed.22 Senior leadership is responsible for setting the tone for how ethics are addressed and handled within the organization.

Promoting transparency in ethical processes, establishing standard practices, and ensuring appropriate oversight and stakeholder involvement are critical. Auditing has always been a crucial step in ensuring that organizations adhere to their core processes and regulatory requirements. When organizations leverage audit to validate processes that can impact the rapidly expanding field of emerging technology, they can ensure that risk and ethical concerns are evaluated earlier in the process. These audits can reveal how well or poorly organizations align with expected standards and give society confidence in the organization’s solutions and ethical approach. When organizations perform audits that review their ethical processes and the systems those processes affect, they can provide assurance that they have embedded audit findings into how they do business. Furthermore, this validates the importance the organization has placed on developing responsible technology. Organizations must audit their responsible technology programs to confirm that these foundational processes exist, to ensure consistency and alignment with their ethical values, and to ensure that there is not a disparate impact on individuals or society.

Endnotes

1 Scarpino, J. P.; “An Exploratory Study: Implications of Machine Learning and Artificial Intelligence in Risk Management,” Marymount University, Arlington, Virginia, USA, 2022
2 Cheatham, B.; K. Javanmardian; H. Samandari; “Confronting the Risks of Artificial Intelligence,” McKinsey Quarterly, 26 April 2019, https://www.mckinsey.com/capabilities/quantumblack/our-insights/confronting-the-risks-of-artificial-intelligence
3 Chaput, R.; J. Duval; O. Boissier; M. Guillermin; S. Hassas; “A Multi-Agent Approach to Combining Reasoning and Learning for an Ethical Behavior,” Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society, July 2021, https://doi.org/10.1145/3461702.3462515
4 Op cit Scarpino
5 Lo Piano, S.; “Ethical Principles in Machine Learning and Artificial Intelligence: Cases From the Field and Possible Ways Forward,” Humanities and Social Sciences Communications, vol. 7, iss. 1, 2020, https://doi.org/10.1057/s41599-020-0501-9
6 Op cit Scarpino
7 Shahriari, K.; M. Shahriari; “Ethically Aligned Design: A Vision for Prioritizing Human Well-Being With Autonomous and Intelligent Systems (A/IS),” 2017 IEEE Canada International Humanitarian Technology Conference (IHTC), Canada, 2017, p. 197–201, https://ieeexplore.ieee.org/document/8058187
8 Trunk, A.; H. Birkel; E. Hartmann; “On the Current State of Combining Human and Artificial Intelligence for Strategic Organizational Decision Making,” Business Research, vol. 13, iss. 3, 2020, p. 875–919, https://doi.org/10.1007/s40685-020-00133-x
9 Brey, P.; “Anticipatory Technology Ethics for Emerging IT,” CEPE 2011: Crossing Boundaries, University of Wisconsin Milwaukee, USA, 2011, p. 13–26
10 Manyika, J.; J. Silberg; B. Presten; “What Do We Do About the Biases in AI?” Harvard Business Review, 25 October 2019, https://hbr.org/2019/10/what-do-we-do-about-the-biases-in-ai
11 Krell, E.; “How to Conduct an Ethics Audit,” HR Magazine, vol. 55, iss. 4, 2010, p. 48–51, https://www.proquest.com/trade-journals/how-conduct-ethics-audit/docview/205070994/se-2
12 US Securities and Exchange Commission, “Code of Business Conduct and Ethics,” https://www.sec.gov/Archives/edgar/data/1094007/000119312504044901/dex14.htm
13 Allen, M. B.; “The Ethics Audit,” Nonprofit World, vol. 13, 1995, p. 51, https://www.proquest.com/magazines/ethics-audit/docview/221335917/se-2
14 Hofmann, P. B.; “Performing an Ethics Audit,” Healthcare Executive, vol. 10, iss. 6, 1995, p. 47, https://www.proquest.com/trade-journals/performing-ethics-audit/docview/200314221/se-2
15 Frederick, R.; A Companion to Business Ethics, Wiley-Blackwell, USA, 1999
16 Op cit Scarpino
17 Ibid.
18 Ibid.
19 Scarpino, J.; “Evaluating Ethical Challenges in AI and ML,” ISACA® Journal, vol. 4, 2022, p. 27–33, https://www.isaca.org/archives
20 Markkula Center for Applied Ethics at Santa Clara University, “A Framework for Ethical Decision Making,” USA, 2021, https://www.scu.edu/ethics/ethics-resources/a-framework-for-ethical-decision-making/
21 Op cit Scarpino, ISACA Journal, 2022
22 Stahl, B. C.; J. Timmermans; C. Flick; “Ethics of Emerging Information and Communication Technologies: On the Implementation of Responsible Research and Innovation,” Science and Public Policy, vol. 44, iss. 3, June 2017, p. 369–381, https://doi.org/10.1093/scipol/scw069

JOSH P. SCARPINO | D.SC., CISM

Is the vice president of information security at TrustEngine, where he leads IT operations, security and compliance programs. He is also responsible for developing and managing the organization’s responsible technology program. Scarpino has more than 18 years of IT and security experience in the US Department of Defense (DoD). He has led security operations for Fortune 500 companies, enhanced critical controls at financial and manufacturing organizations, and led, scaled and audited security and compliance programs. He has spent his career bridging areas across technology and security, including operations, governance, risk and compliance (GRC), and responsible technology. Scarpino also has a passion for educating the next generation of professionals. Currently, he is partnering with ForHumanity to develop frameworks for auditing artificial intelligence (AI) systems and is continuing his research of ethical AI independently and in partnership with the University of Pittsburgh CAIR Lab (Pennsylvania, USA).