Whose Data Is It Anyway?

Author: Guy Pearce, CGEIT, and Sandra Ketchen, President and CEO, Spectrum Health Care
Date Published: 28 February 2020

Beyond the privacy concerns expressed about organizations such as Facebook and Google,1 a next wave of data privacy issues fueled the white-hot debate that resulted in the EU General Data Protection Regulation (GDPR), a globally significant and influential regulation that treats the human right to privacy as fundamental.2

The following list will be used as the basis of the data ownership conversation. While the list was originally developed from a healthcare perspective,3 its characteristics resonate across sectors. In particular, the first four concern current topics in security and privacy, while the last is an interesting point relating to data economics:

  • Privacy
  • Informed consent
  • Control of the data
  • Secure storage, access and transfer
  • Benefits accruing both to the system and the patient (economics)

The Matter of Consent and Humans as Data Subjects

Humans look to learn from history, although people are not very good students. Barely 40 years ago, the Belmont Report was published by the US National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. It concerned itself with establishing ethical principles and guidelines for the protection of human subjects of research, prompted by the lack of ethics exhibited by physicians in the 40-year Tuskegee Syphilis Experiment (TSE), which started in 1932.4

The TSE was halted in 1972 after public outrage and, in 1997, US President Bill Clinton issued an apology for the scandal.5 The Belmont Report sought to present findings from the experiment to ensure future respect for the human subjects of biomedical and behavioral research, that no harm is done, and that procedures administered are reasonable and non-exploitative. It also proposed informed consent, an assessment of the risk and benefits of similar programs in the future, and the ethical selection of subjects.6

Many more important guiding principles have since been published, such as the Nuremberg Code after World War II7 and the Declaration of Helsinki of 1975,8 all highlighting the need for respect and dignity when considering human beings as data subjects. These guiding principles all sought to ensure that human beings are treated with respect, that informed consent is sought when collecting personal data, that the risk scenarios are understood and that the data processing is non-exploitative.

No Permission
Today, more than ever, humans are desirable data subjects. They may be less subject to explicit experimentation, but they remain data subjects without their knowledge. The Facebook and Cambridge Analytica scandal—using Facebook data to influence behavior for political ends—is still fresh in the minds of most people. In particular, and in the context of the information users volunteer to social media, “…our online behavior exposes a lot about our personality, fears and weaknesses—and…this information can be used for influencing our behavior.”9

Because organizations invested in the collection and storage of data, some propose that customer data belong to the organization.10 The issue is possibly less that an organization stores individual data—organizations need this to be able to serve their customers or for regulatory purposes (e.g., customer due diligence in banking)—than what organizations do with data beyond any requirements for basic operations or regulatory reporting, and without the customer’s knowledge. This is what GDPR’s informed consent is about: ensuring that subjects are informed about what an organization does with their data and giving them the choice of whether to agree to that processing. Further, GDPR provides for the subject to opt out of such processing.11
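The consent model described above can be illustrated with a minimal sketch: processing beyond basic operations is gated on explicit, per-purpose consent, and the subject can withdraw (opt out) at any time. This is a hypothetical illustration, not GDPR compliance tooling; the class, method and purpose names are the author's assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Hypothetical per-subject record of explicitly granted processing purposes."""
    subject_id: str
    purposes: set = field(default_factory=set)  # purposes the subject consented to

    def grant(self, purpose: str) -> None:
        # Consent must be an explicit, affirmative act by the subject
        self.purposes.add(purpose)

    def withdraw(self, purpose: str) -> None:
        # Opt-out: withdrawal must always be possible
        self.purposes.discard(purpose)

    def allows(self, purpose: str) -> bool:
        # Default is no processing; absence of consent is not consent
        return purpose in self.purposes

record = ConsentRecord("subject-001")
record.grant("marketing_profiling")
assert record.allows("marketing_profiling")      # explicitly consented
assert not record.allows("sale_to_third_party")  # never consented: no processing
record.withdraw("marketing_profiling")
assert not record.allows("marketing_profiling")  # opt-out honored
```

The key design point is the default: a purpose not explicitly granted is denied, mirroring the regulation's premise that silence or inactivity does not constitute consent.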

While many may appreciate the convenience service providers offer in exchange for their data, others may prefer to opt out, not wanting marketing technology organizations (figure 1) processing their data, where even “[a] single site could easily have 20 ad-targeting partners, often invisible to the person whose data is being shared.”12 Many enterprises buy and use personal data, with potentially thousands of data processors and brokers (individually almost indiscernible in figure 1) dipping “…into the data streams of our lives…”13 to create and sell personalized profiles of customers without their permission or even their knowledge, including increasingly invasive systems for probing, monitoring and tracking customers.14 As demonstrated in GDPR, obtaining explicit consent respects privacy as a basic human right, giving humans a choice by allowing those who want to share their data to do so, and those who do not to have their data protected.

Adding to the invisibility of the data sharing issue raised previously, in the Cambridge Analytica case, Facebook users were not aware that their data were being used to influence their behavior for political objectives that changed the course of history, not only in the United States, but also in Europe and in countries including Brazil, India, Kenya, Malaysia, Mexico and Nigeria.15 This occurred 40 years after lessons should have been learned from experiments such as the TSE and subsequent reports such as the Belmont Report.

The issue has grown beyond enterprises such as Facebook offering advertisers user data (including personal interests, job titles, ethnic identities, credit card spending habits, purchasing habits and financial information) to the marketing organizations in figure 1. This data sharing has led to the fusing of data from different sources to “…create hyper-targeted, manipulative, and sometimes discriminatory and invasive ad campaigns.”16



For some, the use of personal data in this way is acceptable. For others, it is not, giving rise to search engines such as DuckDuckGo, which claims to protect searchers’ privacy and deliver unfiltered search results to its users. Even antivirus software such as Bitdefender has optional antitracking features that prevent tracking software from accessing a user’s data.

OBTAINING EXPLICIT CONSENT RESPECTS PRIVACY AS A BASIC HUMAN RIGHT, GIVING HUMANS A CHOICE BY ALLOWING THOSE WHO WANT TO SHARE THEIR DATA TO DO SO, AND THOSE WHO DO NOT TO HAVE THEIR DATA PROTECTED.

Unpaid Factors of Production and Concentrations of Power
Through an economics lens, data would be classified as a factor of production. As in mainstream economics, entrepreneurship (e.g., Cambridge Analytica), capital (e.g., hardware, software) and labor (e.g., data scientists) are factors of production; unlike in mainstream economics, however, there is no payment to the providers of data—individuals—as a factor of production (figure 2), although there is payment to data-aggregating organizations such as Facebook or its intermediaries. In this way, “[d]ata appropriation is a form of exploitation because companies use data to create value without providing people with comparable compensation.”17

In economic theory, all factors of production are paid. In today’s social media age, human beings as providers of data are not a paid factor of production, and they often do not even know their data are being used for power and profit. Furthermore, Facebook experienced a data breach of 540 million records,18 one of the largest breaches of data to date,19 exposing those impacted to significant risk. The nature of the breaches seen in recent years makes one wonder how much effort is applied to protect the personal data of so many millions of human beings, when human data are treated as a commodity for which costs must be minimized. Indeed, the higher the organizational cost leadership pressure—the need to achieve the lowest operating costs relative to competitors to be competitive—the more difficult it becomes to motivate an appropriate budget for data security.

 

In this context, it is not only the growing wealth gap that is at play here due to missing payments to a factor of production; it is also the concentration of power that subsequently emerges. Indeed, the sheer scale of the tech giants has been blamed for slow economic growth and wage stagnation20 because it undermines the ability of traditional organizations to innovate21 and compete.22 The concentration of power has created a dynamic of “…racing toward a technology oligarchy which will define our personal and professional lives.”23 Does anyone care?

IN TODAY’S SOCIAL MEDIA AGE, HUMAN BEINGS AS PROVIDERS OF DATA ARE NOT A PAID FACTOR OF PRODUCTION, AND THEY OFTEN DO NOT EVEN KNOW THEIR DATA ARE BEING USED FOR POWER AND PROFIT.

No Ethics
Facebook’s Chief Executive Officer (CEO) Mark Zuckerberg said that he had made a “huge mistake in failing to take a broad enough view of what Facebook’s responsibility is in the world.”24

In summary, social responsibility is defined as:

An ethical theory, in which individuals are accountable for fulfilling their civic duty…[for]…a balance between economic growth and the welfare of society…25

To what extent is Facebook ethically balancing economic growth with the welfare of society?

To help consider this, there are four ethical principles,26 each of which can be used as a basis to consider the ethical standards of organizations that sell and otherwise leverage human data. These are:

  1. Respect for autonomy—Ensuring human dignity
  2. Beneficence—Bringing about good through actions
  3. Nonmaleficence—Doing no harm
  4. Justice—Ensuring impartiality and fairness

On points 1 and 2, how is human dignity served, and how can it be claimed that good is being done when humans are coerced to act in artificial ways that serve only a few (e.g., Cambridge Analytica)? On point 3, how can anyone claim that no harm is being done when data security may lack the relevant due diligence in some organizations, potentially impacting the lives of millions of people through financial crime and identity theft? On point 4, as a 21st century factor of production, how are the individual human contributors of the raw materials of today’s data organizations given recognition for their contribution to corporate economic valuations that are, quite simply, startling? Today, data-rich organizations such as Facebook, Apple, Amazon and Alphabet (Google), are among the top 10 wealthiest companies in the world.27

Ethical issues emerge whenever technology (hardware and software) and data are involved,28 which raises the question, “Whose data is it anyway?” Does data truly belong to the organizations that invest in the infrastructure to store and process them, as a free factor of production in neoclassical economics? Or, ethically, should personal data remain the property of the person who can then make suitable decisions about the use of the data in question?29 Ethics as a subject is growing in importance in IT because technology gives people more power to act, requiring them to make choices that they did not have to make previously.30 That, in itself, may be problematic.

It is clear that individuals are not in control of their data per the characteristics of data ownership highlighted in the second paragraph of this article. This state is set to become significantly more complex when highly valued data, such as healthcare data, become mainstream.

The Next Frontier of Data Ownership: Health and Wellness Data

Google has been diversifying into healthcare under the Google-owned company Verily.31 Between 2013 and 2017, Google executed 75 ventures and deals spanning healthcare and digital health, with its primary focus being to use artificial intelligence (AI) to mine data that will allow Google to detect disease and manage personal health and wellness. Think of Google-owned companies identifying and managing eye disease, diabetes, heart disease, Parkinson’s disease or multiple sclerosis. These are all active projects and investments within the Google family, the success of which will depend on the collection and assessment of an individual’s personal health and wellness data.

Like Google, Apple is also committing resources to a healthcare data strategy.32 In the United States alone, there were more than 85 million iPhone owners older than 13 years as of December 2017.33 Apple launched ResearchKit in 2015, a platform disintermediating how research participants are recruited and how they participate in studies. Apple acquired Gliimpse in 2016 to manage wellness and personal vital signs, and in 2018 it launched the Health Records feature, which connects an electronic medical record into the platform. Approximately 210 organizations are now beta testing this product.

Third-party organizations are building devices or applications (apps) to partner with Apple’s platform. Butterfly Network is building portable ultrasound devices that plug into a mobile phone, AliveCor has developed a six-lead electrocardiogram, and CellScope is bringing an otoscope to market. With all these connected devices and platforms, the consumer can possess a complete picture of their health on their mobile phone with all the privacy that may or may not exist on that platform.

DOES DATA TRULY BELONG TO THE ORGANIZATIONS THAT INVEST IN THE INFRASTRUCTURE TO STORE AND PROCESS THEM, AS A FREE FACTOR OF PRODUCTION IN NEOCLASSICAL ECONOMICS?

Interestingly, more than 10 years have passed since a landmark paper that found only 10 percent of health outcomes were impacted by actual healthcare. Genetics makes up 30 percent of outcomes, but the remaining 60 percent of individual health outcomes were deemed to be impacted by behavior and environment.34 Since then, the world has changed in the sheer volume of data being generated by social media feeds, the Internet of Things (IoT) devices and digitization of records, increasingly highlighting individual behavioral traits. A 2019 assessment of big data in US healthcare references the vast number of records being generated daily, soon estimated to grow to a zettabyte (a trillion gigabytes).35

People increasingly use their mobile phones to manage their lives, including food and alcohol purchases and shopping, and they use wearables to track exercise, sleep, activity and more. Applying AI to these personal data, while recalling that health outcomes are predominantly affected by behavior and environment, means that one’s phone has a good idea of an individual’s health status even before it is integrated with an actual electronic health record. Indeed, Instagram posts have been found to be more effective at identifying depression than doctors or other clinicians.36

Vast amounts of personal information are collected and shared daily but, as consumers or patients, do individuals understand the choices and risk associated with management of their health data? There is ongoing debate around ownership of medical records from sources of acute or primary care. However, fee-for-service platforms such as Dot Health provide consolidated personal records on mobile phones, so does it matter if a hospital or clinic retains “ownership”? Again, do people know what happens to their data regardless of who claims ownership?

THERE IS ONGOING DEBATE AROUND OWNERSHIP OF MEDICAL RECORDS FROM SOURCES OF ACUTE OR PRIMARY CARE.

There are many points of connection to the healthcare system where personal data are created but continue to be owned, managed and leveraged by the service provider, often without explicit consent.

Every time you shuffle through a line at the pharmacy, every time you try to get comfortable in those awkward doctor’s office chairs, every time you scroll through the web while you’re put on hold with a question about your medical bill, take a second to think about the person ahead of you and behind you. Chances are, at least one of you is being monitored by a third party like data analytics giant Optum, which is owned by UnitedHealth Group, Inc. Since 1993, it has captured medical data—lab results, diagnoses, prescriptions, and more—from 150 million Americans. That’s almost half of the US population.37

Many of these providers then sell their data to enterprises that will use it for anything from drug development to insurance pricing or customized advertising.

How many times do people engage with social media platforms and share something about their day, what they ate, the success of a workout or a fundraiser for cancer? While perhaps not thinking through the consequences, people are increasingly comfortable sharing personal health data from both traditional and nontraditional sources.38 While only 8 to 12 percent of survey respondents reported they would share health data with a technology company in general, 60 percent would share this data with Google and 50 percent would share personal health data with Microsoft.39 Just like the marketing third parties processing social media data (figure 1), an assessment of the top-rated medical apps for Android on Google Play found that a staggering 79 percent of apps shared user health data with third-party organizations, and entities from 46 organizations used or consumed these data in some fashion.40

Calls to Action

Individuals are both passive and active creators of data about themselves. Do people care enough about who uses their data, what they do with them, who sees them and whether a third party makes money off them? Today’s data scope ranges from relatively simple data, such as demographics, to more dynamic data, such as an individual’s mobility and financial data, to social media and interactions with multiple organizations in the global digital village. The next frontier—healthcare data—includes fitness levels, the kinds of activity in which people participate, their blood test results, and even whether they suffer from chronic illness or a specific disease.

If individuals care about the use of their data, then GDPR has matured the matter of privacy as a fundamental human right, an evolution of possibly the first privacy law in England in 1361 offering protection against Peeping Toms and eavesdroppers.41 If everyone agrees that privacy is a basic human right, then everyone certainly has a collective duty and a responsibility to defend that right by means of various forums globally. If everyone does not, then it would be hypocritical to produce an uproar about that next massive data breach from a favorite retailer, financial services organization, credit agency or health provider, or to be critical of organizations that make money off of personal data.

If privacy is not a concern, the increasing concentration of power among the data oligopolies might be. If people do not like this, traditional economics teaches that they must vote with their wallets. However, individuals do not have this option in this digital rendition of commerce because, as data subjects, individuals are those wallets. If people cannot vote with their wallets, organizations that espouse privacy by design, security by design and good ethics should win hearts as alternatives. Indeed, there are arguments that such enterprises can be both ethical and even more profitable than unethical enterprises.42

In addition to taking responsibility for their personal data and how they share it, for those uncomfortable with the state of things, the call to action is to honor the responsibility associated with the basic human right to privacy and to take actions such as seeking out desired services from alternative, potentially more ethical organizations. In so doing, not only are those organizations supported and, thus, made increasingly viable, but the oligopolies are also diluted. However, how many will be able to support enterprises that espouse privacy, security and ethics when the allure of what the oligopolies can do with personal data—even without permission—shines so brightly? For some, giving up that shine for something functionally similar, but often much less polished, might be too high a price.

Conclusion

With healthcare data as the next frontier of data ownership, a 2019 study on “actionable ethics in digital health research”—reminiscent of the Belmont Report—identified a gap in ethical considerations around human data subjects, this time concerning the spread of big data beyond research use.43 In fact, consumers and patients have been urged “…to become more aware of the necessity for their data to be protected while making use of the benefits of data exchange and sharing.”44

IF EVERYONE AGREES THAT PRIVACY IS A BASIC HUMAN RIGHT, THEN EVERYONE CERTAINLY HAS A COLLECTIVE DUTY AND A RESPONSIBILITY TO DEFEND THAT RIGHT BY MEANS OF VARIOUS FORUMS GLOBALLY.

Perhaps due to a comparative lack of political will to prioritize appropriate policy development, a lack of a regulatory framework in other parts of the world akin to GDPR in Europe is the single most important challenge in personal health data management.45

Ultimately, the greater the procrastination, the harder it will be to reverse the consequences, especially in a future world of AI and robots. Certainly, individuals should not expect the current and future data giants to intervene; given the siren’s call of concentrated power and profit, it would be a conflict of interest, ironically, since their very employees, and perhaps even their leaders themselves, are the subjects of all this. With lackluster political will and an apparent corporate conflict of interest, it comes down to personal behavior, if people can resist the inexorable draw of the bright and shiny.

For that next frontier, researchers have started to reference all of the data that fall outside the scope of the US Health Insurance Portability and Accountability Act (HIPAA) or similar legislation as shadow health records.46 With so many third parties already accessing and processing so much personal data, in the case of even more sensitive health data, do individuals know where their shadows are being cast? Rephrasing Dave deBronkart’s imperative, ultimately everyone should remember that the data really belong to the owner.

Endnotes

1 Weinberg, G.; “Google and Facebook Are Watching Our Every Move Online. It’s Time to Make Them Stop,” CNBC Tech, 31 January 2018, https://www.cnbc.com/2018/01/31/google-facebook-data-privacy-concerns-out-of-control-commentary.html
2 Perugini, M. R.; “Why GDPR?” Europrivacy, 21 April 2016, https://europrivacy.info/2016/04/21/why-gdpr/
3 Babaian, J.; “Healthcare Data Access, Ownership, and Does It Matter?” HCLDR Healthcare Leadership Blog, 25 April 2017, https://hcldr.wordpress.com/2017/04/25/healthcare-data-access-ownership-and-does-it-matter/
4 Brandt, A. M.; “Racism and Research: The Case of the Tuskegee Syphilis Study,” The Hastings Center Report, vol. 8, iss. 6, 1978, p. 21–29.
5 Nix, E.; “Tuskegee Experiment: The Infamous Syphilis Study,” History, 29 July 2019, https://www.history.com/news/the-infamous-40-year-tuskegee-study
6 US Department of Health, Education and Welfare, Ethical Principles and Guidelines for the Protection of Human Subjects of Research, The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, 18 April 1979, https://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/read-the-belmont-report/index.html
7 Mitscherlich, A.; F. Mielke; Doctors of Infamy: The Story of the Nazi Medical Crimes, Schuman, USA, 1949: xxiii–xxv, http://www.cirp.org/library/ethics/nuremberg/
8 World Medical Association, “WMA Declaration of Helsinki—Ethical Principles for Medical Research Involving Human Subjects,” 9 July 2018, https://www.wma.net/policies-post/wma-declaration-of-helsinki-ethical-principles-for-medical-research-involving-human-subjects/
9 Golbeck, J.; S. Aral; “Why the Cambridge Analytica Scandal Is a Watershed Moment for Social Media,” Knowledge@Wharton, Wharton School of the University of Pennsylvania, Philadelphia, USA, 22 March 2018, https://knowledge.wharton.upenn.edu/article/fallout-cambridge-analytica/
10 Wolpe, T.; “Data Privacy: You May Call It Personal Data But Who Actually Owns It?” ZDNet, 11 June 2015, https://www.zdnet.com/article/data-privacy-you-may-call-it-personal-data-but-who-actually-owns-it/
11 Intersoft Consulting, GDPR Consent, https://gdpr-info.eu/issues/consent/
12 Brandom, R.; “Everything You Need to Know About GDPR,” The Verge, 25 May 2018, https://www.theverge.com/2018/3/28/17172548/gdpr-compliance-requirements-privacy-notice
13 Sadowski, J.; “Companies Are Making Money From Our Personal Data—But at What Cost?” The Guardian, 31 August 2016, https://www.theguardian.com/technology/2016/aug/31/personal-data-corporate-use-google-amazon
14 Ibid.
15 BBC News, “Cambridge Analytica: The Data Firm’s Global Influence,” 22 March 2018, https://www.bbc.com/news/world-43476762
16 McLeod, B.; “What the Facebook Ad Targeting Changes Mean for Your Social Media Strategy,” Blue Corona, 29 March 2018, https://www.bluecorona.com/blog/facebook-ad-targeting-changes
17 Op cit Sadowski
18 Silverstein, J.; “Hundreds of Millions of Facebook User Records Were Exposed on Amazon Cloud Server,” CBS News, 4 April 2019, https://www.cbsnews.com/news/millions-facebook-user-records-exposed-amazon-cloud-server/
19 Kiesnoski, K.; “Five of the Biggest Data Breaches Ever,” CNBC, 30 July 2019, https://www.cnbc.com/2019/07/30/five-of-the-biggest-data-breaches-ever.html
20 Wolverton, T.; “Facebook, Google, Apple, and Amazon Have Too Much Power—So It’s Time for Regulators to Take on Tech’s Titans,” Business Insider, 8 April 2018, https://www.businessinsider.com/antitrust-regulators-need-to-curve-the-power-of-facebook-et-al-2018-4
21 Kerner, L.; “The Profound Implications of Five Increasingly Dominant Tech Companies,” Crypto Oracle, Medium, 9 April 2017, https://medium.com/crypto-oracle/facebook-apple-microsoft-google-amazon-aka-famga-is-eating-the-world-d3ba0c62df8b
22 Baca, M. C.; C. Zakrzewski; “Lawmakers Grill Amazon, Facebook, Google and Apple at Antitrust Hearing,” The Washington Post, 16 July 2019, https://www.washingtonpost.com/technology/2019/07/16/lawmakers-grill-amazon-facebook-google-apple-antitrust-hearing/
23 Andriole, S.; “Apple, Google, Microsoft, Amazon and Facebook Own Huge Market Shares = Technology Oligarchy,” Forbes, 26 September 2018, https://www.forbes.com/sites/steveandriole/2018/09/26/apple-google-microsoft-amazon-and-facebook-own-huge-market-shares-technology-oligarchy/#60e6f9882318
24 Deutsche Welle, “Facebook Admits to Far Higher Number of Data Breaches,” DW.com, 4 May 2018, https://www.dw.com/en/facebook-admits-to-far-higher-number-of-data-breaches/a-43258301
25 Pachamama Alliance, “Social Responsibility and Ethics,” https://www.pachamama.org/social-justice/social-responsibility-and-ethics
26 Gracyk, T.; “Four Fundamental Ethical Principles (A Very Simple Introduction),” Minnesota State University Moorhead, USA, 3 February 2012
27 Miller, R.; “The Top 100 Best-Performing Companies in the World, 2019,” CEOWORLD Magazine, 28 June 2019, https://ceoworld.biz/2019/06/28/the-top-100-best-performing-companies-in-the-world-2019/
28 Pearce, G.; “Acknowledging Humanity in the Governance of Emerging Technology and Digital Transformation,” ISACA® Journal, vol. 4, 2019, https://www.isaca.org/archives
29 Op cit Babaian
30 Op cit Pearce
31 CBInsights, “How Google Plans to Use AI to Reinvent the $3 Trillion US Healthcare Industry,” https://www.cbinsights.com/research/report/google-strategy-healthcare/
32 Forrester, “Apple Puts the Promise of Health Innovation in the Hands of Consumers… Pun Intended,” Forbes, 17 September 2019, https://www.forbes.com/sites/forrester/2019/09/17/apple-puts-the-promise-of-health-innovation-in-the-hands-of-consumers-pun-intended/#2ad322171c24
33 CBInsights, “Apple Is Going After the Healthcare Industry, Starting With Personal Health Data,” 8 January 2019, https://www.cbinsights.com/research/apple-healthcare-strategy-apps/
34 Artiga, S.; E. Hinton; “Beyond Health Care: The Role of Social Determinants in Promoting Health and Health Equity,” KFF, 10 May 2018, https://www.kff.org/disparities-policy/issue-brief/beyond-health-care-the-role-of-social-determinants-in-promoting-health-and-health-equity/
35 Faggella, D.; “Where Healthcare’s Big Data Actually Comes From,” emerj, 3 February 2019, https://emerj.com/ai-sector-overviews/where-healthcares-big-data-actually-comes-from/
36 Reece, A. G.; C. M. Danforth; “Instagram Photos Reveal Predictive Markers of Depression,” EPJ Data Science, vol. 6, iss. 15, 2017, https://epjdatascience.springeropen.com/articles/10.1140/epjds/s13688-017-0110-z
37 Beebe, J.; “What You Don’t Know About Your Health Data Will Make You Sick,” Fast Company, 22 March 2019, https://www.fastcompany.com/90317471/what-you-dont-know-about-you-rhealth-data-privacy-will-make-you-sick
38 Seltzer, E.; J. Goldshear; S. C. Guntuku; D. Grande; D. A. Asch; E. V. Klinger; R. M. Merchant; “Patients’ Willingness to Share Digital Health and Non-Health Data for Research: A Cross-Sectional Study,” BMC Medical Informatics and Decision Making, vol. 19, article 157, 2019, https://bmcmedinformdecismak.biomedcentral.com/articles/10.1186/s12911-019-0886-9
39 Adams, A.; M. Shankar; H. Tecco; “50 Things We Now Know About Digital Health Consumers,” Rock Health, 2016, https://rockhealth.com/reports/digital-health-consumer-adoption-2016/
40 Grundy, Q.; K. Chiu; F. Held; A. Continella; L. Bero; R. Holz; “Data Sharing Practices of Medicines Related Apps and the Mobile Ecosystem: Traffic, Content, and Network analysis,” BMJ, vol. 364, iss. 1920, 20 March 2019, https://www.bmj.com/content/364/bmj.l920
41 Rosenbaum, J.; “Look! Out the Window! It’s a Peeping Tom! No, It’s Google Street View,” Legal Bytes, 20 December 2010, https://legalbytes.com/look-out-the-window-its-a-peeping-tom-no-its-google-street-view/
42 Caulkin, S.; “Ethics and Profits Do Mix,” The Guardian, 20 April 2003, https://www.theguardian.com/business/2003/apr/20/globalisation.corporateaccountability
43 Nebeker, C.; J. Torous; R. J. Bartlett Ellis; “Building the Case for Actionable Ethics in Digital Health Research Supported by Artificial Intelligence,” BMC Medicine, vol. 17, iss. 137, 17 July 2019, https://bmcmedicine.biomedcentral.com/articles/10.1186/s12916-019-1377-7
44 Staccini, P.; A. Y. S. Lau; “Findings From 2017 on Consumer Health Informatics and Education: Health Data Access and Sharing,” NCBI, vol. 27, iss. 1, 27 August 2018, https://www.ncbi.nlm.nih.gov/pubmed/30157519
45 Kostkova, P.; H. Brewer; S. de Lusignan; E. Fottrell; B. Goldacre; G. Hart; P. Koczan; P. Knight; C. Marsolier; R. A. McKendry; E. Ross; A. Sasse; R. Sullivan; S. Chaytor; O. Stevenson; R. Velho; J. Tooke; “Who Owns the Data? Open Data for Healthcare,” Frontiers in Public Health, 17 February 2016, https://www.frontiersin.org/articles/10.3389/fpubh.2016.00007/full
46 Op cit Beebe

Guy Pearce, CGEIT
Has served on governance boards in banking, financial services and a not for profit, and as chief executive officer of a financial services organization. He has played an active role in multiple enterprise digital transformation programs since 1999, experiences that led him to create a digital transformation course for the University of Toronto SCS (Ontario, Canada). He also has more than a decade of experience in data governance and IT governance, and readily shares these experiences through publications and at conferences. He is the recipient of the ISACA 2019 Michael Cangemi Best Author award for contributions to the field of IT governance. Consulting in digital transformation and governance, he serves as chief digital transformation officer at Convergence.

Sandra Ketchen, P.Eng, ICD.D
Is the senior vice president (SVP) of operations for SE Health. She is responsible for SE Health’s national operating network of more than 7,000 nurses, personal support workers and other healthcare professionals delivering care. She is accountable for client satisfaction and quality of care while driving continuous improvement and efficiencies. Before SE Health, Ketchen spent more than 20 years in global operations roles, most recently as SVP of consumer products for ATS. She spent 16 years with Celestica Inc.—before that, holding various senior management roles, culminating as vice president and general manager for its international healthcare business. She has won awards for customer support and client satisfaction, and she serves on the advisory board of Convergence.