Project Nightingale: The Need to Connect Data with Dignity

by Ann Mongoven, Ph.D.

[Editor’s note: Breaking news about a new data contract between Amazon and the U.K.’s National Health Service underscores the need for ethical reflection on commercial/non-profit collaborations in health data analytics. Unlike the Google-Ascension collaboration discussed below, the NHS intends to remove patient identifiers from records shared with Amazon. Yet questions remain about whether there are any limits on Amazon’s ability to use the data in the development of its own commercial products.]

Informed by a Google employee-whistleblower, the Wall Street Journal recently broke a story about a controversial collaboration between the Catholic healthcare giant Ascension Health and Google. Ascension hospitals provided Google with identifiable medical data for cloud storage and analysis, without the knowledge or consent of patients. Ironically named “Project Nightingale,” this clandestine data-sharing violated both specific commitments of Catholic healthcare and general expectations of American patients.

The Department of Health and Human Services has launched an investigation into whether the contract between Ascension and Google violates privacy law. Pending the results, it is not clear whether it does. HIPAA, the law governing health privacy, allows record-sharing between health systems and service contractors. On the one hand, Google is a service contractor. On the other, it is not the kind of contractor—such as insurers or billing firms—that the law originally envisioned. Artificial intelligence, with its voracious appetite for “big data,” presents new legal and social terrain.

Legalities aside, Project Nightingale violated patient trust in ways that have implications not only for Catholic healthcare, but also for the American healthcare sector at large. Catholic hospitals in the United States handle a million patient admissions per year. Ascension Health—with its more than 200 hospitals and nursing homes—is one of the largest non-profit health systems in the country. Both the health-oriented and the commercial incentives for mass data-sharing evident in the Ascension-Google collaboration are pervasive in the current healthcare landscape.

Attempting to reassure patients on its website after the media-storm broke, Ascension acknowledged its contract with Google, explaining it aims “to improve patient and provider experience and to advance [Ascension’s] mission to provide compassionate, personalized care, especially to the most vulnerable.” Clearly those are worthy goals.

However, patients generally expect to be informed about, and have choices about, with whom their personal health records are shared. Ascension patients and the general public expressed alarm after learning of the secret sharing of records that included names, addresses, and other personal identifiers. Shouldn’t patients know their data has been given to Google, where it potentially could be mined and commercialized? Shouldn’t they be informed and asked for permission, or given the right to opt out—or at least told that Google has their data, and why?

Ascension stumbled ethically by failing to perceive the relationship between data and dignity.

According to the U.S. Conference of Catholic Bishops’ Ethical and Religious Directives for Catholic Healthcare, Catholic healthcare should be rooted in patient-focused recognition of the “dignity of the human person” (preamble; directive 9). The Directives explicitly recognize privacy and confidentiality of medical information as integral to patient dignity (directive 34). This appeal to dignity accords with a widely-endorsed principle in secular medical ethics, the principle of respect for patient autonomy. Because medical data is so personal, how institutions handle data is constitutive of how they treat persons, and how they honor human dignity.

Lack of transparency about data practices harms patients who rightly feel their trust has been breached. Resultant mistrust can generate ripple effects, decreasing general trust in healthcare institutions, or decreasing social support for the development of data-analytics that in fact offer great promise. Sadly, the evasion of transparency by Ascension may set back the legitimate advancement of health and health-technology that it sought through its collaboration with Google.

Google has data-analytic contracts with several health systems. (So too do a few other large-scale computing companies.) These contracts vary in important ways: whether patients are informed about the transfer of health data to third parties; whether patients have the opportunity to opt-in or opt-out; whether patient data is de-identified before cloud storage; whether the contracting health system imposes limits on secondary access to or use of the data.

One aim of the Google-Ascension collaboration is the development of a unique patient-portal for Ascension clinicians, a laudable goal that might require identifiable medical records. But critics worry, despite Google’s public denials, that Google also may intend to use Ascension’s 2 million patient records to train artificial intelligence for its planned commercial medical search tool. Google has major financial incentives to obtain mass patient data that go beyond serving the direct needs of the client health system and its patients.

Societal health interests and private commercial interests co-mingle in the current development of health data-analytics and artificial intelligence. The shaking of patient and public trust in the Google-Ascension case calls for articulating emerging ethical duties in the age of “big data”:

  • Healthcare Providers. Healthcare providers should understand data-handling practices within their institutions and be prepared to address patient questions. They should challenge the institutions in which they work if they feel data practices pose a potential threat to patient trust. For Catholic providers, this obligation accords with duties of conscience as those are understood within Catholic moral theory. (Media reports indicate that many Ascension clinicians were unaware of the data-sharing with Google, while others expressed concerns to Ascension that went unanswered. Yet it was ultimately a Google employee who blew the whistle.)
  • Hospitals. Hospitals should refuse to cede ethical decision-making over data policy to the larger health systems of which they are part. Hospitals, which are closer to their own patient pools than larger health systems, should develop independent governance mechanisms to address fair data policy, and to ratify or decline to ratify policies proposed by their wider system. For Catholic hospitals, this responsibility accords with the principle of subsidiarity in Catholic moral theology. The principle of subsidiarity urges that matters of human organization be addressed by the lowest level of competent authority.
  • Health Systems. Health systems should consider data-sharing and data-analytic practices with respect to patient and community trust, as well as with respect to the legitimate goals of increasing efficiency, knowledge, and health. They should develop mechanisms of communication with their member-hospitals in developing data policy. They should also develop strategies of meaningful patient and community engagement to inform trustworthy data practices. Catholic health systems should consider data-analytics policy in conjunction not only with general bioethical principles, but also with moral ideals of the Catholic social justice tradition and its special vocation to the vulnerable.
  • The Catholic Church. Ethical exploration of data analytics and artificial intelligence by the Catholic Church and by Catholic institutions should include collaboration with leaders in Catholic healthcare. It should also address translation of ethical ideals to concrete challenges of applied ethics, including the delivery of health and human services by Catholic organizations. Currently, at the direction of Pope Francis, Vatican bodies such as the Pontifical Council for Culture and the Pontifical Academy for Life are facilitating high-level ethical discourse among theologians, ethicists, and scientists on the ethical opportunities and challenges of data analytics and artificial intelligence. Similar explorations are underway at Catholic universities around the world. The Ascension-Google case provides a clarion call for these forums to include in-the-trenches providers of Catholic human services, and to consider practical implications for organizational management.

Patients, who may be sick, poor, or lacking data savvy, are not responsible for sleuthing out their providers’ data policies. The responsibility for disclosure and meaningful patient engagement lies squarely with providers. Nonetheless, “defensive driving” strategies are advisable for patients capable of them. Patient advocates should teach patients how to ask routine questions about how their personal health records are handled, and explain why that is important. If enough patients cultivate such habits, additional moral pressure will be exerted on providers to meet their ethical responsibilities: to discern the proper ends and means of data policy, and to disclose them.

Catholic hospitals maintain a special mission to treat all in need with patient-centered dignity. Catholic health systems disproportionately shoulder the financial burden of caring for un- and under-insured Americans and have been vociferous in calling for greater health justice. They have an obligation to consider relationships between data, dignity, vulnerability, trust, access and equity in the brave new world of enormous data sets and artificial intelligence.

Hopefully, the bad start of the Google-Ascension case will prompt more discerning responses by Catholic institutions to the ethical opportunities and dangers of new data-analytics. Catholic healthcare could draw reflectively on its historical mission to become a leader, rather than a follower, in patient-oriented data analytics. It could design effective new data-strategies to care for the sick with dignity.
