Ethical Concerns About Reidentification of Individuals from MRI Neuroimages


Elizaveta Garbuzova



INTRODUCTION


In the US, tens of millions of MRI scans are performed each year.[1] MRI is the most common way to image the brain and to detect tumors, brain injuries, strokes, aneurysms, multiple sclerosis, and other conditions.[2] After the procedure, the images are usually kept by the hospital or other medical facility. Thousands of neuroimages are shared among researchers to increase the data available for studies and to enable scientific discovery.


I.     Reidentification Using AI and Facial Recognition


The possibility of reidentifying an individual using AI and facial recognition technology is an ethical concern. AI can reidentify a person from a scan by reconstructing the face.[3] According to Mayo Clinic researchers, the software can still reidentify individuals from deidentified scans despite steps taken to deidentify the patient, such as removing or changing the name, age, identification number, gender, and postal code.[4] The Mayo Clinic study shows that AI was 83 percent successful in reidentifying people by analyzing their scans.[5] In most cases, scientists can share anonymized information without the patient's consent.[6] The only way to anonymize a scan and remove any personal data that could lead to reidentification is to blur the image, which would compromise the ability to read and analyze the scan, defeating the purpose of sharing the data.


One of the fundamental values in the relationship between a physician and a patient is respect for autonomy and privacy. When subjects enter a clinical study, the researchers guarantee confidentiality. Yet it seems impossible to protect a patient's privacy when sharing brain scans. Potential MRI image reidentification affects research participants, healthcare providers, and government regulatory bodies. I suggest that modified informed consent and the introduction of liability for data misuse could improve accountability to the patient while still allowing the societal health benefits associated with data sharing for research and education. Furthermore, I advocate that patients who agree to data sharing be granted a small financial reward in recognition of their contribution to the medical field.


II.     Privacy


Privacy is a fundamental human right and is especially important when sensitive medical information is shared. Reidentification can expose a research subject to harms including, but not limited to, increased insurance costs, discrimination, and humiliation.[7] “This identification would result in an infringement of privacy that could include diagnoses, cognitive scores, genetic data, biomarkers, results of other imaging, and participation in studies or trials.”[8]


In the same study that found 83 percent successful reidentification, the AI wrongly identified 15 percent of the patients.[9] Misidentification may lead researchers to overlook relevant information. Suppose a certain type of tumor develops predominantly in men, and patient X is a man who has the tumor but is not aware of it. If the AI incorrectly labeled patient X as female and therefore skipped the tumor-detection analysis of his scan, it would fail to detect his condition. Internal bias in AI may harm patients even further.


III.     Who Benefits from Sharing Scans?


Patients who are willing to risk their privacy by sharing their brain images do not directly benefit in their own health care. The research is usually designed for the common good and for improving medical care and diagnosis generally. Often, current patients benefit future patients. Stimulating the development of treatments for future patients is an important societal benefit, but patients should know they may not benefit from sharing scans.[10] (In some cases, patients will benefit from research involving their own scans.)


IV.     Research and the Importance of Sharing Data


Data sharing is essential for the research and development of new medical treatments that would benefit current and future patients. I argue that patients have a duty to contribute to this process by providing images because they are beneficiaries of data derived from previous patients. The existing images increase the sample size,[11] producing more valid and generalizable results. If patient X is using a certain cancer treatment, patient X is arguably indebted to past patients who contributed health data to medical research. Because of such a gracious act by another patient, X is benefitting and thus has a duty to assist future patients by sharing data. It is wrong to be a free rider in society. In addition to past patients' research contributions, institutions use both public and private resources to train the physicians who treat patient X and to build the hospitals where patient X seeks treatment.


While the argument I advance is likely not strong enough to compel participation in risky clinical trials, and it does not negate a moral right to refuse, it frames participation in scientific research as a moral good and, sometimes, an obligation.


Despite the low risk that the data will not be protected to the highest degree, patients ought to share it with scientists to give back to society. Furthermore, scientists may not have enough incentive to share the data.[12] Deidentification is a time-consuming manual process that society underappreciates, and despite the overall benefit to the community and scientists' dedication and curiosity, it remains an obstacle to data sharing. Recognizing the risk to patient privacy and the difficulty physicians already face, I argue for a better balance: one that ensures privacy while incentivizing the sharing of scans.


V.     Establishing Liability: A Rule to Ensure Proper Use and Prohibit Reidentification


A government agency such as Health Canada or the Food and Drug Administration (FDA) has an interest in promoting clinical studies while protecting research subjects. Yet Health Canada recognizes that it is impossible to eliminate the risk of reidentification.[13] Direct personal identifiers, of course, pose a higher risk to the research subject, yet there should not be a blanket prohibition on sharing neuroimages. Currently, the risk is diminished by the requirement to sign a data use agreement.[14] There is no universal rule on liability for a reidentification breach; rather, there are numerous suggestions for ensuring ethical data use. The joint European and North American multisociety statement suggests that releases of information and data use agreements (DUAs) are critical tools for making clear what various stakeholders are permitted or prohibited from doing with the data.[15] Yet the statement does not impose any liability for a DUA breach; it goes only as far as suggesting that data sharing should be traceable.[16] I suggest establishing clear liability for data misuse. A controlling body, such as the FDA or Health Canada, should continuously check data use compliance and, in cases of illegal use, apply appropriate penalties. This would strengthen protections for research subjects and give researchers more reason to follow the rules.


CONCLUSION


As mentioned above, data sharing contributes to society in the long run. Therefore, the government must provide incentives for both scientists and patients to contribute to the medical field. The risks of reidentification need to be clearly outlined in the informed consent process, and subjects should be financially rewarded for their images. I suggest a financial reward because current patients see few direct benefits, while scientists and others cut costs by accessing existing data. Thus, to be just, the savings from the industry's cost-cutting need to be shared with the people who are risking their privacy. The financial reward can be as little as five dollars, enough to recognize the good deed of contribution. Sharing neuroimaging data comes at a cost to both researchers and trial participants, yet the burdens and risks can be decreased through the efforts of governmental bodies.




[1] Conor Stewart, “MRI Scan Volume by Facility Type U.S. 2016 and 2017,” Statista, March 24, 2021, https://www.statista.com/statistics/820927/mri-scans-number-in-us-by-facility-type/.


[2] “MRI,” Mayo Clinic (Mayo Foundation for Medical Education and Research, August 3, 2019), https://www.mayoclinic.org/tests-procedures/mri/about/pac-20384768.


[3] Gina Kolata, “You Got a Brain Scan at the Hospital. Someday a Computer May Use It to Identify You.,” The New York Times (The New York Times, October 23, 2019), https://www.nytimes.com/2019/10/23/health/brain-scans-personal-identity.html.


[4] “Mayo Clinic Studies Patient Privacy in MRI Research,” Mayo Clinic (Mayo Foundation for Medical Education and Research), accessed July 20, 2021, https://newsnetwork.mayoclinic.org/discussion/mayo-clinic-studies-patient-privacy-in-mri-research/.


[5] “Mayo Clinic Studies Patient Privacy in MRI Research,” Mayo Clinic (Mayo Foundation for Medical Education and Research), accessed July 20, 2021, https://newsnetwork.mayoclinic.org/discussion/mayo-clinic-studies-patient-privacy-in-mri-research/.


[6] Tonya White, Elisabet Blok, and Vince D. Calhoun, “Data Sharing and Privacy Issues in Neuroimaging Research: Opportunities, Obstacles, Challenges, and Monsters under the Bed,” Human Brain Mapping, April 2020, https://doi.org/10.1002/hbm.25120, 3.


[7] Jacob L. Jaremko et al., “Canadian Association of Radiologists White Paper on Ethical and Legal Issues Related to Artificial Intelligence in Radiology,” Canadian Association of Radiologists Journal 70, no. 2 (2019): pp. 107-118, https://doi.org/10.1016/j.carj.2019.03.001, 110.


[8] Christopher G. Schwarz et al., “Identification of Anonymous MRI Research Participants with Face-Recognition Software” (letter to the editor), New England Journal of Medicine 381 (2019): 1684-1686, https://doi.org/10.1056/NEJMc1908881.


[9] “Mayo Clinic Studies Patient Privacy in MRI Research,” Mayo Clinic (Mayo Foundation for Medical Education and Research), accessed July 20, 2021, https://newsnetwork.mayoclinic.org/discussion/mayo-clinic-studies-patient-privacy-in-mri-research/.


[10] “Mayo Clinic Studies Patient Privacy in MRI Research,” Mayo Clinic (Mayo Foundation for Medical Education and Research), accessed July 20, 2021, https://newsnetwork.mayoclinic.org/discussion/mayo-clinic-studies-patient-privacy-in-mri-research/.


[11] Tonya White, Elisabet Blok, and Vince D. Calhoun, “Data Sharing and Privacy Issues in Neuroimaging Research: Opportunities, Obstacles, Challenges, and Monsters under the Bed,” Human Brain Mapping, April 2020, https://doi.org/10.1002/hbm.25120, 2.


[12] Tonya White, Elisabet Blok, and Vince D. Calhoun, “Data Sharing and Privacy Issues in Neuroimaging Research: Opportunities, Obstacles, Challenges, and Monsters under the Bed,” Human Brain Mapping, April 2020, https://doi.org/10.1002/hbm.25120, 8.


[13] William Parker et al., “Canadian Association of Radiologists White Paper on De-Identification of Medical Imaging: Part 1, General Principles,” Canadian Association of Radiologists Journal 72, no. 1 (March 2020): pp. 13-24, https://doi.org/10.1177/0846537120967349, 19.


[14] “Mayo Clinic Studies Patient Privacy in MRI Research,” Mayo Clinic (Mayo Foundation for Medical Education and Research), accessed July 20, 2021, https://newsnetwork.mayoclinic.org/discussion/mayo-clinic-studies-patient-privacy-in-mri-research/.


[15] J. Raymond Geis et al., “Ethics of Artificial Intelligence in Radiology: Summary of the Joint European and North American Multisociety Statement,” Radiology 293, no. 2 (2019): 436-440, https://doi.org/10.1148/radiol.2019191586.


[16] Geis et al., “Ethics of Artificial Intelligence in Radiology.”

Author Biography

Elizaveta Garbuzova, MS Candidate, Columbia University
