“…when clinical data are used to provide care, the primary purpose for acquiring the data is fulfilled. At that point, clinical data should be treated as a form of public good, to be used for the benefit of future patients.” From the March 2020 Radiological Society of North America Special Report[i]
How much does your privacy matter to you? It seems to be an increasingly scarce commodity. Your credit/debit card purchases are entered into a database so issuers can track your spending habits; your online activity (search engines, browsers, social media, etc.) is monitored in order to tailor things like ads to your usage patterns; your cell phone can be geographically tracked. Do you have a smart home device? It’s like a virtual glass house, revealing more than you know.
It is Artificial Intelligence (AI) that enables access to your personal information. Within fractions of a second, AI programs can collect, digest, and analyze your slice of a vast pool of consumer data, then report on the trends. Based on AI’s analysis, decision-makers develop strategies to enhance their bottom line around communal preferences. For example, suppose you live in a suburb where 72% of households don’t eat red meat; you, on the other hand, love that grass-fed ribeye steak sizzling on your grill. Will there come a day when you can’t buy your favorite cut at the local grocer?
Most of us shrug off such possibilities as an acceptable trade-off for the benefits of the devices and apps that make our individual lives more convenient or fulfilling. A minority of us, however, worry about “Big Brother” taking over our lives. Who is this vague, conspiratorial Big Brother? Corporations? The government? Some science fiction interstellar entity? It doesn’t matter. Dwindling privacy has become a fact of life, and your medical records may soon be absorbed into that reality.
AI and healthcare information
For over two decades, every American’s medical/healthcare privacy has been protected by federal law: the Health Insurance Portability and Accountability Act (HIPAA). Its privacy rule “… covers all individually identifiable health information that is created, stored, maintained, or transmitted by a HIPAA covered entity or business associate of a HIPAA covered entity.”[ii] It applies whether the information takes the form of paper records, imaging results (films), electronic medical records, or even spoken communication.
And yet, despite the protection of law, the American Medical Association’s Journal of Ethics cautions us, “AI technology has tremendous capability to threaten patient privacy and confidentiality.”[iii] Take, for example, Facial Recognition Technology (FRT), which involves machine learning programs that can map facial patterns, create a template, and compare it with others to generate a positive identification. This can be useful in medicine not only for diagnosing rare genetic disorders and predicting health characteristics, but also for identifying and monitoring elderly patients for safety or medication adherence. That steps into a gray area in terms of informed consent, and raises ethical questions about protecting privacy.
It’s conceivable that some uses of FRT would actually violate HIPAA, which protects anything that might personally identify an individual patient, including full face photos. “The idea that a photo can reveal private health information is relatively new, and privacy regulations and practices are still catching up.”[iv]
From the individual to the mass of humanity
There are many other ways in which AI applications could inadvertently reveal information specific to an individual patient, but I want to highlight that there are two fundamental reasons to collect patient information:
- For the benefit of each patient – The health records of each individual patient form that person’s medical history. It is there to serve the best interests of the patient, and only that patient. Any other use constitutes an ethical abuse and a legal violation.
- For the benefit of humanity – However, there are those who make a strong philosophical case for the paradox of patient data entrusted to AI: “… that all individuals and entities with access to clinical data become data stewards, with fiduciary (or trust) responsibilities to patients to carefully safeguard patient privacy, and to the public to ensure that the data are made widely available for the development of knowledge and tools to benefit future patients.”[v]
The second reason walks a tightrope between individual and global best interests. Long before AI, countless research articles out of academic and other clinical settings were published each year in which the identity of patient participants was protected. Data were collected, analyzed, and published, yet the anonymity of the participants was assured. No names were named, no birthdates listed, and demographics were reported as groups or categories. One research center couldn’t access another’s patient list, only the data that were published.
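The kind of de-identification described above can be sketched in a few lines of code. This is a simplified illustration only (the record fields and function name are hypothetical); genuine HIPAA de-identification follows the Safe Harbor or Expert Determination standards and removes many more categories of identifiers:

```python
def deidentify(record: dict, as_of_year: int) -> dict:
    """Strip direct identifiers and generalize the quasi-identifiers
    so only group-level demographics remain (illustrative sketch)."""
    age = as_of_year - int(record["birthdate"][:4])
    decade = (age // 10) * 10
    return {
        # Name, birthdate, and full address are simply not copied over.
        "age_group": f"{decade}-{decade + 9}",   # e.g. "40-49", not a birthdate
        "sex": record["sex"],
        "zip3": record["zip"][:3],                # region, not street address
        "diagnosis": record["diagnosis"],         # the clinical data of interest
    }

patient = {
    "name": "Jane Doe",
    "birthdate": "1980-05-17",
    "sex": "F",
    "zip": "90210",
    "diagnosis": "hypertension",
}
print(deidentify(patient, as_of_year=2020))
```

The point of the sketch is that what gets published is the generalized record on the last line, never the original one above it.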
Now AI has ushered in a new age of amassing, centralizing, and sharing very specific, identifiable patient data. This could be a tremendous benefit to global wellness, provided individuals remain anonymous. It also brings the risk of leaks, hacks, and abuse by unscrupulous personnel with access to the AI programs. These are worrisome possibilities, with ramifications for individuals that can only be imagined. Therefore, AI must be designed with safeguards built in to protect patients’ legal privacy rights.
Part 6 will address how AI might alter the relationship between doctor and patient. Stay tuned.
NOTE: This content is solely for purposes of information and does not substitute for diagnostic or medical advice. Talk to your doctor if you are experiencing pelvic pain, or have any other health concerns or questions of a personal medical nature.
[i] Larson DB, Magnus DC, Lungren MP et al. Ethics of Using and Sharing Clinical Imaging Data for Artificial Intelligence: A Proposed Framework. Special Report by Radiological Society of North America. Mar. 24, 2020. https://pubs.rsna.org/doi/10.1148/radiol.2020192536
[ii] What Does HIPAA Cover? HIPAA Journal. March 1, 2018. https://www.hipaajournal.com/what-does-hipaa-cover/
[iii] AMA Journal of Ethics. Vol. 21, No. 2: E119-197. February 2019.
[iv] AMA Journal of Ethics, ibid.
[v] Larson et al. ibid.