Charlotte Tschider (Loyola University Chicago School of Law) has posted “Prescribing Exploitation” (Maryland Law Review, Forthcoming 2023) on SSRN. Here is the abstract:
Patients are increasingly reliant, temporarily if not indefinitely, on connected medical devices and wearables, many of which use artificial intelligence (AI) infrastructures and physical housing that directly interacts with the human body. The automated systems that drive the infrastructures of medical devices and wearables, especially those using complex AI, often rely on dynamically inscrutable algorithms that may produce discriminatory effects, altering paths of treatment and other aspects of patient welfare.
Previous contributions to the literature, however, have not explored how AI technologies enable exploitation of medical technology users. Although all commercial relationships may exploit users to some degree, some forms of health data exploitation exceed the bounds of normative acceptability. The factors that mark exploitation as excessive enough to warrant legal intervention include: 1) existence of a fiduciary relationship, or an approximation of one, 2) a technology-user relationship that does not involve the expertise of the fiduciary, 3) existence of a critical health event or health status requiring use of a medical device, 4) ubiquitous sensitive data collection essential to AI functionality, 5) lack of reasonably similar analog technology alternatives, and 6) compulsory reliance on a medical device.
This paper makes three key contributions to the existing literature. First, it establishes the existence of a type of exploitation that is not only exacerbated by technology but made riskier by its use. Second, it illustrates the need for cross-disciplinary engagement between privacy scholarship and AI ethics goals, which typically involve representative data collection for fairness and safety. Third, it illustrates how a modern information fiduciary model can neutralize patient exploitation risk when exploitation exceeds normative bounds of community acceptability.