Selbst & Barocas on Unfair Artificial Intelligence: How FTC Intervention Can Overcome the Limitations of Discrimination Law

Andrew D. Selbst (UCLA School of Law) and Solon Barocas (Microsoft Research; Cornell University) have posted “Unfair Artificial Intelligence: How FTC Intervention Can Overcome the Limitations of Discrimination Law” (171 University of Pennsylvania Law Review, forthcoming). Here is the abstract:

The Federal Trade Commission has indicated that it intends to regulate discriminatory AI products and services. This is a welcome development, but its true significance has not been appreciated to date. This Article argues that the FTC’s flexible authority to regulate ‘unfair and deceptive acts and practices’ offers several distinct advantages over traditional discrimination law when applied to AI. The Commission can reach a wider range of commercial domains, a larger set of possible actors, a more diverse set of harms, and a broader set of business practices than are currently covered or recognized by discrimination law. For example, while most discrimination laws can address neither vendors that sell discriminatory software to decision-makers nor consumer products that work less well for certain demographic groups than others, the Commission could address both. The Commission’s investigative and enforcement powers can also overcome many of the practical and legal challenges that have limited plaintiffs’ ability to successfully seek remedies under discrimination law. The Article demonstrates that the FTC has the existing authority to address the harms of discriminatory AI and offers a method for the Commission to tackle the problem, based on its existing approach to data security.

Gursoy, Kennedy & Kakadiaris on A Critical Assessment of the Algorithmic Accountability Act of 2022

Furkan Gursoy (University of Houston), Ryan Kennedy (same), and Ioannis Kakadiaris (same) have posted “A Critical Assessment of the Algorithmic Accountability Act of 2022” on SSRN. Here is the abstract:

On April 3rd, 2022, a group of US lawmakers introduced the Algorithmic Accountability Act of 2022. The legislation proposed by Democrats requires sizable entities to conduct impact assessments for automated decision systems deployed for a set of critical decisions. The bill comes at an opportune moment because algorithms have an increasingly substantial influence on human lives, and topics such as transparency, explainability, fairness, privacy, and security are receiving growing attention from researchers and practitioners. This article examines the bill by (i) developing a critical summary and (ii) identifying and presenting ten ambiguities and potential shortcomings for debate. This study paves the way for further discussions to shape algorithmic accountability regulations.

Abraha on The Role of Article 88 GDPR in Upholding Privacy in the Workplace

Halefom H. Abraha (University of Oxford) has posted “A pragmatic compromise? The role of Article 88 GDPR in upholding privacy in the workplace” on SSRN. Here is the abstract:

The distinct challenges of data processing at work have led to long-standing calls for sector-specific regulation. This leaves the European legislature with a dilemma. While the distinct features of employee data processing give rise to novel issues that cannot adequately be addressed by an omnibus data protection regime, a combination of legal, political, and constitutional factors have hindered efforts towards adopting harmonised employment-specific legislation at the EU level. The ‘opening clause’ in Art. 88 GDPR aims to square this circle. It aims to ensure adequate and consistent protection of employees while also promoting regulatory diversity, respecting national peculiarities, and protecting Member State autonomy. This paper examines whether the opening clause has delivered on its promises. It argues that while the compromise has delivered on some of its promises in promoting diverse and innovative regulatory approaches, it also runs counter to the fundamental objectives of the GDPR itself by creating further fragmentation, legal uncertainty, and inconsistent implementation, interpretation, and enforcement of data protection rules.

Goforth on Critiquing the SEC’s On-Going Efforts to Regulate Crypto Exchanges

Carol R. Goforth (U Arkansas Law) has posted “Critiquing the SEC’s On-Going Efforts to Regulate Crypto Exchanges” (14 William & Mary Business Law Review (Forthcoming 2022)) on SSRN. Here is the abstract:

Despite the so-called “Crypto Winter” in the spring of 2022, which saw a deep plunge in global crypto markets, interest in the appropriate way to develop, use, and regulate cryptoassets and crypto-based businesses continues to be high. In the U.S., a Presidential Executive Order and multiple bills that seek to tackle various issues of crypto regulation are regularly highlighted in the news, suggesting the appropriate treatment of crypto is a growing national priority. Despite these discussions, which tend to focus on finding a balanced way to regulate those within the industry without stifling the technology, the Securities and Exchange Commission (SEC) continues to seek to assert its jurisdiction unilaterally. A pending proposal from the SEC, misleadingly characterized as an attempt to regulate trading in government securities, would broaden the definition of “exchange” with potentially destructive consequences. This Article carefully considers the existing definition of “exchange” under the Securities Exchange Act of 1934 (the ’34 Act), and then examines a proposal from the Commission that would substantially broaden the current interpretation to reach a much larger group of persons involved in trading cryptoassets without adding clarity or a path to compliant operation for such persons. It then evaluates why the proposal creates problems, identifying a number of such issues, before concluding that a better approach would be to allow the legislative process to play out.


Hollis & Raustiala on The Global Governance of the Internet

Duncan B. Hollis (Temple University Law) and Kal Raustiala (UCLA Law) have posted “The Global Governance of the Internet” (in Duncan Snidal & Michael N. Barnett (eds.), The Oxford Handbook of International Institutions (2023)) on SSRN. Here is the abstract:

This essay surveys Internet governance as an international institution. We focus on three key aspects of information and communication technologies. First, we highlight how, unlike natural commons such as sea or space, digital governance involves a socio-technical system with a man-made architecture reflecting particular and contingent technological choices. Second, we explore how private actors historically played a significant role in making such choices, leading to the rise of existing “multistakeholder” governance frameworks. Third, we examine how these multistakeholder structures favored by the U.S. and its technology companies have come under increasing pressure from multilateral competitors, particularly those championed by China under the banner of “internet sovereignty,” as well as more modest efforts by the European Union to employ an approach akin to “embedded liberalism” for digital governance. The future of the Internet turns on how what we term the Californian, Chinese, and Carolingian visions of Internet governance compete, evolve, and interact. Thus, this essay characterizes Internet governance as a heterogeneous, dynamic, multi-layered set of principles, regimes and institutions—a regime complex—that not only governs cyberspace today, but has adapted and transformed along pathways that may serve as signposts for international institutions that regulate other global governance challenges.


Lubin on The Law and Politics of Ransomware

Asaf Lubin (Indiana U Maurer School of Law; Berkman Klein; Yale ISP; Federmann Cybersecurity Center, Hebrew U Law) has posted “The Law and Politics of Ransomware” (Vanderbilt Journal of Transnational Law, Vol. 55, 2022) on SSRN. Here is the abstract:

What do Lady Gaga, the Royal Zoological Society of Scotland, the city of Valdez in Alaska, and the court system of the Brazilian state of Rio Grande do Sul all have in common? They have all been victims of ransomware attacks, which are growing both in number and severity. In 2016, hackers perpetrated roughly 4,000 ransomware attacks a day worldwide, a figure which was already alarming. By 2020, however, “attacks leveled out at 20,000 to 30,000 per day in the US alone.” That is a ransomware attack every 11 seconds, each of which cost victims on average 19 days of network downtime and a payout of over $230,000. In 2021, global costs associated with ransomware recovery exceeded $20 billion.

This Article offers an account of the regulatory challenges associated with ransomware prevention. Situated within the broader literature on underenforcement, the Article explores the core causes for the limited criminalization, prosecution, and international cooperation that have exacerbated this wicked cybersecurity problem. In particular, the Article examines the resource allocation, forensic, managerial, jurisdictional, and informational challenges that have plagued the fight against digital extortions in the global commons.

To address these challenges the Article makes the case for the international criminalization of ransomware. Relying on existing international regimes––namely, the 1979 Hostage Taking Convention, the 2000 Convention Against Transnational Crime, and the customary prohibition against the harboring of terrorists––the Article makes the claim that most ransomware attacks are already criminalized under existing international law. In fact, the Article draws on historical analysis to portray the criminalization of ransomware as a “fourth generation” in the outlawry of Hostis Humani Generis (enemies of mankind).

The Article demonstrates the various opportunities that could arise from treating ransomware gangs as international criminals subject to universal jurisdiction. The Article focuses on three immediate consequences that could arise from such international criminalization: (1) Expanding policies for naming and shaming harboring states; (2) Authorizing extraterritorial cyber enforcement and prosecution; and (3) Advancing strategies for strengthening cybersecurity at home.

Grafenstein on the Various Draft Data Acts of the EU

Max Grafenstein (Humboldt Institute for Internet and Society, Berlin University of the Arts) has posted “Reconciling Conflicting Interests in Data through Data Governance. An Analytical Framework (and a Brief Discussion of the Data Governance Act Draft, the Data Act Draft, the AI Regulation Draft, as well as the GDPR)” on SSRN. Here is the abstract:

In the current European debate on how to tap the potential of data-driven innovation, data governance is seen to play a key role. However, if one tries to understand what the discussants actually mean by the term data governance, one quickly gets lost in a semantic labyrinth with abrupt dead ends: Either the concrete meaning remains unclear or when an explicit definition is given, it hardly describes the challenges, which are considered essential in this article, at least within the highly regulated EU Single Market. The terminological and conceptual ambiguity makes it difficult to adequately describe certain challenges for data governance and to compare corresponding solution mechanisms in terms of their conditions for success. This article, therefore, critically examines and further develops elements of data governance concepts currently discussed in Information Systems literature to better capture challenges for data governance with particular respect to data-driven innovation and conflicting interests, especially those protected by legal rights. To reach this aim, the article elaborates on a refined data governance framework that reflects practical experience and theoretical considerations particularly from the field of data protection and regulation of innovation. Against this background, the outlook briefly assesses the most relevant current draft laws of the EU Commission, namely: the Data Governance Act, the Data Act and the AI Regulation (especially the last one concerning the General Data Protection Regulation).

Murtazashvili et al. on Blockchain Networks as Knowledge Commons

Ilia Murtazashvili (U Pitt – GSPIA), Jennifer Brick Murtazashvili (same), Martin B. H. Weiss (U Pitt – School of Computing and Information), and Michael J. Madison (U Pitt Law) have posted “Blockchain Networks as Knowledge Commons” (International Journal of the Commons, Vol. 16, p. 108, 2022) on SSRN. Here is the abstract:

Researchers interested in blockchains are increasingly attuned to questions of governance, including how blockchains relate to government, the ways blockchains are governed, and ways blockchains can improve prospects for successful self-governance. Our paper joins this research by exploring the implications of the Governing Knowledge Commons (GKC) framework to analyze governance of blockchains. Our novel contributions are making the case that blockchain networks represent knowledge commons governance, in the sense that they rely on collectively managed technologies to pool and manage distributed information, illustrating the usefulness and novelty of the GKC methodology with an empirical case study of the evolution of Bitcoin, and laying the foundation for a research program using the GKC approach.

Botero Arcila on The Case for Local Data Sharing Ordinances

Beatriz Botero Arcila (Sciences Po Law; Harvard Berkman Klein) has posted “The Case for Local Data Sharing Ordinances” (William & Mary Bill of Rights Journal) on SSRN. Here is the abstract:

Cities in the US have started to enact data-sharing rules and programs to access some of the data that technology companies operating under their jurisdiction – like short-term rental or ride hailing companies – collect. This information allows cities to adapt to the challenges and benefits of the digital information economy. It allows them to understand what their impact is on congestion, the housing market, the local job market and even the use of public spaces. It also empowers them to act accordingly by, for example, setting vehicle caps or mandating a tailored minimum pay for gig-workers. These companies, however, sometimes argue that sharing this information infringes on their users’ privacy rights and on their own privacy rights, because this information is theirs; it’s part of their business records. The question is thus what those rights are, and whether it should and could be possible for local governments to access that information to advance equity and sustainability, without harming the legitimate privacy interests of both individuals and companies. This Article argues that within current Fourth Amendment doctrine and privacy law there is space for data-sharing programs. Privacy law, however, is being mobilized to alter the distribution of power and welfare between local governments, companies, and citizens within current digital information capitalism to extend those rights beyond their fair share and preempt permissible data-sharing requests. The Article warns that if the companies succeed in their challenges, privacy law will have helped shield corporate power from regulatory oversight, while still leaving individuals largely unprotected and further subordinating local governments to corporate interests.

Richards on The GDPR as Privacy Pretext and the Problem of Co-Opting Privacy

Neil M. Richards (Washington U Law) has posted “The GDPR as Privacy Pretext and the Problem of Co-Opting Privacy” (73 Hastings Law Journal 1511 (2022)) on SSRN. Here is the abstract:

Privacy and data protection law’s expansion brings with it opportunities for mischief as privacy rules are used pretextually to serve other ends. This Essay examines the problem of such co-option of privacy using a case study of lawsuits in which defendants seek to use the EU’s General Data Protection Regulation (“GDPR”) to frustrate ordinary civil discovery. In a series of cases, European civil defendants have argued that the GDPR requires them to redact all names from otherwise valid discovery requests for relevant evidence produced under a protective order, thereby turning the GDPR from a rule designed to protect the fundamental data protection rights of European Union (EU) citizens into a corporate litigation tool to frustrate and delay the production of evidence of alleged wrongdoing.

This Essay uses the example of pretextual GDPR use to frustrate civil discovery to make three contributions to the privacy literature. First, it identifies the practice of defendants attempting strategically to co-opt the GDPR to serve their own purposes. Second, it offers an explanation of precisely why and how this practice represents not merely an incorrect reading of the GDPR, but more broadly, a significant departure from its purposes—to safeguard the fundamental right of data protection secured by European constitutional and regulatory law. Third, it places the problem of privacy pretexts and the GDPR in the broader context of the co-option of privacy rules more generally, offers a framework for thinking about such efforts, and argues that this problem is only likely to deepen as privacy and data protection rules expand through the ongoing processes of reform.