Ashley on Teaching Law and Digital Age Legal Practice with an AI and Law Seminar

Kevin Ashley (U Pitt Law) has posted “Teaching Law and Digital Age Legal Practice with an AI and Law Seminar: Justice, Lawyering and Legal Education in the Digital Age” (Chicago-Kent Law Review, Vol. 88, p. 783, 2013) on SSRN. Here is the abstract:

A seminar on Artificial Intelligence (“AI”) and Law can teach law students lessons about legal reasoning and legal practice in the digital age. AI and Law is a subfield of AI/computer science research that focuses on designing computer programs—computational models—that perform legal reasoning. These computational models are used in building tools to assist in legal practice and pedagogy and in studying legal reasoning in order to contribute to cognitive science and jurisprudence. Today, subject to a number of qualifications, computer programs can reason with legal rules, apply legal precedents, and even argue like a legal advocate.

This article provides a guide and examples to prepare law students for the digital age by means of an AI and Law seminar. After introducing the science of Artificial Intelligence and its application to law, the paper presents the syllabus for an up-to-date AI and Law seminar. With the syllabus as a framework, the paper showcases some characteristic AI and Law programs and illustrates the pedagogically important lessons that AI and Law has learned about reasoning with legal rules and cases, about legal argument, and about the digital document technologies that are becoming available, and even the norm, in legal practice.

Davis on Robolawyers and Robojudges

Joshua P. Davis (University of San Francisco – School of Law) has posted “Of Robolawyers and Robojudges” (Hastings Law Journal, 2022) on SSRN. Here is the abstract:

Artificial intelligence (AI) may someday play various roles in litigation, particularly complex litigation. It may be able to provide strategic advice, advocate through legal briefs and in court, help judges assess class action settlements, and propose or impose compromises. It may even write judicial opinions and decide cases. For it to perform those litigation tasks, however, would require two breakthroughs: one involving a form of instrumental reasoning that we might loosely call common sense or more precisely call abduction and the other involving a form of reasoning that we will label purposive, that is, the formation of ends or objectives. This Article predicts that AI will likely make strides at abductive reasoning but not at purposive reasoning. If those predictions prove accurate, it contends, AI will be able to perform sophisticated tasks usually reserved for lawyers, but it should not be trusted to perform similar tasks reserved for judges. In short, we might welcome a role for robolawyers but resist the rise of robojudges.

Grimmelmann & Windawi on Blockchains as Infrastructure and Semicommons

James Grimmelmann (Cornell Law School; Cornell Tech) & A. Jason Windawi (Princeton University – Department of Sociology) have posted “Blockchains as Infrastructure and Semicommons” (William & Mary Law Review (2023, Forthcoming)) on SSRN. Here is the abstract:

Blockchains are not self-executing machines. They are resource systems, designed by people, maintained by people, and governed by people. Their technical protocols help to solve some difficult problems in shared resource management, but behind those protocols there are always communities of people struggling with familiar challenges in governing their provision and use of common infrastructure.

In this Essay, we describe blockchains as shared, distributed transactional ledgers using two frameworks from commons theory. Brett Frischmann’s theory of infrastructure provides an external view, showing how blockchains provide useful, generic infrastructure for recording transactions, and why that infrastructure is most naturally made available on common, non-discriminatory terms. Henry Smith’s theory of semicommons provides an internal view, showing how blockchains intricately combine private resources (such as physical hardware and on-chain assets) with common resources (such as the shared transactional ledger and the blockchain protocol itself). We then detail how blockchains struggle with many of the governance challenges that these frameworks predict, requiring blockchain communities to engage in extensive off-chain governance work to coordinate their uses and achieve consensus. Blockchains function as infrastructure and semicommons not in spite of the human element, but because of it.

Husovec & Roche Laguna on the Digital Services Act: A Short Primer

Martin Husovec (London School of Economics – Law School) and Irene Roche Laguna (European Commission) have posted “Digital Services Act: A Short Primer” (in Principles of the Digital Services Act (Oxford University Press, Forthcoming 2023)) on SSRN. Here is the abstract:

This article provides a short primer on the forthcoming Digital Services Act. The DSA is an EU Regulation aiming to assure fairness, trust, and safety in the digital environment. It preserves and upgrades the liability exemptions for online intermediaries that have existed in the European framework since 2000. It exempts digital infrastructure-layer services, such as internet access providers, and application-layer services, such as social networks and file-hosting services, from liability for third-party content. Simultaneously, the DSA imposes due diligence obligations concerning the design and operation of such services, in order to ensure a safe, transparent and predictable online ecosystem. These due diligence obligations aim to regulate the general design of services, content moderation practices, advertising, and transparency, including the sharing of information. The due diligence obligations focus mainly on process and design rather than on content itself, and usually correspond to the size and social relevance of various services. Very large online platforms and very large online search engines bear the most extensive risk mitigation responsibilities, which are subject to independent auditing.

Kaminski on Technological ‘Disruption’ of the Law’s Imagined Scene

Margot E. Kaminski (U Colorado Law; Yale ISP; U Colorado – Silicon Flatirons Center for Law, Technology, and Entrepreneurship) has posted “Technological ‘Disruption’ of the Law’s Imagined Scene: Some Lessons from Lex Informatica” (Berkeley Technology Law Journal, Vol. 36, 2022) on SSRN. Here is the abstract:

Joel Reidenberg, in his 1998 article Lex Informatica, observed that technology can be a distinct regulatory force in its own right and claimed that law would arise in response to human needs. Today, law and technology scholarship continues to ask: does technology ever disrupt the law? This Article articulates one particular kind of “legal disruption”: how technology (or really, the social use of technology) can alter the imagined setting around which policy conversations take place—what Jack Balkin and Reva Siegel call the “imagined regulatory scene.” Sociotechnical change can alter the imagined regulatory scene’s architecture, upsetting a policy balance and undermining a particular regulation or regime’s goals. That is, sociotechnical change sometimes disturbs the imagined paradigmatic scenario not by departing from it entirely but by constraining, enabling, or mediating actors’ behavior that we want the law to constrain or protect. This Article identifies and traces this now common move in recent law and technology literature, drawing on Reidenberg’s influential and prescient work.

Hartzog & Richards on Legislating Data Loyalty

Woodrow Hartzog (Boston U Law; Stanford Center for Internet and Society) and Neil M. Richards (Washington U Law; Yale ISP; Stanford Center for Internet and Society) have posted “Legislating Data Loyalty” (97 Notre Dame Law Review Reflection 356 (2022)) on SSRN. Here is the abstract:

Lawmakers looking to embolden privacy law have begun to consider imposing duties of loyalty on organizations trusted with people’s data and online experiences. The idea behind loyalty is simple: organizations should not process data or design technologies that conflict with the best interests of trusting parties. But the logistics and implementation of data loyalty need to be developed if the concept is going to be capable of moving privacy law beyond its “notice and consent” roots to confront people’s vulnerabilities in their relationship with powerful data collectors.

In this short Essay, we propose a model for legislating data loyalty. Our model takes advantage of loyalty’s strengths—it is well-established in our law, it is flexible, and it can accommodate conflicting values. Our Essay also explains how data loyalty can embolden our existing data privacy rules, address emergent dangers, solve privacy’s problems around consent and harm, and establish an antibetrayal ethos as America’s privacy identity.

We propose that lawmakers use a two-step process to (1) articulate a primary, general duty of loyalty, then (2) articulate “subsidiary” duties that are more specific and sensitive to context. Subsidiary duties regarding collection, personalization, gatekeeping, persuasion, and mediation would target the most opportunistic contexts for self-dealing and result in flexible open-ended duties combined with highly specific rules. In this way, a duty of data loyalty is not just appealing in theory—it can be effectively implemented in practice just like the other duties of loyalty our law has recognized for hundreds of years. Loyalty is thus not only flexible, but it is capable of breathing life into America’s historically tepid privacy frameworks.

Chen on How Equalitarian Regulation of Online Hate Speech Turns Authoritarian: A Chinese Perspective

Ge Chen (Durham Law School) has posted “How Equalitarian Regulation of Online Hate Speech Turns Authoritarian: A Chinese Perspective” ((2022) Journal of Media Law 14(1)) on SSRN. Here is the abstract:

This article reveals how the heterogeneous legal approaches to balancing online hate speech against equality rights in liberal democracies have informed China in its manipulative speech regulation. In an authoritarian constitutional order, the regulation of hate speech is politically relevant only because the hateful topics are related to regime-oriented concerns. The article elaborates on the infrastructure of an emerging authoritarian regulatory patchwork of online hate speech in the global context and identifies China’s unique approach of restricting political content under the aegis of protecting equality rights. Ultimately, both the regulation and dis-regulation of online hate speech form a statist approach that deviates from the paradigm protective of equality rights in liberal democracies and serves to fend off open criticism of government policies and public discussion of topics that potentially contravene the mainstream political ideologies.

Whittaker on Corporate Capture of AI

Meredith Whittaker (NYU) has posted “The Steep Cost of Capture” on SSRN. Here is the abstract:

In considering how to tackle this onslaught of industrial AI, we must first recognize that the “advances” in AI celebrated over the past decade were not due to fundamental scientific breakthroughs in AI techniques. They were and are primarily the product of significantly concentrated data and compute resources that reside in the hands of a few large tech corporations. Modern AI is fundamentally dependent on corporate resources and business practices, and our increasing reliance on such AI cedes inordinate power over our lives and institutions to a handful of tech firms. It also gives these firms significant influence over both the direction of AI development and the academic institutions wishing to research it.

This means that tech firms are startlingly well positioned to shape what we do—and do not—know about AI and the business behind it, at the same time that their AI products are working to shape our lives and institutions.

Examining the history of the U.S. military’s influence over scientific research during the Cold War, we see parallels to the tech industry’s current influence over AI. This history also offers alarming examples of the way in which U.S. military dominance worked to shape academic knowledge production, and to punish those who dissented.

Today, the tech industry is facing mounting regulatory pressure, and is increasing its efforts to create tech-positive narratives and to silence and sideline critics in much the same way the U.S. military and its allies did in the past. Taken as a whole, we see that the tech industry’s dominance in AI research and knowledge production puts critical researchers and advocates within, and beyond, academia in a treacherous position. This threatens to deprive frontline communities, policymakers, and the public of vital knowledge about the costs and consequences of AI and the industry responsible for it—right at the time that this work is most needed.

Botero Arcila & Groza on the EU Data Act

Beatriz Botero Arcila (Sciences Po Law; Harvard Berkman Klein) and Teodora Groza (Sciences Po Law) have posted “Comments to the Data Act from the Law and Technology Group of Sciences Po Law School” on SSRN. Here is the abstract:

These are comments submitted by members of the Law and Technology Group of Sciences Po Law School on the Proposal for a Regulation of the European Parliament and the Council on harmonised rules on fair access to and use of data (“Data Act”), open for consultation and feedback from stakeholders from 14 March to 13 May 2022.

We welcome the Commission’s initiative and share the general concern and idea that data concentration and barriers to data sharing contribute to the concentration of the digital economy in Europe. Similarly, based on our own research, we share the Commission’s diagnosis that legal and technical barriers prevent different actors from entering into voluntary data-sharing agreements and transactions.

In general, we believe the Data Act is a good initiative that will ease some of the barriers that exist in the European market and facilitate the creation of value from data by different stakeholders, not only those who produce it. In this document, however, we focus on five key clarifications that should be taken into account to further achieve this goal: (1) relieving the user of the burden of the “data-sharing” mechanism, as this mechanism may be asking users to act beyond their rational capabilities; (2) the definition of the market as the one for related services fails to unlock the competitive potential of data sharing and might increase concentration in the primary markets for IoT devices; (3) service providers need to nudge users into sharing their data; (4) the difficulty of working with the personal – non-personal data binary suggested by the act; and (5) the obligation to make data available to public sector bodies sets a bar that may be too hard to meet and may hamper the usefulness of this provision.

Desai & Lemley on Scarcity, Regulation, and the Abundance Society

Deven R. Desai (Georgia Institute of Technology – Scheller College of Business) and Mark A. Lemley (Stanford Law School) have posted “Scarcity, Regulation, and the Abundance Society” on SSRN. Here is the abstract:

New technologies continue to democratize, decentralize, and disrupt production, offering the possibility that scarcity will be a thing of the past for many industries. We call these technologies of abundance. But our economy and our legal institutions are based on scarcity.

Abundance lowers costs. When that happens, the elimination of scarcity changes the economics of how goods and services are produced and distributed. This doesn’t just follow a normal demand curve pattern – consumption increases as price declines. Rather, special things happen when costs approach zero.

Digitization and its effects on the production, organization, and distribution of information provide early examples of changes to markets and industries. Copyright industries went through upheaval and demands for new protections. But they are not alone. New technologies such as 3D printing, CRISPR, artificial intelligence, synthetic biology, and more are democratizing, decentralizing, and disrupting production in food and alcohol production, biotechnologies, and more, and even the production of innovation itself, opening the prospect of an abundance society in which people can print or otherwise obtain the things they want, including living organisms, on-demand.

Abundance changes the social as well as economic context of markets. How will markets and legal institutions based on scarcity react when it is gone? Will we try to replicate that scarcity by imposing legal rules, as IP law does? Will the abundance of some things just create new forms of scarcity in others – the raw materials that feed 3D printers, for instance, or the electricity needed to feed AIs and cryptocurrency? Will we come up with new forms of artificial scarcity, as brands and non-fungible tokens (NFTs) do? Or will we reorder our economics and our society to focus on things other than scarcity? If so, what will that look like? And how will abundance affect the distribution of resources in society? Will we reverse the long-standing trend towards greater income inequality? Or will society find new ways to distinguish the haves from the have-nots?

Society already has examples of each type of response. The copyright industries survived the end of scarcity, and indeed thrived, not by turning to the law but by changing business practices, leveraging the scarcity inherent to live performances and using streaming technology to remove the market structures that fed unauthorized copying, and by reorganizing around distribution networks rather than content creators. Newsgathering, reporting, and distribution face challenges flowing from democratized, decentralized, and disrupted production. Luxury brands and NFTs offer examples of artificial scarcity created to reinforce a sort of modern sumptuary code. And we have seen effective, decentralized production based on economics of abundance in examples ranging from open-source software to Wikipedia.

In this introductory essay, we survey the potential futures of a post-scarcity society and offer some thoughts as to more (and less) socially productive ways to respond to the death of scarcity.