Packin on Financial Inclusion Gone Wrong: Securities Trading for Children

Nizan Geslevich Packin (City University of NY, Baruch College, Zicklin School of Business; City University of New York (CUNY) – Department of Law) has posted “Financial Inclusion Gone Wrong: Securities Trading For Children” on SSRN. Here is the abstract:

For the majority of Americans, money is the primary source of anxiety. It takes an especially heavy toll on parents and younger adults: more than three quarters of parents, millennials, and Gen Xers frequently experience emotional stress about money, and almost 90 percent of Americans believe that nothing could make them happier than knowing that their finances are in order. Looking for ways to help Americans deal with this source of anxiety, some believe that teaching children about financial investments at a young age can increase financial literacy and lower people’s money-related anxiety.

There are many different ways to increase consumers’ financial literacy starting at a young age. Yet, despite the resources offered by both the public and private sectors, and the clear need for caregivers to focus on this issue, studies show that most people do very little, if anything, to increase financial literacy. In recent years, identifying the educational market opportunity and the potential of the children’s financial-literacy business niche, FinTech companies and even traditional financial institutions have started offering financial services for children, reassured by research showing that by age six, children are already veteran consumers of smart-device content.

The potential of this new market’s clientele is valuable for two reasons. First, having more customers is always a good thing. Second, these new customers will eventually mature into more traditional adult customers, and presumably, they will continue using the services with which they are familiar. And while there are some legal challenges associated with children not only entering into investment contracts, but also doing so online, this new market will continue to grow, because offering financial services to children is becoming more socially acceptable for members of Gen Z and Gen Alpha than ever before, especially given society’s newly adopted paradigm for describing, understanding, and shaping children’s rights, domestic relationships and custodial status, and even digital purchasing power.

But although digital financial apps can help educate children about the value of money, the importance of investing, and even the risks of trading, the trend of offering financial services directly to children should concern anyone focused on consumer protection and financial regulation. Among those are the SEC, its new Chair, and other public figures who are tasked with regulating these issues and who have raised concerns about financial service providers’ growing interest in younger users, an interest that became more apparent during the January 2021 Robinhood/GameStop stock controversy. Likewise, FINRA, which enables investors and firms to participate in the market with confidence by safeguarding its integrity, announced that it is seeking public feedback on the gamification used by financial service providers, having identified “risks associated with app-based platforms and ‘game-like’ features that are meant to influence customers.”

Using ethical reasoning and behavioral economics tools, this Essay explores several important issues as it suggests regulating FinTech companies’ and other financial institutions’ offerings of digital financial services to children. First, digital gaming is well known to be addictive for children. Second, gamifying investing makes it feel less serious, not more serious, contravening the very notion that early education will help young adults understand the seriousness of money; moreover, there is a connection between gamification and gambling that is especially relevant to gamified investing. Third, children’s financial choices are more susceptible to the influence of outside, interested (and uninterested) parties. Lastly, parents already struggle to keep up with supervising their children’s online activities, and enabling children to use digital financial apps will require much more effort on parents’ part, as there are many things they will need to be on the lookout for.

Gervais on The Human Cause

Daniel J. Gervais (Vanderbilt University – Law School) has posted “The Human Cause” on SSRN. Here is the abstract:

This paper argues that, although AI machines are increasingly able to produce outputs that facially qualify for copyright or patent protection, such outputs should not be protected by law when they have no identifiable human cause, that is, when the autonomy of the machine is such that it breaks the causal link between the output and one or more human creators or inventors. As a species, we should normatively seek to preserve incentives for human creativity and inventiveness, as these have been hallmarks of the higher mental faculties often used to define humanness. The paper also discusses situations where humans and machines work together and how courts can apply the proposed approach.


Ziaja on Algorithm Assisted Decision Making and Environmental Law

Sonya Ziaja (University of Baltimore – School of Law) has posted “How Algorithm Assisted Decision Making Is Influencing Environmental Law and Climate Adaptation” (Ecology Law Quarterly, vol. 48, no. 3, forthcoming 2021) on SSRN. Here is the abstract:

Algorithm-based decision tools in environmental law appear policy neutral but embody biases and hidden values that affect equity and democracy. In effect, algorithm-based tools are new fora for law and policymaking, distinct from legislatures and courts. In turn, these tools influence the development and implementation of environmental law and regulation. As a practical matter, there is a pressing need to understand how these automated decision-making tools interact with and influence law and policy. This Article begins this timely and critical discussion.

Though algorithmic decision-making has been critiqued in other domains, like policing and housing policy, environmental and energy policy may be more dependent on algorithmic tools because of climate change. Expectations of climatic stationarity—for example, how frequently or severely a coastal area floods, or how many days of extreme heat an energy system needs to anticipate—are no longer valid. Algorithm-based tools are needed to make sense of possible future scenarios in an unstable climate. However, dependence on these tools brings with it a conflict between technocracy (and the need to rapidly adapt and respond to climate change) and democratic participation, which is fundamental to equity. This Article discusses sources of that tension within environmental, algorithm-based tools and offers a pathway forward to integrate values of equity and democratic participation into these tools.

After introducing the problem of water and energy adaptation to climate change, this Article synthesizes prior multidisciplinary work on algorithmic decision making and modeling-informed governance—bringing together the works of early climate scientists and contemporary leaders in algorithmic decision making. From this synthesis, this Article presents a framework for analyzing how well these tools integrate principles of equity, including procedural and substantive fairness—both of which are essential to democracy. The framework evaluates how the tools handle uncertainty, transparency, and stakeholder collaboration across two attributes. The first attribute has to do with the model itself—specifically, how, and whether, existing law and policy are incorporated into these tools. These social parameters can be incorporated as inputs to the model or in the structure of the model, which determines its logic. The second attribute has to do with the modeling process—how, and whether, stakeholders and end-users collaborated in the model’s development.

This Article then applies this framework to compare two algorithm-assisted decision-making tools currently in use for adapting water and energy systems to climate change. The first tool, “INFORM,” is used to allocate water quantity and flow on the Sacramento River while taking climate and weather into account. The second tool, “RESOLVE,” is used by energy utility regulators in California to evaluate scenarios for energy generation. Although the development of both tools involved collaborative processes, there are meaningful distinctions in the history of their development and use. The comparison indicates that how law and policy are incorporated into the underlying code of models influences the development and regulation of climate adaptation, while inclusiveness and collaboration during a model’s development influence its perceived usefulness and adoption. Both conclusions have implications for the equity and accessibility of environmental, natural resource, and energy planning.