Packin & Smith on ESG, Crypto, And What Has The IRS Got To Do With It?

Nizan Geslevich Packin (Baruch College, Zicklin School of Business; CUNY Department of Law) and Sean Stein Smith (CUNY) have posted “ESG, Crypto, And What Has The IRS Got To Do With It?” (Stanford Journal of Blockchain Law & Policy: Forthcoming) on SSRN. Here is the abstract:

Regulation almost always lags behind innovation, and this is also the situation with many FinTech-based products and services, particularly those offered by crypto industry players. The crypto sector is new and innovative, and it has proven to be characterized not only by highly technical concepts but also by high levels of volatility and financial risk. In attempting to address the issues it raises in legal fields ranging from financial regulation, such as tax requirements, to environmental law, and specifically matters relating to climate change and energy waste, regulators often find themselves applying existing legal frameworks rather than creating new, clear rules. Much has been written about the SEC’s regime of regulation by enforcement of the crypto industry, and the impact of this type of rulemaking on businesses and persons. However, other financial regulators adopting a similar style of rulemaking, such as the IRS, have received much less attention for the impact of their regulatory actions. As noted in this Article, prominent industry associations continue to push back against applying existing tax law and protocols to specific crypto activities. One notable example, relevant in the Environmental, Social and Governance (ESG) awareness era, concerns the unintended consequences of the IRS’s regulation by enforcement, given the impact that such rules have on the transition to greener energy. This arises in the crypto industry in connection with proof-of-stake (PoS) consensus mechanisms, one of the two prominent transaction validation mechanisms. The PoS mechanism includes staking rewards, reward tokens generated for securing a PoS blockchain, which validators, also known as stakers, receive when they validate transactions.
The IRS has asserted that cryptocurrencies are property for income tax purposes, which means that every transaction results in a gain or loss equal to the difference between the price of the crypto asset at purchase and its price at sale.
In the case of the PoS mechanism’s staking rewards, the debate over whether the rewards should be classified as taxable income when received or when sold can also be framed as a choice between applying the existing tax code, word for word, to what many consider a new asset category, and eventually changing the code to reflect new economic models supported by some industry actors. This issue, which might seem minor, recently became the subject of a lawsuit that one validator brought against the IRS, arguing that staking rewards should be taxed when they are sold rather than when they are created, as the IRS has argued. Although just one court case with limited implications for other taxpayers, it illustrates the frustration felt by some market participants. In early October 2022, the court dismissed the case. Much like the Department of Justice several months earlier, the court found that Jarrett presented no case or controversy because the claim was moot: no issue remained unsettled, since the IRS had issued a full refund, including interest, as the taxpayers had initially requested. Ultimately, this ongoing debate highlights the following question: under a strict application of existing tax rules, staking rewards are taxable when received, but should that be the case? Practically speaking, following the dismissal of the Jarrett case, the immediate consequence is that taxpayers should assume that staking income is taxed as income at the time of receipt, unless the IRS explicitly excludes it in future tax guidance. That said, this Article argues that legal clarity is important and that regulation by enforcement is less equitable.
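The practical stakes of the timing question can be sketched in a few lines of code. This is an illustration only, not tax advice: it assumes hypothetical token quantities and prices and ignores rates, holding periods, and other real-world complications, simply contrasting the IRS’s tax-at-receipt position with the tax-at-sale position advanced by the validator.

```python
# Illustrative comparison of the two timing rules debated in the staking case.
# All numbers are hypothetical; real tax treatment involves rates, holding
# periods, and other factors omitted here.

def tax_at_receipt(reward_tokens, price_at_receipt, price_at_sale):
    """IRS position: rewards are income when received; a later sale then
    produces a capital gain or loss against that receipt-date basis."""
    ordinary_income = reward_tokens * price_at_receipt
    capital_gain = reward_tokens * (price_at_sale - price_at_receipt)
    return ordinary_income, capital_gain

def tax_at_sale(reward_tokens, price_at_sale):
    """Taxpayer position: newly created tokens are taxed only when sold."""
    return reward_tokens * price_at_sale

# A validator earns 100 tokens worth $10 each at receipt, later sold at $4.
income, gain = tax_at_receipt(100, 10.0, 4.0)
print(income)                  # 1000.0 recognized at receipt
print(gain)                    # -600.0 capital loss at sale
print(tax_at_sale(100, 4.0))   # 400.0 recognized only at sale
```

Note how the two rules diverge when prices fall between receipt and sale: under tax-at-receipt the validator recognizes income on value that has since evaporated, which is one source of the frustration the Article describes.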
Additionally, this Article argues that proper lawmaking is especially needed in this PoS-based staking situation, given the importance of incentivizing persons to use PoS mechanisms for environmental reasons, and the implications of financial regulation’s nudges for individual behavior and the promotion of ESG-based goals. Indeed, this clarity is needed because current ad hoc legal structures cannot keep pace with the crypto sector’s negative environmental impact and energy consumption issues, the same issues that prompted the Ethereum Merge. It is clear, therefore, that the law needs additional behavioral incentives to support and facilitate the promotion of greener environmental goals.

Martin & Parmar on What Firms Must Know Before Adopting AI

Kirsten Martin (Notre Dame) and Bidhan Parmar (U Virginia – Darden School of Business) have posted “What Firms Must Know Before Adopting AI: The Ethics of AI Transparency” on SSRN. Here is the abstract:

Firms have obligations to stakeholders that do not disappear when managers adopt AI decision systems. We introduce the concept of the AI knowledge gap, where AI provides limited information about its operations while stakeholder demands for information justifying firm decisions increase. We develop a framework of what firms must know about an AI model during the procurement process to ensure they understand how the model allows the firm to meet its existing obligations, including knowing the anticipated risks of using the AI decision system, how to prevent foreseeable risks, and how to plan for resilience. We argue that there are no conditions under which it is ethical to unquestioningly adopt recommendations from a black-box AI program within an organization. On this argument, adequate comprehension and knowledge of an AI model is not a negotiable design feature but a strategic and moral requirement.

G’sell on AI Judges

Florence G’sell (Sciences Po Law School) has posted “AI Judges” (Larry A. DiMatteo, Cristina Poncibò & Michal Cannarsa (eds.), The Cambridge Handbook of Artificial Intelligence: Global Perspectives on Law and Ethics, Cambridge University Press, 2022) on SSRN. Here is the abstract:

The prospect of a “robot judge” gives rise to many fantasies and concerns. Some argue that only humans are endowed with the modes of thought, intuition, and empathy necessary to analyze or judge a case. As early as 1976, Joseph Weizenbaum, creator of Eliza, one of the very first conversational agents, strongly asserted that important decisions should not be left to machines, which sorely lack human qualities such as compassion and wisdom. On the other hand, it could be argued today that the courts would be wrong to deprive themselves of the possibilities opened up by artificial intelligence tools, whose capabilities are expected to improve greatly in the future. In reality, the question of the use of AI in the judicial system should probably be framed in a nuanced way, without considering the dystopian and highly unlikely scenario of the “robot judge” portrayed by Trevor Noah in a famous episode of The Daily Show. Rather, the question is how courts can benefit from increasingly sophisticated machines. To what extent can these tools help them render justice? What is their contribution in terms of decision support? Can we seriously consider delegating to a machine the entire power to make a judicial decision?

This chapter proceeds as follows. Section 23.2 is devoted to the use of AI tools by the courts and is divided into three subsections. Section 23.2.1 deals with risk assessment tools, which are widespread in the United States but highly regulated in Europe, particularly in France. Section 23.2.2 presents the possibilities opened up by machine learning algorithms trained on databases of judicial decisions, which are able to anticipate court decisions or recommend solutions to judges. Section 23.2.3 considers the very unlikely eventuality of fully automating judicial decision making.