Lawyers might wonder how President Biden’s recent Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence affects the legal profession. The order aims to promote the responsible development, adoption, and equitable use of AI across the federal government and the private sector. That push will drive demand for legal expertise, and lawyers can expect many more AI-related opportunities and challenges in their work.
Key aspects of Biden’s order that will require legal guidance include:
Regulating AI Development and Disclosures. Companies developing powerful AI models will be required to report on their development efforts and submit the safety results of their red teaming (adversarial testing designed to uncover flaws and vulnerabilities). The National Institute of Standards and Technology (NIST) will develop the safety standards, and the executive order more broadly promotes resources, standards, and guidelines for building trustworthy AI. Once NIST’s new standards for AI systems emerge, lawyers assisting clients who develop or use AI tools may need to ensure those clients align with NIST’s guidelines.
Algorithmic Discrimination Audits. The order directs federal agencies to perform regular equity assessments of AI systems and to investigate complaints of algorithmic discrimination, necessitating legal oversight. It promotes guidance for investigating and prosecuting AI-related civil rights violations and calls for best practices on using AI in the criminal justice system to enhance its fairness and efficiency. It also calls for combating discrimination and bias against protected groups in consumer financial and housing markets. Expect greater scrutiny and potential regulation of AI in these areas, along with resulting litigation.
Vendor AI Compliance. Government contractors providing AI products and services must meet new model risk management requirements that legal teams will need to help navigate.
Transparency Standards. Strict new transparency standards for datasets and models will require legal review to properly implement without compromising protected information.
AI Ethics Framework. Adopting a new federal AI ethics framework with criteria like fairness and safety will require legal advice to actualize and enforce.
Automated Systems Regulation. Potential new federal regulations governing automated systems could create major compliance issues that legal counsel will be tasked with navigating.
Cybersecurity & Data Privacy. A strong emphasis on cybersecurity and privacy will drive the need for legal expertise in data protection and AI system security.
Immigration. The executive order directs agencies to update and streamline visa criteria to favor artificial intelligence experts seeking to study or work in the U.S.
This sweeping AI order ushers in a new era of scrutiny around responsible AI development and use, with the goal of making AI safer and more secure. Entities integrating AI into federal systems or contracting with the government will urgently seek legal guidance to navigate the executive order’s extensive new requirements. NIST will take the lead in developing guidelines and best practices for “developing and deploying safe, secure, and trustworthy AI systems.” Those guidelines may one day become the industry standard, so it is imperative that lawyers stay current and inform their clients about new AI guidance and regulations. Lawyers will engage with federal agencies as standards and rules are set, and will assist their clients in this growing field. Although the deadlines in the executive order will delay implementation of the proposed guidelines until 2025, the robust ethics boundaries, audits, reporting requirements, and transparency obligations will spur demand for expert legal counsel. Law firms may want to train their legal professionals on AI governance, ethics, innovation, and implementation, as a new specialty in the law is emerging.