Board Oversight of AI: Do Boards Need AI Experts?

15 April 2026

As the use of artificial intelligence (AI) across industries increases rapidly, many boards of directors are considering whether they have the expertise necessary to maintain effective oversight of AI-related opportunities and risks. As the SEC has made clear regarding cybersecurity, boards must find a way to exercise their supervisory obligations, even in technical areas, if those areas present enterprise risks. A frequent question in this context is whether boards should have a director who is an “AI expert.”

In this Debevoise Update, we highlight three considerations for boards evaluating the need for AI expertise in the boardroom.

Adding a Dedicated AI Expert May Present Challenges. While appointing a director with AI expertise may be appealing, doing so can present practical and governance challenges. First, the pool of individuals with both deep AI expertise and the qualifications to serve effectively as a public company director is limited. Second, the percentage of companies for which AI is so fundamental to the business that it requires an AI expert on the board is very small. The appointment of a director with AI expertise could also invite questions about the board's lack of designated expertise in other areas of potential enterprise risk (e.g., cybersecurity, political or environmental risks).

Third, the presence of a designated expert may inadvertently undermine effective board dynamics. For example, other directors may defer excessively to an AI expert, reducing the level of constructive challenge and debate that is critical to effective oversight. This dynamic can undermine the collective decision-making that is at the heart of board function and weaken the board's ability to independently assess management's approach to AI. Over time, concentrating AI knowledge in a single director may also reduce other directors' incentives to learn about AI, a subject likely to become increasingly important to board service.

Finally, individuals with deep AI expertise often have extensive experience in the technology industry and may have conflicts of interest, such as investments in AI companies or commercial relationships with vendors, which would require careful management.

These challenges, while real, should not dissuade companies from adding AI expertise to their boards if they identify candidates with relevant experience and other attributes that will make them effective directors.

Boards May Rely on Management and Outside Experts. Under the corporate law of most states, directors are permitted to rely on management and experts with respect to certain matters. For example, Section 141(e) of the Delaware General Corporation Law states in part that a director shall be “fully protected” in relying in good faith on information, opinions, or reports from corporate officers, employees, and experts who have been selected with reasonable care by or on behalf of the company as to matters that the director reasonably believes are within such persons’ professional or expert competence.

Directors of Delaware corporations therefore often rely on experts when exercising oversight in fulfillment of their fiduciary duties. This statutory framework provides directors with a defense to liability in duty of care cases if, in their capacity as directors, they rely in good faith on expert advice. Accordingly, directors can rely on members of management, such as a Chief Technology Officer or someone serving a similar function, to provide updates on the company's use of AI, as well as the AI-related risks and opportunities facing the company.

Management and Outside Advisors Can Supplement Board Expertise. Directors can also support their oversight of AI by receiving appropriate education and regular reporting on the company’s use of AI, associated risks, and governance practices. This may include periodic updates from management on AI use cases, risks, and governance frameworks; engagement with executives responsible for technology and risk management; and creation of management-level, cross-functional committees focused on AI governance.

Legal, consulting, and technical advisors can also provide board-level education and assist boards in understanding emerging legal and regulatory developments, evaluating governance frameworks and practices, assessing risks associated with specific AI use cases, and identifying opportunities to use AI in the business.

Final Thoughts. All directors should, at a minimum, possess baseline AI literacy. As AI becomes more integral to the operations of many companies, investors increasingly expect disclosure of relevant board skills and expect companies to establish appropriate oversight processes to manage evolving AI-related risks and opportunities.

Against this backdrop, companies may wish to use proxy disclosures to articulate how they are addressing AI and other technology-related risks. Companies may also consider whether responsibility for AI oversight should be formally allocated to an existing committee (such as audit or risk) or a dedicated technology committee. The appropriate AI-related oversight framework will differ for each company and should align with the company’s strategic needs and relationship to technology.

Clear disclosure of the board’s oversight framework, use of internal and external expertise, and approach to AI governance can help shareholders better understand the company’s risk management processes and strategic positioning.

* * *

Debevoise’s Chambers-ranked artificial intelligence practice regularly advises companies and boards on the adoption of AI governance policies and assessing AI risk. Please do not hesitate to contact us with any questions.


This publication is for general information purposes only. It is not intended to provide, nor is it to be used as, a substitute for legal advice. In some jurisdictions it may be considered attorney advertising.