Leadership

Board AI Literacy: The New Fiduciary Duty

Published: April 7, 2026
Reading time: 5 minutes
By Tommy Kenny


Here's a pair of numbers that should concern every director: 62% of boards now hold regular AI discussions, yet only 27% have formally added AI governance to their committee charters.

That gap — between talking about AI and actually governing it — is where liability lives.

As WilmerHale noted in their January 2026 governance alert: "While many boards have begun to address AI risk in some fashion, surveys indicate that only a minority have adopted formal governance frameworks or established clear metrics for oversight."

If you're a board member in 2026, AI literacy isn't optional. It's fiduciary.


The Numbers Don't Lie

McKinsey's recent research is stark: only 15% of boards currently receive AI-related metrics. That means 85% of boards are overseeing AI investments, deployments, and risks without any formal way to measure what's actually happening.

Meanwhile, PwC's survey of directors found that only 35% of boards have integrated AI into their oversight activities. The other 65% are doing what, exactly? Hoping for the best?

And here's the kicker from The Conference Board: 38% of US CEOs identify AI as the leading factor that could negatively affect their business in 2026 — ranking it above political polarization and consumer shifts.

Your CEO thinks AI is the biggest threat. Your board has no formal governance. See the problem?


The D&O Liability Question

As Techné AI's recent analysis made clear: "Boards that act now to establish documented AI governance protect their directors from personal liability, position their companies for favorable D&O terms, and create a defensible record of fiduciary diligence."

The inverse is also true. Boards that don't govern AI are creating personal liability for their directors. When (not if) something goes wrong — an AI system causes harm, violates regulations, or destroys value — the first question will be: "What did the board know, and what did they do about it?"

"We didn't really understand it" is not a defense. It's an admission.


What Board AI Literacy Actually Means

Let's be practical. Nobody expects every director to become a machine learning engineer. But in 2026, every director needs to understand:

1. Where AI Is Deployed and Why

Which business functions use AI? What decisions does AI influence or make? What's the materiality of those decisions?

Most boards can't answer these questions. That's the first gap to close.

2. The Risk Categories

AI creates novel risk categories that traditional governance frameworks don't cover:

  • Model risk: Is the AI doing what we think it's doing?
  • Data risk: Where does training data come from? Is it biased? Is it legal?
  • Vendor risk: Who built this AI, and what happens when they update it?
  • Regulatory risk: Which jurisdictions consider this AI "high-risk"?
  • Reputational risk: What happens when this AI makes a mistake publicly?

3. The Metrics That Matter

EY's Lee Henderson wrote in CIO Dive: "Forward-looking directors recognize that their role extends beyond defensive postures, and strive to offer strategic guidance to help management keep pace with a rapidly shifting landscape."

That strategic guidance requires metrics:

  • AI investment vs. realized value
  • Error rates and incident reports
  • Regulatory compliance status by jurisdiction
  • Employee AI adoption and capability levels
  • Competitive positioning vs. industry benchmarks

4. The Questions to Ask

A literate board knows what to ask even when they don't know the technical answers:

  • "Walk me through a decision this AI made and show me the reasoning."
  • "If this system fails, who gets hurt and how badly?"
  • "What's our exposure under the EU AI Act / state regulations?"
  • "How do we know this AI isn't making biased decisions?"
  • "What happens to this system if the vendor goes under?"

The INSEAD Model

INSEAD recently launched an "AI for Boards" executive education program specifically addressing this gap. Their framework emphasizes that board AI literacy isn't about technical depth — it's about:

  1. Strategic understanding: How AI changes competitive dynamics
  2. Risk awareness: What can go wrong and how to detect it
  3. Governance capability: How to oversee what you can't personally build
  4. Talent judgment: How to evaluate whether management is AI-capable

You don't need to code. You need to govern.


The Three-Step Fix

Step 1: Audit Where You Are (This Month)

Have your general counsel or outside advisor survey:

  • Every AI system currently deployed
  • The decision-making authority of each system
  • The governance documentation that exists (or doesn't)
  • The metrics currently reported to the board

Most boards discover they know far less than they assumed.
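For teams that want to operationalize that survey, the inventory can be as simple as one structured record per AI system, with a check that flags exactly the gaps Step 1 is designed to surface. A minimal sketch — every field name here is illustrative, not a standard; adapt it to your own governance taxonomy:

```python
from dataclasses import dataclass, field

@dataclass
class AISystemRecord:
    """One row in the AI inventory a general counsel or outside advisor compiles.

    Field names are illustrative assumptions, not an industry schema.
    """
    name: str                 # e.g. "resume screening model"
    business_function: str    # which function deploys it
    decision_authority: str   # "advisory", "human-in-the-loop", or "autonomous"
    governance_docs: list = field(default_factory=list)  # policies, vendor contracts on file
    board_metrics: list = field(default_factory=list)    # metrics currently reported upward

    def gaps(self) -> list:
        """Flag the two gaps the Step 1 audit is meant to expose."""
        issues = []
        if not self.governance_docs:
            issues.append("no governance documentation")
        if not self.board_metrics:
            issues.append("no metrics reported to the board")
        return issues

# Example: an autonomous system with no documentation and no board reporting
record = AISystemRecord(
    name="credit pre-approval model",
    business_function="lending",
    decision_authority="autonomous",
)
print(record.gaps())
```

Even a spreadsheet with these five columns gives the board something it can actually review; the point is the structure, not the tooling.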

Step 2: Formalize the Charter (This Quarter)

Add explicit AI governance to your audit committee or risk committee charter. Define:

  • What requires board approval
  • What requires committee review
  • What gets reported and at what frequency
  • Who owns AI governance at the management level

Documentation creates defensibility.

Step 3: Build the Muscle (Ongoing)

Schedule board education sessions — not vendor demos, actual education. Use frameworks like INSEAD's or bring in advisors who can translate technical concepts into governance implications.

Most importantly: ask questions. Every meeting. Relentlessly.


The Cost of Inaction

The next few years will see AI incidents — algorithmic bias lawsuits, regulatory enforcement actions, catastrophic model failures. When regulators and plaintiffs look for accountability, they'll examine board minutes, governance structures, and oversight practices.

Boards that can demonstrate literacy, oversight, and diligence will be defensible.

Boards that can't will become cautionary tales.

Which one do you want to be?


Your Monday Action

Before your next board meeting, ask your CEO three questions:

  1. What AI systems are currently deployed in the company?
  2. What decisions do those systems influence?
  3. What metrics do we have on their performance and risk?

The answers — or lack of answers — will tell you everything about your current governance gap.


Tommy Kenny is a business attorney, fractional executive, and executive coach who advises boards on governance and digital transformation. Subscribe to Digital Executive Insight for weekly frameworks that actually work.

Sources

  1. WilmerHale. (2026). "Board Oversight and Artificial Intelligence: Key Governance Priorities for 2026." January 22.
  2. Knostic. (2025). "The 20 Biggest AI Governance Statistics and Trends of 2025."
  3. PwC. (2026). "2026 Corporate Governance Trends: Five Priorities for Directors."
  4. McKinsey. (2025). "Elevating Board Governance Through AI Posture and Archetypes." December 4.
  5. The Conference Board. (2026). "AI and the C-Suite: Implications for CEO Strategy in 2026."
  6. EY Americas Center for Board Matters. (2026). "How AI Governance Can Help Boards Boost Value Creation." CIO Dive, February 19.
  7. INSEAD Knowledge. (2026). "A Revolution in Governance: How AI Will Make Boards More Effective."
  8. Techné AI. (2026). "AI Governance and D&O Liability: What Every Board Needs to Know."
