What happens when an AI joins the board of directors?

Are such things a gimmick or an indispensable tool for company strategy and decision-making?

Although artificial intelligence (AI) is a handy workplace productivity tool, most people don’t consider it a trusted colleague or a strategic senior adviser. Nobody is inviting new hire GenAI out for work drinks or to sing at the staff Christmas party; at least not yet.

Ten years ago, though, one future-focused company added an AI to its board of directors. In 2014, Hong Kong-based venture capital firm Deep Knowledge Ventures formally appointed an algorithm to its board, even giving it voting rights on investment decisions.

Back then, it simply crunched quantitative data and made recommendations that the board members then debated. The AI’s contribution was more gimmicky tool than genuine board consultation.

Even so, last decade’s novelty trick often becomes the next era’s defining technology. Innovations that were once the stuff of science fiction – lasers, medical scanning devices, driverless cars – are now part of everyday life.

Generative AI (GenAI), which creates new, original content based on what it has been fed or trained on, is the latest entry on the innovation list. The most popular tools for business use include OpenAI’s ChatGPT, Microsoft 365 Copilot, Google Gemini, Meta AI, Perplexity AI and Claude.

GenAI is being rapidly, almost unquestioningly, adopted in everything from customer service and scientific discovery to elder care and education. So, should boards jump on the bandwagon and employ virtual AI board members?

Shiny new toys are alluring, but boards need to ask themselves why they want to use them. Is there a clear business case for using AI and a justifiable return on investment for the business and the board? Or are you motivated mainly by a fear of being left behind?

There’s certainly a need for board productivity tools. Human board members have massive legal, ethical, moral and financial responsibilities to their organisations, yet they work only a few hours per month and have to process vast amounts of information.

In addition, they are increasingly expected to be up to date on all the latest technologies, governance developments, and geopolitical and economic risks and opportunities.

Enter AI. Today’s AI systems can do more than just analyse vast quantities of data, recognise patterns and make predictions. When used properly, GenAI can help board members better prepare for and engage in substantive, well-informed and strategically focused discussions with management and their board colleagues, says Professor Stanislav Shekshnia of INSEAD, who has been studying its effectiveness for boards.

AIs can be advisory tools, helping founders and human boards gather information, perform market research and assist in strategic planning. Scenario planning is a particularly effective use of this technology when entering new markets or making complicated decisions.

AI tools can even anticipate questions from human members and investors, analyse board interactions, behaviours and performance as well as scan for blind spots.

But, do you really want it listening in and making strategic decisions? Does that align with your obligations as a non-executive director (NED)?

Risk and responsibilities

Board directors are responsible for every aspect of governance and compliance, including AI usage. Boards must ensure that they have an approved AI policy in place that addresses acceptable usage in the organisation and risk issues such as ethical AI, cyber security and GDPR.

Under the European Union AI Act, directors have specific obligations, including ensuring that staff operating AI systems have sufficient AI literacy and appropriate training. That includes board members too.

“Boards should not accept that people can’t understand the technology. Just because it’s new doesn’t mean that members shouldn’t be obliged to understand it. You have a responsibility to understand it. Chairs, working with the company secretary, have a leadership role in making sure board members fully understand the implications of the introduction of this kind of tech into the board or the organisation,” says Caroline Spillane, chief executive of the Institute of Directors Ireland.

As AI becomes more advanced, there is discussion about its potential role in governance and sitting in on meetings as an observer and analyst, but this is still in the early stages of development and many troubling legal questions remain. In most jurisdictions, AIs cannot be legally appointed as a formal director of the board.

However, Professor Shekshnia found that some of the 50 international board directors he interviewed have been using AI for certain tasks, including “finding additional information about the company, the competition, and items on the board agenda. Another frames board discussion items with the help of AI, and a third uses ChatGPT during board meetings to test assumptions and generate alternatives to proposals made by management.”

This raises so many questions. If an AI is listening to, recording and analysing board meetings are you breaking confidentiality? When preparing for a meeting, if you feed board packs or other information into a publicly available GenAI for analysis, is that information being used to train it? Can the GenAI company or your competitors gain access to this information?

Assurances from AI companies that data is secure are no guarantee, as cyber attacks are possible through things like hidden Easter eggs or injected prompts that instruct the AI to disclose confidential information or leak sensitive data.

Tool not taskmaster

If you use a GenAI for some tasks, are you abdicating your legal responsibilities as a board member? Some AIs are known to show bias against women and people of colour, so if you follow their advice, could you unwittingly be discriminating against those groups?

An overreliance on GenAI may lead to flawed decision-making, thanks to humans’ susceptibility to flattery. Some AI chatbots are notoriously sycophantic, trained to tell you what you want to hear. They may also make up information if they don’t know the answer. We call this BS when a human does it; with an AI, it’s called hallucination.

“It’s inevitable that AI will be brought into boardrooms, but boards need to decide where it’s most valuable and understand that technology does not replace human judgment,” says Spillane.

“Technology can’t see the human factors when it comes to critical decision-making. It can’t read the room – the mood, the culture – which are all part of collaboration.”

“NEDs play a different role to executive directors. They are supposed to bring independence, impartiality, wide experience, specialist knowledge and personal qualities. Directors cannot abdicate responsibility for reading and understanding the board pack to a technology.”

Paul Halpin, an Irish Chartered Accountant with two decades of international listed-company board experience, says: “An AI reasoning tool can make life easier for creative problem-solvers and decision-makers, but not as a decision-maker in its own right, unless it is following rules that humans have approved, and that humans agree are applicable to the decision that is being made.”

Although companies should take technological and legal precautions to mitigate the risks of using AI, managing risk at board level comes down to becoming a smarter, better-informed user of the tool, both individually and collectively.

Margaret E Ward is chief executive of Clear Eye, a leadership consultancy, and an independent non-executive director. margaret@cleareye.ie