What should company directors be doing about Artificial Intelligence? A directors' guide to AI governance
11 October 2024
Artificial Intelligence (AI) is everywhere. From workplace desktops and email filters to search engines, drafting tools and social media at home, AI has become integral to our daily lives. It powers facial recognition on smartphones, facilitates access to bank accounts, and enables banks to analyse spending patterns to detect fraud.
With AI becoming increasingly pervasive, boards must understand not only how to leverage AI for their business, but also how to implement appropriate safeguards to ensure its safe and responsible use.
It is well understood that directors have common law and statutory duties to their company. These include:
- Exercising their powers and carrying out their functions as a director with the degree of care and diligence that a reasonable person would exercise.
- Exercising their powers and discharging their duties in good faith, in the best interests of the company, and for a proper purpose.
Fulfilling these duties is an evolving challenge, as is the lens through which they are interpreted. The rapid uptake of AI means it must now be on the agenda for every board.
Helpfully, the Australian Institute of Company Directors (AICD), in partnership with the Human Technology Institute at the University of Technology Sydney, has published a suite of resources to help boards navigate the ethical and informed use of AI.
The suite comprises three key parts.
- A Director’s Introduction to AI: This guide helps directors understand key AI concepts, risks and obligations. The introduction contains three chapters:
- An introduction to what AI is, how it's used and its relevance for directors.
- Opportunities and risks of using AI.
- An examination of the regulatory obligations in Australia and overseas relating to AI systems.
- A Director’s Guide to AI Governance: This practical guide helps directors, particularly those in ASX300 entities, navigate the integration and deployment of AI within their organisation. Recognising that AI is fast moving, the guide offers a framework for board oversight of AI use. The guide contains two sections:
- Insights and implications related to AI governance for directors.
- Elements of effective, safe and responsible AI governance – offering questions and tools to drill deeper, including case studies.
- A governance checklist for SME and NFP directors: This checklist outlines recommended steps for AI governance, tailored to smaller businesses and not-for-profit entities.
Summary of recommendations
The AICD guide provides a valuable starting point for boards uncertain about how to approach AI governance. No tool can serve as a one-size-fits-all solution, especially in a landscape evolving as rapidly as AI: consider that ChatGPT, the generative AI tool that changed the game, was only released by OpenAI in November 2022.
As mentioned, AI should be on the agenda of every board in Australia. The AICD suite is a helpful resource and gives boards a reference point from which to begin their journey into using and managing AI.
Perhaps the best way to digest the resource is to briefly examine its eight elements of safe and responsible AI governance.
Key takeaway
The AI Governance Guidance is not meant to be comprehensive, but rather aims to provide boards with foundational knowledge of AI and a suggested framework for oversight of its use.
In our next article in this series, we will explore how boards should address upcoming changes in privacy law, some of which are imminent and others further off. We will also examine the intersection of AI and privacy, in particular automated decision making (ADM). You can also refer to our article on the recently proposed privacy law changes.