
    How can organisations develop ethical frameworks to govern the complexities of artificial intelligence (AI)? At a recent AICD event, a panel discussion explored the emergence and popularity of AI and the risks, opportunities and questions that the new technology raises.


    As the use of AI becomes more prolific in all areas of life, including in the boardroom, questions of ethical use are being raised: how to best optimise its use; in sectors such as education, how to draw a clear line between what is and what is not original work; and how all of this will interplay with legal duties and obligations.

    “Ethical decision making is a complex area, because not many of us are actually trained in AI ethics,” Dr Richard Dammery FAICD, Non-Executive Director, WiseTech Global Ltd, told the recent AICD event.

    “Regrettably, it often becomes quite subjective quite quickly. So as the world becomes more complex, AI is certainly going to make organisations and the world more complex.”

    Dammery told the event that adjusting to the technology will take time, commitment and familiarity, and that directors should not be put off by the jargon or complexity, as in the case of AI mathematics. “Reading into this and continuing to learn will be crucial for all directors.”

    Deena Shiff FAICD, Non-Executive Director, ProMedicus, said, “It is impressive how quickly some universities have moved from treating the issue as one of plagiarism to one of training students to explain the use of ChatGPT in their research and to use it benignly for literature reviews.

    “There are a number of ethical frameworks now published both in Australia and elsewhere that can be adopted as policy guides. However, the critical issue is how to operationalise these principles.”

    What AI means for directors

    The 2023 Victorian Annual Dinner in November provided AICD members with a practical discussion by a panel of experts who shared their views on the complex world of AI. The panel unpacked what AI really means for Australian directors and how to embrace the significant opportunity it presents, while responsibly managing risk and maintaining Australia’s competitive advantage in this trillion-dollar industry.

    Dozens of important questions relating to AI and its effects on Australian business were discussed. These included:

    • What ethical considerations will ensure AI’s integration aligns with our organisational values and societal responsibilities?
    • How do you envision AI shaping governance and decision making?
    • What kinds of variation in approach are you seeing when it comes to AI across different industries and/or use cases?

    Panellists also discussed the transformative effects, opportunities and challenges of AI and what’s ahead for them and their respective boards.

    Panel moderator Stela Solar, Director, National AI Centre, CSIRO, said, “AI is not a new thing. It’s been around since the 1950s. In fact, in Sydney the transport and infrastructure teams have used machine learning, which is a subset of AI, for 40 years. But generative AI has changed the game. I think it’s going to be celebrated because it’s so easy to use.

    “From the board discussions that we’ve had, AI quite often seems too risky,” she added. “But what we’ve found is that when you focus on risk, you quite often can miss the transformational opportunity, and it really affects the kind of discussions that the board can have in organisations.

    “So, there’s an opportunity to position AI in an opportunity-centric way while recognising that, yes, there is risk. That’s the reality that leaders of organisations around the world are facing. It’s always been about risk and opportunity in business, and AI is not dissimilar to that.”

    Interestingly, Solar cited a little-known statistic: 30 to 40 per cent of employees are now using generative AI in the workplace, and 68 per cent of them are not telling anyone about it. This is colloquially referred to as ‘shadow AI’. So, the task of risk management may be more difficult than it seems.

    Machine learning has existed in a number of different contexts for some time, so for organisations it’s not new. But now we are at another stage of evolution, and for those who haven’t been using it, it’s been a real wake-up call, Dammery added.

    “I think it’s a terrible idea to approach any discussion about technology and innovation through your risk leaders, because it’s going to kill the conversation before it's even started,” he added.

    Scarlett McDermott, Head of Ecosystem Capability, Technology Council of Australia, questioned what the potential risks are and their impacts on society.

    To connect with leading minds in governance, visit our website to register for future events.
