Public sector embraces AI, but governance gaps remain
- Public sector employees using publicly available AI tools for work
- Widespread confusion and uncertainty about AI governance
Public sector organisations should implement fit-for-purpose governance frameworks for the use of artificial intelligence (AI), recommends Australia’s largest accounting body, CPA Australia.
A new report from CPA Australia on AI Governance in the Public Sector found that generative AI tools such as ChatGPT and Copilot are widely used in government departments and agencies. However, sound governance protocols are often lacking, or employees are unaware of them.
A webinar poll hosted by CPA Australia suggested that 40 per cent of public sector employees are accessing publicly available AI tools to do their work, while only 24 per cent are using AI tools provided by their employer.
Only 30 per cent of respondents indicated that their organisation has a policy or has provided guidance on the use of AI tools at work, while 23 per cent said their organisation is in the process of crafting one. Alarmingly, 28 per cent reported that no such policy exists, and 10 per cent did not know whether one exists.
Gavan Ord, CPA Australia’s Business Investment Lead, said that while AI has the potential to provide substantial benefits, its adoption in the public sector presents an array of risks, particularly where data sensitivity, transparency and accountability are paramount.
“There appears to be a widespread lack of clarity and guidance in the public sector about which AI tools employees can, can’t, should and shouldn’t be using, as well as how to use them meaningfully,” said Mr Ord.
“This confusion and uncertainty is problematic, especially in agencies where data security is critical, as open-source or external tools may result in unintentional data exposure. It only takes one person in one department making one mistake because of a lack of appropriate controls – or knowledge of them – to cause a major issue. Currently, it appears that existing controls are insufficient or not well understood.”
Mr Ord said the poll findings, albeit from a modest sample size, are concerning. The webinar had 976 live attendees who were invited to take part in the poll, with sample sizes varying between questions.
“The public sector urgently needs a fit-for-purpose AI governance framework that goes beyond compliance and provides clarity, confidence and ethical guidance to users. This should very clearly determine which tools are permitted and under what conditions, as well as how outputs should be reviewed and validated to ensure the appropriate risk management, privacy and accountability.”
The webinar hosted by CPA Australia was in response to a recent report from the Australian National Audit Office (ANAO) that highlighted potential flaws in public sector governance of AI tools and provided recommendations to address them.
Mr Ord said another concerning finding from the poll was the tendency of some users to treat AI-generated content as a validated source without critical analysis.
“AI tools can improve the quality of decision-making; however, an overreliance on AI tools could ultimately undermine those decisions,” he said. “AI tools are best used as a means to reduce the time spent on repetitive tasks, freeing up individuals to focus their efforts on the more complex and impactful duties of their role.”
“Educating public sector employees on how to use AI tools efficiently and effectively should be another vital aspect of its governance.”
The governance of AI in the public sector is the topic of one of several events at CPA Australia’s Government Finance Business Partners Forum to be held at the QT Hotel in Canberra this Thursday and Friday (July 24-25). The session will be presented by Tony Krizan FCPA, former CFO at the National Health and Medical Research Council. A full program of events and keynote speakers can be found here.
Media contact
Simon Downes, External Affairs Lead
[email protected]
+61 401 461 503