The impacts of artificial intelligence (AI) are already far-reaching, and they continue to grow.
At a macro level, AI is already affecting many aspects of the world we live in – from the economy to healthcare, science and planning, to name just a few.
But at a micro level, AI is affecting our own personal wellbeing – for example, the ways we work, learn and communicate.
AI has brought about both opportunities and risks across many parts of society, including the not-for-profit and charity sectors.
Whether your charity is an early adopter and already using AI, planning to do so, or just considering how AI might help it in the future, any decision to harness AI is a strategic one that should not be taken lightly.
Charities need to ensure any risks associated with AI use that are specific to their work are properly managed.
What is AI?
AI is a multidisciplinary field that aims to simulate human intelligence in machines.
AI systems use large volumes of data to learn from experience, make decisions, and perform human-like tasks. AI can enhance the speed, precision, and effectiveness of human efforts.
Examples of AI include virtual assistants and chatbots such as Siri and ChatGPT, recommendation systems, language translation tools, spam filters, image and speech recognition software, and self-driving cars.
How charities are using AI
Charities' use of AI covers a range of areas aimed at making their operations more efficient. Some examples include:
- boosting learning outcomes
- improving fundraising capability
- fast-tracking medical breakthroughs
- addressing the impacts of climate change
- improving aged care delivery
- increasing agricultural production.
Examples of charities using AI
- A legal services charity strengthened the help it delivers by developing a natural language AI tool that identifies people’s legal issues and connects them to the most appropriate support.
- A charity with a high reliance on donor funding used AI to segment and more accurately target donors, improving the efficiency of its fundraising appeals (a minimal sketch of this kind of donor segmentation follows this list). The initiative significantly reduced campaign costs while increasing the revenue raised.
- A charity created an education dashboard that uses AI and data to address educational inequality by pushing real-time information to front-line staff.
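To give a sense of what the fundraising example above can look like in practice, here is a minimal sketch of donor segmentation using clustering. It assumes a charity holds basic giving history for each donor; the column names, figures and number of segments are illustrative assumptions only, not taken from any real charity.

```python
# A minimal sketch of AI-assisted donor segmentation, assuming a charity holds
# basic giving history for each donor. All column names, figures and the number
# of segments are illustrative assumptions.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical donor records: days since last gift, gifts per year, average gift size.
donors = pd.DataFrame({
    "days_since_last_gift": [30, 400, 15, 90, 720, 45, 200, 10],
    "gifts_per_year":       [4, 1, 6, 2, 1, 3, 2, 8],
    "avg_gift_aud":         [50, 200, 25, 100, 500, 75, 150, 20],
})

# Scale the features so no single measure dominates, then group donors into segments.
features = StandardScaler().fit_transform(donors)
donors["segment"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Each segment can then receive a differently worded and differently timed appeal.
print(donors.groupby("segment").mean().round(1))
```

Segments produced this way still need human judgement – for example, to decide how each group is approached and to check that the groupings are fair and appropriate.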
Opportunities and risks for charities using AI
AI presents many opportunities for charities to use technology to further their charitable purposes, including:
- using insights to make more informed decisions
- personalising communication with supporters and donors
- being more efficient by automating tasks, freeing up resources for higher-level activities
- managing volunteers more effectively through matching skills to opportunities
- helping more people through AI-powered assistance
- enabling a faster and more targeted response to disasters
- measuring success through a more informed understanding of program impact.
However, AI’s effectiveness depends on the quality of the data it draws on, and on its uptake and acceptance by users. Charities must ensure they have enough usable data to gain the benefits, and to avoid the unintended consequences of using AI with poor-quality data.
Before embracing AI, charities also need to consider the potential impact on public trust, and how stakeholders may perceive any adverse effects of AI on vulnerable groups. AI risks include:
- privacy issues where data security protections are inadequate
- harm caused by AI done badly – poor, biased or discriminatory decisions resulting from designer biases built into the AI system, or from the system being trained on poor-quality data and information. Time needs to be set aside to check AI outputs for accuracy and unintended bias (a simple sketch of such a check follows this list)
- becoming disconnected from the charity’s purpose by losing the personal touch through reduced human connection and empathy. AI should enhance human decision-making, not replace it
- making unethical decisions when there is a poor understanding of the values built into the systems
- reducing accessibility to those with limited or no access to suitable technology, or where users have poor technical literacy
- concerns – and the possible loss of support – from donors worried about any AI use.
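As a concrete illustration of checking outputs for bias, the sketch below compares outcome rates across the groups an automated decision tool affects. The group labels, outcomes and threshold are illustrative assumptions; a genuine review would be broader and involve people with the right expertise.

```python
# A very simple sketch of checking automated decisions for unintended bias:
# compare outcome rates across the groups affected. The group labels, outcomes
# and the 'large gap' threshold are illustrative assumptions only.
import pandas as pd

decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1, 1, 0, 1, 0, 0, 1, 0],
})

# Approval rate for each group the tool's decisions affect.
rates = decisions.groupby("group")["approved"].mean()
print(rates)

# Flag a large gap for human review; the threshold here is arbitrary.
gap = rates.max() - rates.min()
if gap > 0.2:
    print(f"Approval rates differ by {gap:.0%} between groups - review the tool's outputs.")
```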
Charity responsibilities and AI
AI needs to be used mindfully, with consideration given to the legal, ethical and compliance challenges its use might present.
Before embracing AI, your charity should consider the opportunities holistically.
This includes ensuring appropriate governance frameworks are in place, and that your charity’s Responsible People exercise due diligence so that innovations using AI are adequately overseen by someone with the necessary skills and knowledge.
Public trust is one of your charity’s most valuable assets, and maintaining it is vital – especially when considering a course of action that could damage it.
Your charity must consider how it can adopt AI safely and responsibly, and in a way that doesn’t cause harm.
Some things to consider include:
- whether AI is, strategically, the best solution to the problem under consideration
- how your charity will use the technology to further its charitable purposes while retaining what makes it unique – for example, its human connection, generosity, and compassion
- whether there are any moral or ethical issues you need to address in your intended use of AI
- how your charity will manage the risks associated with AI while leveraging the potential benefits
- whether your charity’s governance is sufficiently robust to manage the entire lifecycle of AI solutions, and to ensure systems are fit for purpose, safe, fair, reliable, timely, trustworthy, and supportable
- whether your charity’s data governance and AI governance arrangements will work together to protect important information assets
- whether your charity’s cybersecurity stance and policy framework need to be updated to manage any additional cyber threats that using AI may expose it to
- how your charity will inform supporters, donors and beneficiaries about its decision to use AI
- how your charity can bring its use of AI to life for its audience – for some people, AI may be difficult to grasp without real examples – while emphasising that using AI will not see the charity lose sight of its core purposes and values.
Australia does not currently have a formal regulatory framework for AI.
Charities should, however, consider how their use of AI might affect their governance or registration obligations. This includes compliance with the ACNC Governance Standards – particularly Governance Standard 5, which includes the duties for Responsible People to:
- act with reasonable care and diligence, including the need to ensure they meet the legal requirements of managing people’s information and data, and ensure their charity’s cyber security settings are appropriate
- ensure the financial affairs of their charity are managed responsibly
- act honestly and fairly in the best interests of their charity and for its charitable purposes.
More information
- Engaging with AI | Australian Signals Directorate
- Voluntary safety standards when using AI | Federal Department of Industry, Science and Resources
- Australia's AI Ethics Principles | Federal Department of Industry, Science and Resources
- Artificial Intelligence factsheet | Not-for-Profit Law
- Governance of Artificial Intelligence | Australian Institute of Company Directors
- Artificial Intelligence and board minutes | Governance Institute of Australia
- Artificial Intelligence Hub for the charity sector | Charity Digital (UK)
- Ethics of Artificial Intelligence | UNESCO