Charities need to proceed with caution when using AI, the Charity Commission has warned in a newly published blog.
The sector's regulator says that "while there are opportunities, it is wise to proceed with caution as there are risks involved that need to be considered and managed".
Charities may also need to consider adopting an internal AI policy, the watchdog added.
In its latest blog, the watchdog references the 2023 Charity Digital Skills report, which suggested that 35% of charities were already using AI for certain tasks and that a further 26% planned to do so.
However, it reminded charities that AI must not be relied upon for decision-making processes undertaken by the trustee board.
"Trustees remain responsible for decision making, so given the consequences if incorrect advice is relied upon, it is vital this process is not delegated to AI or based on AI generated content alone," the regulator said.
"For example, trustees may not be complying with their duties if a charity relied solely on AI generated advice to make a critical decision about their charity without undertaking reasonable independent checks to confirm its accuracy."
Currently, the regulator does not anticipate producing specific new guidance on the use of AI, preferring "to encourage trustees to apply our existing guidance to new technologies as they emerge".
It will update guidance where appropriate to reference examples of new technology, it added.