Emma Bracegirdle, founder of The Saltways, explores how charities are quietly adopting AI-generated imagery, the legal and ethical risks they are walking into, and why trust and transparency will determine who thrives.
________________________________________________________________
There is something happening inside charity communications teams. Staff are reaching for AI-generated imagery to solve very real problems.
We all know what that feels like - tight deadlines, stretched budgets, the impossibility of filming in certain communities. So you reach for a solution. But teams are doing this without frameworks, without disclosure policies, and in many cases without fully understanding what they are risking.
Over the past few months, we surveyed 116 charity communications professionals and spoke in depth with eleven of them to find out what was really happening with AI-generated imagery in the third sector. The results were unexpected. More than half of those using AI imagery were not labelling it consistently, and 94% were unclear about their legal disclosure obligations. These were not bad people making bad decisions. The sector is simply operating in the dark on this, and assuming a great deal it has not yet stopped to question.
What struck me most was not the scale of use, but the absence of any conversation around it. These decisions were being made quietly, at an individual level, inside tools that staff already use every day.
Then there are the legal risks, and the legal landscape is shifting faster than most teams realise. Under the EU AI Act, charities become deployers when they use AI tools in their communications, and deployer obligations already exist, with a final Code of Practice expected in June 2026 and obligations applying from August. Where imagery could reasonably be considered designed to appear real, disclosure requirements apply. The sector is largely unprepared for what is coming, and the window to get ahead of it is narrowing.
Here is where it gets interesting, though. The same research that surfaces all of these risks also reveals something that should encourage the sector: 76% of respondents agreed that authentic content helps charities stand out and build donor trust.
Supporters are already becoming more sceptical about what they see online, and the organisations that choose transparency, that lead with real stories told with real care, and that are honest when they do use AI and clear about why, are the ones that people will increasingly choose to trust with their money and their loyalty.
The standards the sector developed around beneficiary consent, dignity, and ethical representation were hard-won over many years. AI imagery used without thought risks undermining that progress far more quickly than it was built. The charities that will be trusted five years from now are the ones making those values visible in their content decisions today, including the small ones, including the ones nobody is watching.
Trust, authenticity and transparency are the sector's superpower - let's embrace them!
Download the Report and your free guidelines here: https://mailchi.mp/thesaltways/aiguidelines