20 Jan 2026 | Blogs
AI is becoming embedded across the charity sector far sooner than any of us expected.
Traditionally, the non-profit world is slow to adopt new technologies. This time, with AI, we are keeping pace. In fact, we may need to pump the brakes and take a moment to ask some important questions.
The internet is edging closer to unusability: large language models are learning from AI-generated content instead of natural language, and every digital space is being flooded with repetitive, sanitised, AI-produced slop. We have reached a point where being provably human is a genuine competitive advantage, especially in fundraising.
For donors, giving is no longer about guilt, traditional notions of altruism or fulfilling a moral obligation. It’s about belonging. It’s about joining something meaningful.
But creating something that donors and funders want to belong to only works when the story is authentically real and rooted in lived experience, not in AI-generated sentiment.
Fundraising and Communications teams need to think hard about the ethical and moral impact of using AI-generated imagery in fundraising and campaigns because when we use AI to create images of the people we support, we are unintentionally removing their voice and their lived experience from the narrative. Even when the intention is good, or when an image is presented as a generic “archetype”, the result is often further marginalisation. The person becomes an idea rather than an individual.
In a sector built on trust, representation matters. Lived experience cannot be synthesised without loss. When we generate faces, bodies, or scenarios instead of working with real people, we risk erasing the very perspectives we’re trying to centre.
Yes, there is a cost saving in generating images and stories this way. But there is also a moral cost. Who gets represented? Who gets excluded? Who gets to decide what disability, poverty, or hardship looks like?
These are not abstract questions. They go to the heart of power, agency, and dignity. If AI is used without care, it can quietly replicate the same hierarchies and biases our sector exists to dismantle.
This is why intention matters more than efficiency. AI can support our work, but it should never replace the people at its centre.
In 2026, first-hand, authentic storytelling will be one of your organisation’s most valuable assets.
We should also expect a shift in what donors and funders demand of partnership and collaboration. “Was this image AI-generated?” “Were these quotes written by a real person?” “Can I see who this actually impacted?” These are the kinds of questions we should expect to be able to answer moving forward.
If we want to build campaigns that resonate, we need to prioritise human connection over artificial perfection.
Because the more artificial the world becomes, the more people will pay attention to what’s still real.
Written by John Harvey, Communications Manager, Vision Ireland
Find John on LinkedIn: https://www.linkedin.com/in/johnharv/