Gemma Rocyn Jones is an experienced grant maker and social investor who is now part of London Funders' Talent Bank. Gemma shares her insights about what we should be thinking about in the context of where the government's tech policy is heading. She looks at what this all means for the digital capacity and capability of civil society if it is to benefit from developments that are happening at pace, and poses some questions for funders about their role. She argues we need to see ourselves as makers and shapers of AI, not merely 'users and responders'.
The pace of change when it comes to AI is astounding. It is less than two months since the Prime Minister launched the UK’s AI Opportunities Action Plan in January yet over that time we’ve seen the field of AI morph and the debate hit new highs of hyperbole. It would be nice to filter this out as white noise whilst we focus our energy on more immediate and pressing issues.
The thing is, how AI is developing and the tech policy being adopted in the UK mean that generative AI is an immediate and pressing issue that needs our attention today. AI is already impacting our daily lives whether we opt in or not. Young people are using it continuously, and research by the Joseph Rowntree Foundation and We and AI suggests that over three quarters of people in non-profit organisations are using AI in some capacity.
While FOMO is not a strategy, as Rachel Coldicutt, one of the strongest and most respected voices in this space, has rightly pointed out, we are way beyond the stage of waiting to see how things pan out before acting.
If we are to shape the future in a way that serves communities and civil society, then funders need to be acting now.
It is this sense of shared responsibility that brought 84 funders together, hosted by the National Lottery Community Fund innovation team in partnership with CAST, at the Bishopsgate Institute in February to start a conversation about building a collaborative approach and vision.
First, though, we need to unpick exactly what we are responding to: what is the government planning, and what might that mean for our communities?
Amongst the many key announcements the government has made over the last few months, its plans to drive AI adoption across the public and private sectors are perhaps its boldest.
Governments at the 2025 AI Action Summit in Paris made a collective commitment to ensuring AI is open, inclusive, transparent, ethical, safe, secure and trustworthy. While the UK government was not a signatory to this commitment, it has a stated ambition to be the best place in the world to start and scale an AI business. In the Prime Minister's words, they are putting "the full weight of government" behind this ambition.
Civil society is notably absent from the current plans, but when you drill down into the details, there are very real implications for funders to consider:
1. Opening up our data for AI innovators and researchers
The creation of a National Data Library will open up key public datasets to AI researchers and innovators. In the first instance, they will focus on five high-impact datasets, but the scope and governance arrangements that are put in place will influence what future data is collected and trust in how it is being used, particularly where private interests are involved.
From an impact perspective, this linking up of datasets such as household, education, and employment data, for example, could mark a shift in our ability to support individuals and households facing difficulty. Government data is difficult to access and hard to join up, so this would mark a huge leap forward.
As ever, the devil will be in the detail. Could the requirements on data collection and sharing be extended to all charities in receipt of public funding through grants or contracts, and if so, what support would they be given? Could the obligation to share data extend to the data held by independent funders to give a more complete picture of the activities taking place? What could this bring to a national data library?
In the absence of more detail from the government, independent organisations like the Wellcome Trust and Tony Blair Institute for Global Change have been quick to open up discussions on the vision for the library and have developed detailed proposals for how it could be designed, governed and used.
What about the views of the organisations who are working every day with young people facing homelessness and families in crisis? What would be their priority areas for research?
2. Investing in digital capacity and capability
Making use of this newly available data and technology requires skills and time, so it makes sense that the government is prioritising investment in a new generation of tech innovators and researchers. But we can't take for granted that this newly trained generation of AI scientists and founders will have an accurate understanding of the communities that civil society serves, the role that civil society organisations play, or the most pressing problems that AI could help us address. Without this understanding, it is less likely that their efforts will benefit civil society directly and more likely that unintended harms creep in.
Funders and charity partners are already collaborating across a wide number of social issues. A strategic response would encourage collaboration with or investment in AI researchers working at a sector or subsector level on the priority problems and opportunities identified by these groups. The findings and results could then be shared more widely for the benefit of the wider sector.
Thinking about the digital capacity and capability of civil society at large, it is unlikely that many of the cash-constrained organisations we fund will be able to prioritise the resources needed to train their staff and attract the top digital talent needed to harness these datasets. Digital literacy across a charity workforce of 1 million is not a new problem, and the Catalyst initiative, which grew out of a funder collaboration on digital literacy, provides a good starting point for how funders could respond.
3. Expanding the remit of every government regulator to drive adoption
Finally, there is an explicit expectation from the government that regulators will be responsible for driving the adoption of AI in their sector. For civil society, this means the Charity Commission.
There is a conversation to be had with the Charity Commission about how it interprets this role; its public comments on AI to date have focused on how charities are using AI in their operations and funding applications.
AI adoption has the potential to transform our understanding of charity business models, financial health and which communities are or are not receiving funding, enabling us to target our support better and earlier.
Thinking back to the pandemic, capabilities like these would have made us far more effective at working out the level of emergency support charities and social enterprises needed.
Could the Charity Commission also play a role in establishing a sandbox for safe AI experimentation in a similar way to the FCA sandbox for innovation in financial services? It can’t be expected to do this alone, and funders have an important role in ensuring experimentation is appropriately resourced and done for change and benefit, not just efficiency.
How charities and funders are already responding
The AI and Funder Conference made clear just how much experimentation is already taking place within grant making organisations and charities. Thanks to the great mapping and convening work done by CAST, we can see that funders are testing AI across the whole of the grant life cycle – from programme design to application, from decision making to back office, to impact and learning. It is happening both through structured prototyping – such as the National Lottery Community Fund's development of an eligibility checker, or Scope's partnership with Deloitte to use generative AI to create a more inclusive workplace – and informally through individual experimentation.
What isn’t clear, though, is how these efficiencies are being used. There was a strong feeling in the room that this time should be given back to do what only humans can do, namely, build relationships and connections with the communities we are serving.
This underlines the point that this issue is about more than how to harness a new technology to improve efficiency; it is a social and ethical question.
The backdrop to the conference was whether the concept of Worthwhile AI might be possible, namely an AI-enabled society where technology is used ethically and testing is done safely and responsibly. Not easy when, as one of the panellists highlighted, there isn’t a definition for ethical or responsible AI, and we don’t and can’t know what it might be capable of or how these changes will play out.
So how do we ensure that our use of AI is meaningful? This is where the social question comes in.
As leaders, we need to get comfortable having conversations about something we don't have answers for, to apply critical thinking and, importantly, to see ourselves as makers and shapers of AI, not merely users and responders.
Leadership
This brings us back to leadership. Funders have choices about how they ensure communities are at the heart of these changes and that civil society is not left behind. They also have the collective power to make this happen.
This is about shaping the future rather than watching it be designed by others.
The newly launched Charity AI Task Force, of which London Funders is a member, is a significant milestone in this journey.
The new task force is currently aligning around four key missions, although these are expected to evolve over time.
One of the first tasks will be to respond to the government’s AI Opportunities Action Plan.
Back in January, the Prime Minister put down a clear marker about his ambition for the UK to be a leading centre for AI. The question now is, what marker do funders want to put down?
To learn more, check out the Charity AI Task Force and the AI in Grantmaking Peer Group. We'd also love our members to join our March Insight meeting, which will explore AI in more detail – you can find out more and sign up here.
Gemma Rocyn Jones is an experienced grant maker and social investor focused on using finance as a lever for powerful change. After nearly seven years at the National Lottery Community Fund, where latterly she was involved in the development of its 2030 strategy and the creation of its innovation team, she now works independently with foundations, investors and changemakers. Gemma is part of London Funders' Talent Bank.