Looking back to plan forward: Patrick Devine-Wright introduces the 2025 Annual Assembly Evaluation Report

Published on 27 November 2025


This blogpost from ACCESS Director, Professor Patrick Devine-Wright, introduces the 2025 Annual Assembly Evaluation Report and explains why we spend so much time and effort collecting and analysing our event data.


And so a fourth ACCESS Assembly has come and gone, and a fourth Evaluation Report has been completed. Why go to all that effort to evaluate the event? To collect different types of data before and afterwards? To look carefully across that data with multiple eyes, to produce findings and come to final conclusions and recommendations? Wouldn’t it be easier just to get on with planning the next Assembly and not spend so much time and effort looking back?

A short answer to that question is that ACCESS is about championing social science, and the science part is crucial to how we work. Rigour is paramount. We go beyond gut feeling and ensure our assumptions and future plans are based on the careful analysis of evidence.

The second reason is that we are continuously refining the formula to produce ‘better’ outcomes. Each Assembly innovates on the previous blueprint. Year on year, we have increased the number of participants, while maintaining our focus on creating a welcoming and genuinely inclusive event that connects people and organisations across boundaries of discipline, sector and career stage. A key change in 2025, and one reason why evaluation is crucial, is that we put much more effort into creating an event that was more diverse in terms of ethnicity and race. I think we succeeded, but I wanted to know for sure, especially given the risk of tokenism in this space: inviting a couple of extra people, for example, without any real attempt to fundamentally change the atmosphere or nature of the event.

This all takes me back 25 years to my first postdoctoral research position, working on an EU-funded project that involved collaboration with the local council on a renewable energy community. I attended many meetings with council officials drawn from planning, energy and sustainable development departments. It was eye-opening: my first close-up glimpse of government bureaucracy and the challenge of working across boundaries and silos. The other thing that leapt out at me was the discourse in local government – everyone talked about ‘delivery’. Delivery was paramount. The focus was on action, not theory. Coming directly from my PhD studies, this took some getting used to. I came away with some respect for this delivery perspective, but two things nagged me.

First, even when people consider theory to be irrelevant or abstract, their plans for delivery, their expectations for outcomes, are nevertheless based on an implicit set of ideas about what will happen, what I would now call a lay theory, or theory of change. And I believe it is far better to have these ideas out in the open, to be explicit about them, to open them up to scrutiny and challenge, than it is to hide them away in an atheoretical discourse.

The second thing that nagged me about that way of working was the absence of evaluation. All the resources were put into action, into delivery. Diverting budgets to evaluation was considered a waste of resources, a luxury they couldn’t afford. Again, I found this troubling. If the action didn’t work out as planned, how would you know which element was responsible for things going wrong? And even if it did come out right, how could you be sure that it was your actions that produced the outcome? Could it have happened by chance? It seemed to me far better to evaluate carefully and produce learnings about the action. That would then provide a foundation for replication or generalisation, so that you could convincingly argue to others that here was a formula that worked, and you had the robust evidence to prove it.

I’m still amazed at how much effort is put into organising events, conferences or workshops involving academics or policy makers, yet how little evaluation takes place afterwards. Amazed because ineffective events generate costs of all kinds: not only wasting financial resources and people’s time, but having personal impacts that can lead to a sense of exclusion, precisely the opposite of what gatherings of people should be about. The reference to alienation in this feedback from one participant at the 2025 Assembly is telling:

‘Whether in the panels, the attendees, or the conversations, it was clear that EDI was not an afterthought but embedded in the fabric of the event. I’ve had some strange (and sometimes alienating) experiences at conferences – whether that’s because I’m a woman, a woman of colour, or a woman of colour who is an early-career researcher – but in these last two days I felt like doors were opening, connections were being made.’ (First-time AA attendee, direct feedback received by email)

So here is the report. A careful analysis of what went well, and what could be improved. A team effort across both academics and professional staff in ACCESS. And more recommendations to inform planning for the 2026 Annual Assembly. That will be our final gathering, a celebration of social science and all that we have achieved to date. I can’t wait!