Fishing for impact 

A large school of sardines swimming in unison against a blue background.
Lucy Rodwell, our Impact Facilitator, reflects on explaining the Sprint model and working at Agile.

When I was offered the job as Agile Initiative Impact Facilitator, I tried to explain the Sprint model to academic and professional services colleagues. “But it’s not possible to deliver primary research within a year!” they replied. Why? Because the ‘traditional’ research model is entrenched within our thinking and systems, even when exponential changes to our society and environment are demanding greater innovation and versatility. 

My go-to analogy for Agile’s mission is that it’s trying to do what the Covid vaccine did: create an expedited process to deliver what’s needed to those who need it when it’s needed. Agile has not just shown that you can deliver primary research within a year, but also that you can deliver outcomes and impacts in a very short timeframe, too. 

I should know, because each year I have collated everything attributable to Agile and reported on it to UKRI through the Researchfish platform. For now at least, this is still a requirement of all research funded by the UK government via UKRI. 

Big shoal, small net 

Agile’s flexible funding has enabled us to issue several calls for Sprint projects, based on iterative learning from each round. In this way, we have operated as a mini–Oxford Martin School, generating pioneering interdisciplinary research projects to meet urgent policy-relevant environmental needs. Our research portfolio now consists of 18 Sprint projects, five rapid science-to-policy projects, and an environment and national security research project, convening researchers from many disciplines to work together on a forward-facing research agenda. On top of these we have programme-level outputs and activities. So, it’s a big shoal. 

Now, imagine a small, rigid net trying to capture the KwaZulu-Natal sardine run. The Researchfish ‘net’ reports on publications, collaborations and partnerships, further funding, researchers’ next destinations, engagement activities, datasets, databases and models, software and technical products, awards and recognition, and influence on policy, practice, patients and the public (more on this one later). While this list covers many types of outputs and activities, the tool has its limitations. Most of these have quite prescriptive sub-categories, not all of which allow you to write a description via a free text field. 

My process has been to convert the categories and sub-categories into a spreadsheet for each Sprint team, outlining the data requirements for each. I pre-populate these with the Sprint data amassed by the programme administrative team, and provide what was submitted in the previous year for reference. These spreadsheets are then sent to the Sprint teams to correct and update before the submission window. Why pre-populate the spreadsheets? Because the Sprint teams work incredibly hard, and sending them an empty spreadsheet to fill in feels like a human rights violation. To be clear, the process is still onerous for all concerned, but it is trackable and yields year-on-year comparative data.

Spearing impact 

Impact is a slippery customer at the best of times. Colleagues from other research programmes have muttered darkly about “nailing jelly to a wall” when it comes to substantiating impact with evidence. Then there’s policy impact, the wriggliest and most elusive catch. Drawing a distinct line from research uptake to policy influence is complicated by tangled policy impact pathways: the cross-cutting nature of most policy, the multitude of evidence sources, and the political factors that come into play.

Researchfish does not report on ‘impact’ in any meaningful sense, and in my view the core category (influence on policy, practice, patients and the public) reflects a stale and limited conceptualisation of impact, slightly skewed towards the medical sciences. As a workaround, I have used free text fields in other sections to trace causality between engagement activities, innovation, outputs, outcomes and impacts specific to Agile and Sprint objectives, demonstrating what the funding has enabled. But even this is limited: data is scattered across different categories, and Researchfish does not capture the interdisciplinary cohort-building and research culture influence that are key drivers in NERC’s Changing the Environment programme.

For these reasons, Researchfish can feel like a wasted exercise. The trick, therefore, is what you do with the Researchfish data. My approach is to make the most of the ‘key findings’ and ‘non-academic impacts’ narrative statements in the system, where you can provide progress summaries linking key findings and impacts to programme objectives; these can act as a summary for funders.

As a progress monitor, Researchfish can provide quantitative data on the types of publications, activities, tools, and models generated, and how these change in line with our iterations of Sprint learning. When combined with our monitoring, evaluation and learning data and qualitative researcher and stakeholder feedback, Researchfish data can act as the framework for nuanced programme-level reports to share with our funders and inform strategic decisions. 

Inspiration 

As a by-product of working out how best to meaningfully report on our ambitious programme, Agile has inspired me with the policy influence and practitioner impact it has achieved to date.  

Sprint research has:

  • helped to shape climate change policies in Brazil and Northern Ireland
  • created tools such as nature-based solutions opportunity maps and community engagement guidance that enable practitioners to effectively deploy place- and community-sensitive nature recovery strategies 
  • produced an algorithm capable of radically reducing Earth System Model ‘spin-up’ computational time, enabling faster and more accurate predictions of human-made impacts on the global climate system.

These are just a few examples. 

Uncertainty and urgency 

Both Agile and Researchfish are ending. It’s been a wild ride. Like the programme, my role has involved experimentation, iteration, and adaptation to refine our learning on how we put this all into practice. As Agile’s Impact Facilitator, I have been learning, providing advice and reporting on our experimental trajectory in an institution and reporting landscape that are not set up for programmes such as Agile. We don’t yet know about Researchfish’s replacement, or how the Agile Sprint model will align with future funding priorities. Our challenge now is to embed our learning within a sector wrestling with uncertainty. 

Within the University, the Sprint model could have multiple applications, such as a means of accelerating policy-engaged research careers, or a generator of spinouts. However, the most fundamental lesson I have taken from Agile and the current era of instability is the importance of academic research providing much-needed advice and solutions for urgent societal challenges. This requires changes to the research model, its environment, and formal mechanisms for meaningful reporting on impact. This is why Agile’s learning is so important now.