
Decisions aren't always easy, but it definitely helps if you have good evidence to base them on. And the best evidence is that which is available, understandable, useful and assessable.
Vangelis Danopoulos is one of the researchers from the Department of Pure Mathematics and Mathematical Statistics who are working out how best to translate scientific evidence to inform policy makers – and you or me – when we make decisions that affect our lives.
Danopoulos is part of a research group led by Sir John Aston, Harding Professor of Statistics in Public Life. The team are developing cutting-edge statistical techniques with the aim of helping policy makers and people in public life use the best quantitative evidence available. They apply a very wide range of techniques to many settings – for example, using spatial statistics to research how and why house prices change over time, or applying innovative image analysis techniques to medical imaging to improve diagnosis and outcomes for patients.
Evidence synthesis
Danopoulos sits on the policy side of Aston's research group, using his background in evidence synthesis to bring together and critically appraise evidence with the aim of informing policy decisions. Researchers often review studies from across an area to build an overview of the current state of research in that field. But the systematic reviews that Danopoulos conducts are different: rather than giving an overview, the purpose of these is to answer a specific research question. If the evidence used to answer this question is numerical it can be statistically analysed to produce a meta-analysis.
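When the collected evidence is numerical, a meta-analysis typically combines each study's effect estimate, weighting more precise studies more heavily. Below is a minimal sketch of fixed-effect, inverse-variance pooling – one standard approach, not necessarily the method used in Danopoulos's reviews – with invented study numbers for illustration:

```python
import math

def fixed_effect_meta(effects, std_errors):
    """Fixed-effect (inverse-variance) meta-analysis.

    Each study's effect estimate is weighted by 1/SE^2, so studies
    with smaller standard errors contribute more to the pooled result.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Three hypothetical studies: effect estimates and their standard errors.
effects = [0.5, 0.7, 0.4]
std_errors = [0.2, 0.3, 0.25]

pooled, pooled_se = fixed_effect_meta(effects, std_errors)
# The pooled estimate sits between the individual study estimates,
# and its standard error is smaller than any single study's –
# this is how pooling increases the precision of the results.
print(f"pooled effect = {pooled:.3f} (SE {pooled_se:.3f})")
```

Real meta-analyses add further steps this sketch omits, such as checking how much the studies disagree with one another (heterogeneity) and, where they do, using a random-effects model instead.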
Evidence synthesis methods were first created for medical and healthcare research. "For example, if we have different [treatments] for a disease we want to look at the different studies and bring the data together to find which one is best," says Danopoulos. Nowadays these methods are used in other fields as well.
"My first systematic reviews were in environmental science and environmental health – a series of systematic reviews around the level of microplastics in food." Most recently Danopoulos and his colleagues have finished a systematic review looking at policy recommendations about the use of artificial intelligence in achieving net zero. "We were specifically looking at wind power and hydrogen energy and improving the energy efficiency of lorries. So [systematic reviews] are used in very diverse and very different fields from health care."
Transparency
"What makes systematic reviews so useful, and the reason I like them so much, is that transparency is baked into the process," says Danopoulos. At every step – finding all the available evidence from primary research, appraising the quality of this evidence and finally bringing it all together to answer the research question – you need to record the process that you used and justify the decisions that you made. "What you get at the end [of the systematic review] is a very clear report of what you have done, why you have done it, and justifying all your decisions along the way."
While it could be seen as "pedantic", it is this rigour that makes the systematic review reproducible, a vital aspect of scientific research. The methods allow Danopoulos and others to extend the primary research, using the larger pool of collected data to answer wider research questions, increase our confidence in the findings or increase the precision of the results. "It makes our research findings more robust."
Danopoulos and his colleagues have also made their approach available for others to use. During their recent research on the AI for Net Zero project they needed a standardised way of assessing evidence communication and policy-making recommendations. In response, they developed a new standardised tool, Evidence Communication Rules for Policy (ECR-P), which is also available as an app. The ECR-P is based on work by the Winton Centre for Risk and Evidence Communication and provides a structured set of around 25 questions that assess a paper according to the Winton Centre's Five rules of evidence communication. The tool can be used during a systematic review to appraise not just the quality of the evidence, but also the evidence communication and any policy recommendations. In their findings they noted that some studies contained policy recommendations that were not well connected to the research being presented, and that sometimes evidence was not communicated well.
Good communication
"[The ECR-P], although tailored for the needs of a systematic review, can also be used by any researcher who wants to communicate their policy recommendations. [It is a] guide of how to create the best possible policy recommendation, both in terms of quality of communication, but also quality of the policy recommendations themselves."
Danopoulos has also been talking with policy and decision makers about how ECR-P can help them. "We're thinking of creating a more easy-to-use tool specifically targeted at policy makers. So if you are a policy analyst in any setting, you can pick up a set of, let's say, ten questions that will help you appraise the quality of policy making recommendations in a scientific paper in a simple way."
Understanding both sides of this conversation, between researchers and policy makers, is important in taking this research forward. Danopoulos is able to draw on his experience as a policy analyst assessing medical interventions for the National Institute for Health and Care Excellence (NICE), as well as his experience as an academic researcher. "I do enjoy it. If you want to create something for policy makers we have to understand what their needs are and what they would find helpful. We can't just speak from the top of the academic mountain. We need to figure out what they, and we, want."