How to utilize rapid evidence review in the public sector

By Jess Silverman

In the fast-paced world of government, public servants often have to rapidly digest and understand information on topics outside their area of expertise. In our most recent InnovateUS workshop, 92 public servants from across the country heard from Peter Bragge, Director of Monash Sustainable Development Institute’s Evidence Review Service (ERS), about how to utilize rapid evidence review to get up to speed on existing solutions to a problem.

Peter specializes in evidence reviews to support better decision-making for policy and practice. His work delivers research and practice reviews to a diverse range of government and industry organizations, including the Australian Department of Health, the Australian Commission on Safety and Quality in Health Care, and the Royal Commission into Victoria’s Mental Health System. He also leads Monash Sustainable Development Institute’s collaboration with McMaster University in Canada to build the world’s largest evidence resource for the Sustainable Development Goals (SDGs): Social Systems Evidence (SSE).


Peter began the 90-minute workshop by asking participants to identify where they were from, what department they worked in, and to describe their experience with research reviews. Responses varied greatly, with participants working in all kinds of fields, from human services to energy to emerging infections. The public servants in the workshop represented the country from coast to coast, with participants calling in from states including California, New Jersey, Illinois, and Colorado.

He then established the key goals and messages that he would explain further in his presentation. These included: 

  • Getting the question right is the most important step in any research review
  • Looking for research reviews will get you further faster
  • Research varies in quality
  • Research is not the only input into good decision making

Based on his experience working with government and other organizations outside of academia, Peter described the complexity of defining the term “evidence.”

“Even that term [evidence] is really complex,” he said. “A testimony of an expert is evidence, the findings of an order is evidence, information from a consulting report is evidence … When I use the term evidence I mean university research, but I know other people use the term evidence in different ways.”

The presentation guided participants through the five principles of evidence reviews. 

  1. Understand the scope and context
  2. Get the question right
  3. Look for needles, not haystacks
  4. Quality matters
  5. Research doesn’t have all the answers

In his explanation of each principle, Peter emphasized the importance of knowing the scope of what you are researching and making sure you are reviewing the appropriate topic to get the result you want. 

“Two of the five principles are about getting the question right. One of the conundrums is when you’ve got a very limited amount of time. But that’s the time when you really need to get the question right,” he said.

Getting the question right is all about context. Peter explained that it’s important to consider who wants to know the answer to the question, what is already known or decided about the question, what they want to do with the question, and what the other decision-making inputs are.

He then discussed the importance of “looking for needles, not haystacks.” This part of the presentation helped participants identify where to start when looking for evidence reviews. He directed participants to several useful resources for accessing these kinds of reviews, including Google Scholar, Social Systems Evidence, and the Cochrane Library, among others. 

However, he warned participants that not all reviews are high quality and that it is essential to pay attention to the source material you are using before incorporating it into your work. Before trusting a resource, Peter said, be sure to examine the credibility of the publisher, the credibility of the institution, and the credibility of the authors.

“Just because it’s research doesn’t mean it’s good … There is plenty of bad research out there and the peer review system is not perfect,” Peter said.

As Artificial Intelligence, or AI, continues to develop, critics sometimes write off the importance of Peter’s efforts, believing the technology can already do the work his team does. However, Peter explained to his audience that AI is not yet able to replace that work. While it can make time-consuming tasks easier, there are still essential components of the process that it cannot handle.

“If [ChatGPT] can do some of the mundane tasks, I can do some things I believe ChatGPT cannot do, which is working out what the right question is, interpreting the local context, understanding the other inputs, and decision making,” he said.

Following the presentation, participants engaged in a 30-minute question-and-answer session with Peter. One public servant asked how they can be sure their review isn’t affected by their own bias. Peter outlined two important strategies. 

“This is why it’s important to describe how you’ve gone about searching for evidence and there are two checks and balances that are really important. One is that you search pretty extensively. The other is that you specify what is going to be included and excluded,” he said. “If you’re working with a small resource budget, you can still employ those principles and you can employ either a second person to make those decisions or you can have an independent expert or peer reviewer look at it.”

Another individual asked how to effectively communicate the results of a research review to stakeholders. Peter suggested utilizing a variety of creative mediums to get complex ideas across. For example, his organization uses a graphic designer. 

“It’s not easy and often we run out of time and money to do it well, but when we do have time and money, we invest in graphic outputs, evidence maps, and the other thing we are exploring are audio digests,” he said. “It’s all about the audience.”

In feedback submitted by participants after the session, 100% of respondents said they would recommend this training to a friend or colleague, and 95% said they would utilize what they learned in their work. Here is what some of them had to say:

“It was interesting, easy to follow, and had practical recommendations that I can use.” -New Jersey participant, mid-career (10-20 years)

“The conversation about communicating with stakeholders was very interesting and helpful.” -Colorado participant, early career (less than 10 years)

“These trainings provide an insight on how to improve my job with every training.” -Maine participant, early career (less than 10 years)

You can view the recording of this workshop here (Passcode: 3y?q4d9&). Sign up here to participate in a future InnovateUS workshop!

Want to be a part of our community of innovators?

We'd love to keep in touch!


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.