The incredible value of rapid review synthesis contributions #synthesis #openscience #evidencebased

Contemporary, complex issues in society and natural systems are best examined and informed by evidence. A recent contribution explored two direct questions associated with the COVID-19 pandemic and pets. A description of the two driving questions and a link to the paper are provided here. This is a brilliant tool for informing decision making and planning when a decision is needed immediately. One can envision numerous contexts in which this tool could be applied to crisis management, or to situations in other disciplines in ecology and the environment where a rapid response is needed to mitigate a collapse or challenge. If you click through and read the review, there are several salient elements in this unique reporting format.

Key elements of a rapid synthesis review

  1. Apply a standard and replicable synthesis workflow. Here is the one I use, but any similar approach is viable provided it supports a preferred reporting items schema for systematic reviews and meta-analyses (i.e. PRISMA); other guidelines also exist.
  2. Scrape the literature with direct, testable questions as the focus. I propose that testable be defined as need-to-know for the crisis, but the examples I have seen also ensure that the questions can be answered in relatively binary terms and that key terms can be readily disambiguated from related ones. You do not have the luxury of time to explore related concepts and synonyms.
  3. Use these questions to review the literature rigorously, rapidly, and transparently.
  4. Summarize the landscape of findings clearly by providing the number of publications and key sample sizes, including the relative frequencies of key term conjunctions and concepts (a minimal tallying sketch is provided after this list).
  5. Given the need for a rapid response, answer the questions with the number of studies rather than effect sizes. Describe the sample sizes and methodology of the studies whose outcomes address each question (a vote-count style summary is sketched after this list).
  6. Finally, I would prefer to see a clear statement at the end of a rapid review reminding the reader/decision-maker that a. there are different forms of ‘no effect’ conclusions – i.e. limited evidence, many tests but no significant effects reported, likely heterogeneity, or a limited subset of methods; and b. the scope of the questions should be restated – i.e. how far the answers can be generalized based on the evidence summarized. In the example of COVID-19 and pets, this means describing the taxonomic diversity in the literature so that one can safely, i.e. with reasonable confidence, conclude that pets are not vectors. This was evident in the reported results, but it is worthwhile to restate.
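
To make the landscape summary in point 4 concrete, here is a minimal sketch in Python. It assumes a hypothetical CSV of screened records (records.csv) with title and abstract columns, and illustrative term groups; none of these names or terms come from the paper itself.

```python
# Minimal sketch: tally publications and key term conjunctions from a
# hypothetical CSV of screened records (assumed columns: 'title', 'abstract').
import csv
from itertools import combinations
from collections import Counter

# Illustrative term groups for a COVID-19-and-pets style question (assumptions).
TERM_GROUPS = {
    "virus": ["sars-cov-2", "covid-19", "coronavirus"],
    "pet": ["cat", "dog", "companion animal"],
    "transmission": ["transmission", "vector", "infection"],
}

def groups_present(text):
    """Return the set of term groups with at least one hit in the text."""
    text = text.lower()
    return {g for g, terms in TERM_GROUPS.items() if any(t in text for t in terms)}

pair_counts = Counter()
n_records = 0
with open("records.csv", newline="", encoding="utf-8") as fh:
    for row in csv.DictReader(fh):
        n_records += 1
        present = groups_present(row.get("title", "") + " " + row.get("abstract", ""))
        # Count every pairwise conjunction of term groups found in this record.
        for pair in combinations(sorted(present), 2):
            pair_counts[pair] += 1

print(f"Publications screened: {n_records}")
for (a, b), n in pair_counts.most_common():
    print(f"{a} + {b}: {n} records ({n / n_records:.0%})")
```

The output is simply the number of screened publications and how often each pair of concepts co-occurs, which is the kind of relative-frequency landscape a rapid review can report without any effect-size calculations.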
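
Similarly, for point 5, a vote-count style answer can be tabulated straight from the screening spreadsheet rather than from effect sizes. This sketch assumes a hypothetical file (screened_studies.csv) with 'question', 'outcome', and 'sample_size' columns; the file and column names are illustrative.

```python
# Minimal sketch: summarize the number of studies per question and outcome
# (vote counting) instead of effect sizes, from a hypothetical screening
# spreadsheet with assumed columns 'question', 'outcome', and 'sample_size'.
import csv
from collections import defaultdict

counts = defaultdict(lambda: defaultdict(int))  # question -> outcome -> n studies
samples = defaultdict(int)                      # question -> combined sample size

with open("screened_studies.csv", newline="", encoding="utf-8") as fh:
    for row in csv.DictReader(fh):
        q, outcome = row["question"], row["outcome"]  # e.g. 'effect', 'no effect'
        counts[q][outcome] += 1
        samples[q] += int(row.get("sample_size") or 0)

for q, outcomes in counts.items():
    total = sum(outcomes.values())
    detail = ", ".join(f"{o}: {n}" for o, n in sorted(outcomes.items()))
    print(f"{q}: {total} studies ({detail}); combined sample size {samples[q]}")
```

Reporting the split of outcomes alongside combined sample sizes also makes it easier to write the closing caveats in point 6, because the reader can see at a glance whether a ‘no effect’ answer rests on many studies or very few.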