The incredible value of rapid review synthesis contributions #synthesis #openscience #evidencebased

Contemporary, complex issues in society and natural systems are best examined and informed by evidence. A recent contribution explored two direct questions associated with the COVID-19 pandemic and pets. A description of the two driving questions and a link to the paper is provided here. This is a brilliant tool for informing decision making and planning when a decision is needed immediately. One can envision numerous contexts in crisis management, or in other disciplines in ecology and the environment, where this tool could be applied when a rapid response is needed to mitigate a collapse or challenge. If you click through and read the review, there are several salient elements in this unique reporting format.

Key elements of a rapid synthesis review

  1. Apply a standard, replicable synthesis workflow. Here is the one I use, but any similar approach is viable provided it supports a preferred reporting items schema for systematic reviews and meta-analyses (i.e. PRISMA); other guidelines also exist.
  2. Scrape the literature with direct, testable questions as the focus. I propose that 'testable' be defined as need-to-know for the crisis at hand, but the examples I have seen also ensure that the questions can be answered in relatively binary terms and that key terms can be readily disambiguated. You do not have the luxury of time to explore related concepts and synonyms.
  3. Use these questions to rigorously review the literature rapidly but transparently.
  4. Summarize the landscape of findings clearly by reporting the number of publications, key sample sizes, and the relative frequencies of key-term conjunctions and concepts (see the sketch after this list).
  5. Given the need for a rapid response, answer the questions with counts of studies rather than effect sizes. Describe the sample sizes and methodology of the studies whose outcomes address each question.
  6. Finally, I would prefer to see a clear statement at the end of a rapid review reminding the reader/decision-maker of (a) the different forms of 'no effect' conclusions – i.e. limited evidence, many tests but no significant effects reported, likely heterogeneity, or a limited subset of methods – and (b) the scope of the questions – i.e. how far the answers can be generalized based on the evidence summarized. In the example of COVID-19 and pets, describing the taxonomic diversity in the literature is needed to conclude safely, i.e. with reasonable confidence, that pets are not vectors. This was evident in the reported results, but it is worthwhile to restate.
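
To make item 4 concrete, here is a minimal R sketch of summarizing the evidence landscape. It assumes a hypothetical data frame `studies` with one row per screened paper and columns `abstract` and `n_animals`; the term list is illustrative only and should be replaced with the terms from your own questions.

```r
library(dplyr)
library(stringr)

# Landscape summary: how many publications, and what sample sizes do they report?
studies %>%
  summarise(papers = n(),
            total_sample = sum(n_animals, na.rm = TRUE))

# Relative frequency of key-term conjunctions across abstracts
terms <- c("cat", "dog", "transmission", "antibod")
pairs <- expand.grid(a = terms, b = terms, stringsAsFactors = FALSE) %>%
  filter(a < b)

pairs$freq <- apply(pairs, 1, function(p) {
  mean(str_detect(studies$abstract, regex(p["a"], ignore_case = TRUE)) &
       str_detect(studies$abstract, regex(p["b"], ignore_case = TRUE)))
})
pairs
```

Reporting these simple proportions alongside the publication count keeps the summary transparent without requiring formal effect-size extraction.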

Steps to update a meta-analysis or systematic review

You completed a systematic review or meta-analysis using a formal workflow. You won the lottery in big-picture thinking and perspective. However, time passes during peer review or revision (and most fields add publications to your journal set prolifically, even monthly). You need to update the search. Here is a brief workflow for that process.

Updating a search

  1. Revisit the same bibliometrics tool initially used, such as Scopus or Web of Science.
  2. Record the date of the current search instance and contrast it with the previously documented instance.
  3. Repeat the queries with exactly the same terms. Use the 'refine' function or a specific 'timespan' restricted to the years since the last search. For instance, the last search for an ongoing synthesis was Sept 2019, and we are revising now in Jan 2020. Typically, I use a fuzzy filter and just do 2019-2020; this will generate some overlap. The R package wosr is an excellent resource for interacting with Web of Science and enables reproducibility. The function 'query_wos' is fantastic, and you can specify the timespan in the query string with PY = (2019-2020) (see the sketch after this list).
  4. Use a tool that reproducibly matches records to identify overlap between the first set of studies examined and the current updated search. I use the bibliometrix function 'duplicatedMatching', and if there is uncertainty, I then check manually via DOI matching in R.
  5. Once you have generated the set difference, examine the new entries, collect data, and update both the metadata and primary data frames.

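Here is a minimal R sketch of steps 3-5. It assumes the wosr and bibliometrix packages, Web of Science API credentials stored in environment variables, a hypothetical topic query, and a hypothetical file of DOIs saved from the first search instance; adapt the query terms, field names, and file paths to your own synthesis.

```r
library(wosr)
library(bibliometrix)

# Authenticate against the Web of Science API (reads WOS_USERNAME / WOS_PASSWORD)
sid <- auth()

# Step 3: rerun the exact original query, fuzzily restricted to the update window
q <- 'TS = ("companion animal*" AND "coronavirus") AND PY = (2019-2020)'
query_wos(q, sid = sid)            # record count to report in the updated PRISMA diagram
new_hits <- pull_wos(q, sid = sid) # full records; the publication table includes DOIs

# Step 4: compare against the DOIs saved from the first search instance
old_dois <- readRDS("dois_sept2019.rds")
new_dois <- unique(new_hits$publication$doi)

# Step 5: the set difference is the batch of genuinely new studies to screen
to_screen <- setdiff(new_dois, old_dois)
length(to_screen)

# If you work from bibliometrix exports instead, bind the old and new record sets,
# run duplicatedMatching(), and verify uncertain cases manually by DOI:
# M_all    <- rbind(M_old, M_new)
# M_unique <- duplicatedMatching(M_all, Field = "TI", exact = FALSE, tol = 0.95)
```

Saving the DOI list (or the full record set) at each search instance is what makes the next update a quick set operation rather than a full re-screen.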
Implications

  1. Science is rapid and evolving; upwards of 1200 publications per month appear in some disciplines.
  2. Consider adding a search date column to your data frame. It would be informative to examine the rate at which synthesis research can be updated (see the first sketch after this list).
  3. Repeat formal syntheses, and test whether outcomes are robust.
  4. Examine cumulative meta-analytical statistics (see the second sketch after this list).
  5. Ensure your code/workflow for synthesis is resilient to change and replicable through time – you never know how long reviews will take if you are trying to publish your synthesis.
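
On point 2, here is a small sketch of tracking search dates, assuming a `studies` data frame that accumulates rows with each update; the `search_date` column and the example date are hypothetical additions.

```r
library(dplyr)

# Stamp this update's entries with the date of the current search instance
new_entries <- new_entries %>%
  mutate(search_date = as.Date("2020-01-15"))

studies <- bind_rows(studies, new_entries)

# How many new papers did each update add?
studies %>%
  count(search_date, name = "new_papers")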
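On point 4, the metafor package supports cumulative meta-analysis via `cumul()`. This is a minimal sketch using its bundled `dat.bcg` example data; with your own synthesis you would order by search or publication date instead.

```r
library(metafor)

# Effect sizes (log risk ratios) for the bundled BCG vaccine trials
dat <- escalc(measure = "RR", ai = tpos, bi = tneg, ci = cpos, di = cneg,
              data = dat.bcg)

# Random-effects model, then a cumulative meta-analysis ordered by publication year
res <- rma(yi, vi, data = dat)
cum <- cumul(res, order = order(dat$year))

cum         # pooled estimate as each study is added
forest(cum) # forest plot of the cumulative estimates
```

Plotting the cumulative estimates is a quick check on whether the overall conclusion is robust to the newly added studies or still drifting as evidence accumulates.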