The incredible value of rapid review synthesis contributions #synthesis #openscience #evidencebased

Contemporary, complex issues in society and natural systems are best examined and informed by evidence. A recent contribution explored two direct questions associated with the COVID-19 pandemic and pets; the description of the two driving questions and the link to the paper are provided here. This is a brilliant tool for informing decision making and planning when a decision is needed immediately. One can envision numerous contexts wherein this tool can be applied to crisis management, or to situations in other disciplines in ecology and the environment wherein a rapid response is needed to mitigate a collapse or challenge. If you click through and read the review, there are several salient elements in this unique reporting format.

Key elements of a rapid synthesis review

  1. Apply a standard and replicable synthesis workflow. Here is the one I use, but any similar approach is viable provided it supports a preferred reporting items schema for meta-analyses and systematic reviews (e.g. PRISMA); other guidelines also exist.
  2. Scrape the literature with direct, testable questions as the focus. I propose that testable be defined as need-to-know for the crisis, but the examples I have seen also ensure that questions can be answered in relatively binary terms and with a relatively high capacity to disambiguate related terms. You do not have the luxury of time to explore related concepts and synonyms.
  3. Use these questions to rigorously review the literature rapidly but transparently.
  4. Summarize the landscape of findings clearly by providing the number of publications and key sample sizes including the relative frequencies of key term conjunctions and concepts.
  5. Given the need for a rapid response, answer the questions with the number of studies rather than effect sizes. Describe the number of samples and the methodology of the studies that warrant description of outcomes to address a question.
  6. Finally, I would prefer to see a clear statement at the end of a rapid review reminding the reader/decision-maker of (a) the different forms of ‘no effect’ conclusions – i.e. limited evidence, many tests but no significant effects reported, likely heterogeneity, or a limited subset of methods; and (b) a clear re-statement of the scope of the questions – i.e. how far the answers can be generalized based on the evidence summarized. In the example of COVID-19 and pets, describe the taxonomic diversity in the literature to be able to conclude safely, i.e. with reasonable confidence, that pets are not vectors. This was evident in the reported results, but it is worthwhile to restate.
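Steps 4 and 5 above amount to simple tallies rather than effect-size modelling. As a minimal sketch, with entirely made-up records and term sets (not data from the COVID-19 review), counting publications per question and the relative frequencies of key-term conjunctions can look like this:

```python
from collections import Counter
from itertools import combinations

# Hypothetical screened records: (question_id, set of key terms per study).
records = [
    ("Q1", {"covid-19", "cat", "transmission"}),
    ("Q1", {"covid-19", "dog"}),
    ("Q2", {"covid-19", "cat", "susceptibility"}),
]

# Step 4: number of publications addressing each question.
pubs_per_question = Counter(q for q, _ in records)

# Relative frequency of key-term conjunctions (pairs) across all records.
pair_counts = Counter(
    pair for _, terms in records for pair in combinations(sorted(terms), 2)
)
n = len(records)
pair_freqs = {pair: count / n for pair, count in pair_counts.items()}

print(pubs_per_question["Q1"])          # 2 studies address Q1
print(pair_freqs[("cat", "covid-19")])  # the pair occurs in 2 of 3 records
```

The same tallies, scaled up, give the landscape summary in step 4 and the study counts used to answer the questions in step 5.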

Innovations for the @ESA_org #ESA2020 annual meeting #FlattenTheCurve #ecology #openscience

The annual meetings of the Ecological Society of America are brilliant. These conventions provide the opportunity to network, collaborate, communicate, connect early-career researchers with new collaborators, and most importantly to co-learn and connect as a community. We are scientists, citizens, educators, and a group that can effect social good. The annual meeting this year, in Salt Lake City, Utah, is entitled ‘Harnessing the ecological data revolution’. However, as citizens, we also have a moral responsibility to #flattenthecurve of a pandemic, and there is uncertainty about the duration of its spread (and of the mitigation strategies proposed), with predictions ranging from 6 weeks to many months. We have an opportunity to do social good. Even more importantly, we have an incentive to experiment. After all, we are scientists, and all large meetings come with benefits but also challenges: navigating a large convention (an app helps), rushing between talks, choosing between concurrent sessions, a high relative carbon footprint and other costs of attendance, and now a potential moral imperative to consider alternatives. The theme of the meeting this year is an ideal opportunity to experiment with alternatives. I propose that we test a new way to meet this year. We can even collect data from all participants on the efficacy of the strategies tested, and thus augment and improve future meetings that can reuse some of these approaches, whether face-to-face or distributed. A remote meeting with reduced registration costs, for instance, can provide the following benefits:
a. protect the society from cancellations and loss of entire fees,
b. increase accessibility to those that could not attend because of net costs or other reasons, and
c. broaden our reach outside our community with some components of meeting being made fully accessible.

Innovations

| Item | Innovation | Description |
|------|------------|-------------|
| 1 | Virtual poster sessions | The Entomological Society of America already does this. |
| 2 | Pre-recorded short talks | Reduce all talks to 20 slides, 6 minutes, pre-recorded, and post to an ESA2020 YouTube channel. Allow individuals to post whatever content they are comfortable providing (deck only, deck with audio or notes, deck with video, etc.). |
| 3 | Define a common slide-deck repository | F1000, Figshare, and others provide a DOI for items. Benefits include citable objects, common tags, and rules for including the session number in titles to enhance discoverability. |
| 4 | Each session co-authors a preprint | Identify two leads per contributed oral session/symposium, and all presenters co-author a short preprint for the session and publish it to EcoEvoRxiv. |
| 5 | Provide extemporaneous virtual meetings | It is fantastic to bump into people at meetings. Provide a tool in some format and define mingling times. Every meeting should have this, whether in-person or remote. Health, child care, accessibility, costs of food, social pressure to meet in bars or other contexts, etc. can be addressed with this tool. We can be more inclusive. |
| 6 | Provide a code of conduct | This needs to be provided for every meeting in every context. Develop one for this meeting, and we can co-learn as a community how to better respect and enable representation and diversity. Consider a volunteer moderator to monitor online discussion and support positive dialogue at all times. |
| 7 | Stream live workshops | Use Zoom, Twitch, or any platform to enable and stream live coding, writing, training, and discovery events. |
| 8 | Capitalize on social media (appropriately) | Use social media more effectively. Pre-designate tags, run an Instagram photo contest drawn from the study you present, consider a code-snippet sharing event, and run a cartoon or outreach contest. |
| 9 | Hackathons or kaggles | This meeting is about the data revolution. Define and plan a kaggle or two and a few hackathons. We can collaborate and make the presentation process less one-way and fixed and much more interactive. |
| 10 | Publish data now | The ESA can use this meeting to advance the revolution by providing incentives to share whatever scientific products individuals are able to provide (data, data snippets, code, workflows, syntheses, etc.) at their current point in career, degree, or research process. |

Tips for student reports using online conferencing tools

Video and web conferencing tools will be used for upcoming undergraduate and graduate student progress reports at various institutions beginning this week. With my collaborators, graduate students will be using Zoom specifically to run their annual progress reports, which comprise a 12-15 minute presentation followed by discussion and questions. Undergraduate thesis researchers will also present their thesis research and field questions to complete their course requirements. Previously, I have tried to understand how best to use these tools, including recordings, to present mini-conference talks, but the goal was primarily rapid, informal communication with limited need for dialogue. Consequently, I have been considering how best to support the team in the next few weeks through more effective use of video and web conferencing. Through trial-and-error this past week, here are some ideas to consider if you are about to employ similar tools.

Tips

  1. Set up the conferencing session with a buffer of at least 15 additional minutes.
  2. Log in early, and test audio and video. I prefer headphones with mic to avoid reverberated sound. The presenter should also test turning on and off screen sharing.
  3. Check the settings when you set up the meeting. Confirm whether you want participants to be able to log in muted and/or with video and in advance of the host. I choose muted entry and allow log-in before the host, in case the host is late.
  4. Designate a host for a meeting. With Zoom, only the host can record, and this can be handy (at least for the saved chat if not video).
  5. Discuss with all participants the rules of conduct briefly for the meeting (including whether recorded or not), and the host should introduce each participant in the conference with a brief hello or response from each to ensure everyone is seen or heard (and also that the settings worked).
  6. Mute yourself when not speaking.
  7. Consider turning off video, or at least discuss it, because video can be very distracting to presenters. There is also the possibility that this improves sound quality (apparently participants tolerate poor or no video but not poor audio in meetings).
  8. Use the chat. The host should monitor it throughout student presentations in case someone needs to flag a connection problem without interrupting the speaker.
  9. Provide an informal, shortened practice session for students that you host and monitor in advance of the formal process. This is particularly important if the student is being graded.
  10. Test a back-up plan.

@ESA_org #ESA2020 abstract: The even bigger picture to contemporary scientific syntheses

Background/Question/Methods 

Scientific synthesis is a rapidly evolving field of meta-science pivotal to numerous dimensions of the scientific endeavor and to society at large. In science, meta-analyses, systematic reviews, and evidence mapping are powerful explanatory means to aggregate evidence. However, direct compilation of existing primary evidence is also increasingly common to explore the big picture for pattern and process detection and is used to augment more common synthesis tools. Meta-analyses of primary study literature can be combined with open data assets reporting frequency, distribution, and traits of species. Climate, land-use, and other measures of ecosystem-level attributes can also be derived to support literature syntheses. In society, evidence-based decision making is best served through a diversity of synthesis outcomes in addition to meta-analyses and reviews. The hypothesis tested in this meta-science synthesis is that the diversity of tools and evidence applied to scientific syntheses has changed in contemporary ecology and the environmental sciences to more comprehensively reuse and incorporate evidence for knowledge production.

Results/Conclusions

Case studies and a formal examination of the scope and extent of the literature reporting scientific synthesis as the primary focus in the environmental sciences and ecology were conducted. Topically, nearly 700 studies use scientific synthesis in some capacity in these two fields. Specifically, fewer than a dozen formally incorporate disparate evidence to connect related concepts. Meta-analyses and formal systematic reviews number over 5000 publications. Syntheses and aggregations of existing published aggregations are relatively uncommon, at fewer than 10 instances. Reviews, discussions, forums, and notes examining synthesis in these two fields are also frequent, at 2500 offerings. Analyses of contemporary subsets of all these publications in the literature identified at least three common themes: reuse and reproducibility; effect sizes and strength of evidence; and a comprehensive need for linkages to inform decision making. Specific novel tools used to explore derived data for evidence-based decision making in the environmental sciences and ecology included evidence maps, summaries of lessons, identification of flagship studies in the environmental sciences that transformed decision making, reporting of sample sizes at many levels that supported effect size calculations, and finally, reporting of a path forward not just for additional research but for application. Collectively, this meta-synthesis of research demonstrated an increasing capacity for diverse scientific syntheses to inform decision making for the environmental sciences.

Session
Meta-Analysis and Beyond: Applying Big Secondary Data to Environmental Decision-Making

Steps to update a meta-analysis or systematic review

You completed a systematic review or meta-analysis using a formal workflow. You won the lottery in big-picture thinking and perspective. However, time passes during peer review or revision (and most fields are prolific, publishing monthly to your journal set). You need to update the process. Here is a brief workflow for doing so.

Updating a search

  1. Revisit the same bibliometrics tool initially used, such as Scopus or Web of Science.
  2. Record date of current instance and contrast with previous instance documented.
  3. Repeat the queries with exactly the same terms. Use the ‘refine’ function or a specific ‘timespan’ restricted to the years since the last search. For instance, the last search for an ongoing synthesis was in Sept 2019, and we are revising now in Jan 2020. Typically, I use a fuzzy filter and just do 2019-2020; this will generate some overlap. The R package wosr is an excellent resource for interacting with Web of Science and enables reproducibility. The function ‘query_wos’ is fantastic, and you can specify the timespan in the query using PY = (2019-2020).
  4. Use a resource that reproducibly enables matching to explore overlaps between the first set of studies examined and the current updated search. I use the R package bibliometrix and its function ‘duplicatedMatching’, and if there is uncertainty, I then manually check via DOI matching using R code.
  5. Once you have generated your setdiff, examine new entries, collect data, and update both meta-data and primary dataframes.
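The matching in steps 4 and 5 is essentially a set difference on normalized identifiers. The post uses the R packages wosr and bibliometrix for this; as a language-agnostic illustration with hypothetical records (the fields are invented, not a wosr or bibliometrix export format), a minimal Python sketch that de-duplicates on DOI and extracts the new entries could look like:

```python
# Find records in the updated search that were not in the original set,
# matching on a normalized DOI so trivial formatting differences
# (case, resolver prefix) do not block a match.

def norm_doi(doi):
    """Lowercase, trim, and strip a doi.org resolver prefix."""
    return doi.strip().lower().removeprefix("https://doi.org/")

previous = [
    {"doi": "10.1000/aaa", "year": 2018},
    {"doi": "10.1000/bbb", "year": 2019},
]
updated = [
    {"doi": "https://doi.org/10.1000/BBB", "year": 2019},  # overlaps previous set
    {"doi": "10.1000/ccc", "year": 2020},                  # genuinely new entry
]

seen = {norm_doi(r["doi"]) for r in previous}
new_entries = [r for r in updated if norm_doi(r["doi"]) not in seen]

print([r["doi"] for r in new_entries])  # only the 2020 record remains
```

Only the `new_entries` then need screening and data extraction before updating the meta-data and primary dataframes.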

Implications

  1. Science is rapidly evolving, and upwards of 1200 publications per month appear in some disciplines.
  2. Consider adding a search date to your dataframe. It would be informative to examine the rate that one can update synthesis research.
  3. Repeat formal syntheses, and test whether outcomes are robust.
  4. Examine cumulative meta-analytical statistics.
  5. Ensure your code/workflow for synthesis is resilient to change and replicable through time – you never know how long reviews will take if you are trying to publish your synthesis.
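On the cumulative meta-analytical statistics mentioned in implication 4, the simplest version is to recompute a summary effect as each year's studies are added. A minimal Python sketch with invented, unweighted effect sizes (a real cumulative meta-analysis would use inverse-variance weighting):

```python
# Recompute a running mean effect size as studies accumulate through time,
# ordered by year. The (year, effect size) pairs below are made-up values.

effects_by_year = [(2017, 0.40), (2018, 0.20), (2019, 0.30), (2020, 0.10)]

cumulative = []
total = 0.0
for i, (year, es) in enumerate(sorted(effects_by_year), start=1):
    total += es
    cumulative.append((year, round(total / i, 3)))

print(cumulative)  # [(2017, 0.4), (2018, 0.3), (2019, 0.3), (2020, 0.25)]
```

If the cumulative estimate stabilizes as years are added, the synthesis outcome is robust to updating; if it drifts, the update materially changes the conclusion.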