Innovations for the @ESA_org #ESA2020 annual meeting #FlattenTheCurve #ecology #openscience

The annual meetings of the Ecological Society of America are brilliant. These conventions provide the opportunity to network, collaborate, communicate, connect early-career researchers with new collaborators, and, most importantly, co-learn and connect as a community. We are scientists, citizens, educators, and a group that can effect social good. This year's meeting, in Salt Lake City, Utah, is entitled ‘Harnessing the ecological data revolution’. However, as citizens, we also have a moral responsibility to #flattenthecurve of a pandemic. There is uncertainty about the duration of its spread (and of the mitigation strategies proposed), with predictions ranging from six weeks to many months. We have an opportunity to do social good. Even more importantly, we have an incentive to experiment. After all, we are scientists, and large meetings certainly come with benefits but also challenges: navigating a large convention (an app helps), rushing between talks, choosing between concurrent sessions, a relatively high carbon footprint and other costs of attendance, and now a potential moral imperative to consider alternatives. The theme of the meeting this year is also an ideal opportunity to experiment with alternatives. I propose that we test a new way to meet this year. We can even collect data from all participants on the efficacy of the strategies tested, and thus augment and improve future meetings that reuse some of these approaches, whether face-to-face or distributed. A remote meeting with reduced registration costs, for instance, can provide the following benefits:
a. protect the society from cancellations and the loss of entire fees,
b. increase accessibility for those who could not attend because of net costs or other reasons, and
c. broaden our reach beyond our community by making some components of the meeting fully accessible.

Innovations

  1. Virtual poster sessions: The Entomological Society of America already does this.
  2. Pre-recorded short talks: Reduce all talks to 20 slides, 6 minutes, pre-recorded, and post them to an ESA2020 YouTube channel. Allow individuals to post whatever content they are comfortable providing (deck only, deck with audio or notes, deck with video, etc.).
  3. Define a common slide deck repository: F1000, Figshare, and others provide a DOI for items. Benefits include citable objects, common tags, and rules for including the session number in the title to enhance discoverability.
  4. Each session co-authors a preprint: Identify two leads per contributed oral session or symposium, and have all presenters co-author a short preprint for the session and publish it to EcoEvoRxiv.
  5. Provide extemporaneous virtual meetings: It is fantastic to bump into people at meetings. Provide a tool in some format and define mingling times. Every meeting should have this, whether in person or remote. Health, child care, accessibility, food costs, and social pressures to meet in bars or other contexts can all be addressed with such a tool. We can be more inclusive.
  6. Provide a code of conduct: This needs to be provided for every meeting in every context. Develop one for this meeting, and we can co-learn as a community how to better respect and enable representation and diversity. Consider a volunteer moderator to monitor online discussion and support positive dialogue at all times.
  7. Stream live workshops: Use Zoom, Twitch, or any platform to enable and stream live coding, writing, training, and discovery events.
  8. Capitalize on social media (appropriately): Use social media more effectively. Pre-designate tags, run an Instagram photo contest drawn from the study work you present, consider a code-snippet-sharing event, and run a cartoon or outreach contest.
  9. Hackathons or kaggles: This meeting is about the data revolution. Define and plan a kaggle or two and a few hackathons. We can collaborate and make the presentation process less one-way and fixed and much more interactive.
  10. Publish data now: The ESA can use this meeting to advance the revolution by providing incentives to share whatever scientific products individuals are able to provide (data, data snippets, code, workflows, syntheses, etc.) at their point in a career, degree, or research process.

Tips for student reports using online conferencing tools

Video and web conferencing tools will be used for upcoming undergraduate and graduate student progress reports at various institutions beginning this week. With my collaborators, graduate students will be using Zoom specifically to run their annual progress reports, which comprise a 12-15 minute presentation followed by discussion and questions. Undergraduate thesis researchers will also present their thesis research and field questions to complete their course requirements. Previously, I have tried to understand how best to use these tools, including recordings, to present mini-conference talks, but the goal was primarily rapid, informal communication with limited need for dialogue. Consequently, I have been considering how best to support the team in the next few weeks through more effective use of video and web conferencing. Through trial and error this past week, here are some ideas to consider if you are about to employ similar tools.

Tips

  1. Set up the conferencing session with a buffer of at least 15 additional minutes.
  2. Log in early, and test audio and video. I prefer headphones with a mic to avoid reverberated sound. The presenter should also test turning screen sharing on and off.
  3. Check the settings when you set up the meeting. Confirm whether you want participants to be able to join muted and/or with video and in advance of the host. I choose muted entry and allow joining before the host, just in case the host is late.
  4. Designate a host for a meeting. With Zoom, only the host can record, and this can be handy (at least for the saved chat if not video).
  5. Briefly discuss the rules of conduct for the meeting with all participants (including whether it will be recorded), and have the host introduce each participant with a brief hello or response from each to ensure everyone can be seen and heard (and that the settings work).
  6. Mute yourself when not speaking.
  7. Consider turning off video, or at least discuss it, because video can be very distracting to presenters. There is also the possibility that this improves sound quality (apparently participants tolerate poor or no video but not poor audio in meetings).
  8. Use the chat. The host should monitor it throughout the student presentations in case someone needs to flag a connection problem without interrupting the speaker.
  9. Provide an informal, shortened practice session for students that you host and monitor in advance of the formal process. This is particularly important if the student is being graded.
  10. Test a back-up plan.

@ESA_org #ESA2020 abstract: The even bigger picture to contemporary scientific syntheses

Background/Question/Methods 

Scientific synthesis is a rapidly evolving field of meta-science pivotal to numerous dimensions of the scientific endeavor and to society at large. In science, meta-analyses, systematic reviews, and evidence mapping are powerful explanatory means to aggregate evidence. However, direct compilation of existing primary evidence is also increasingly used to explore the big picture for pattern and process detection and to augment more common synthesis tools. Meta-analyses of primary study literature can be combined with open data assets reporting the frequency, distribution, and traits of species. Climate, land-use, and other measures of ecosystem-level attributes can also be derived to support literature syntheses. In society, evidence-based decision making is best served through a diversity of synthesis outcomes in addition to meta-analyses and reviews. The hypothesis tested in this meta-science synthesis is that the diversity of tools and evidence used in scientific syntheses has changed in contemporary ecology and the environmental sciences to more comprehensively reuse and incorporate evidence for knowledge production.

Results/Conclusions

Case studies and a formal examination of the scope and extent of the literature reporting scientific synthesis as a primary focus in the environmental sciences and ecology were completed. Topically, nearly 700 studies use scientific synthesis in some capacity in these two fields, yet fewer than a dozen formally incorporate disparate evidence to connect related concepts. Meta-analyses and formal systematic reviews number over 5000 publications. Syntheses and aggregations of existing published aggregations are relatively uncommon, at fewer than 10 instances. Reviews, discussions, forums, and notes examining synthesis in these two fields are also frequent, at 2500 offerings. Analyses of contemporary subsets of all these publications identified at least three common themes: reuse and reproducibility, effect sizes and strength of evidence, and a comprehensive need for linkages to inform decision making. Specific novel tools used to explore derived data for evidence-based decision making in the environmental sciences and ecology included evidence maps, summaries of lessons, identification of flagship studies that transformed decision making, reporting of sample sizes at many levels to support effect size calculations, and reporting of a path forward not just for additional research but for application. Collectively, this meta-synthesis of research demonstrated an increasing capacity for diverse scientific syntheses to inform decision making for the environmental sciences.

Session
Meta-Analysis and Beyond: Applying Big Secondary Data to Environmental Decision-Making

Steps to update a meta-analysis or systematic review

You completed a systematic review or meta-analysis using a formal workflow. You won the lottery in big-picture thinking and perspective. However, time passes during peer review or revision (and most fields publish prolifically, even monthly, within your journal set). You need to update the search. Here is a brief workflow for that process.

Updating a search

  1. Revisit the same bibliometrics tool initially used, such as Scopus or Web of Science.
  2. Record the date of the current search and contrast it with the previously documented search date.
  3. Repeat the queries with the exact same terms. Use the ‘refine’ function or a specific ‘timespan’ restricted to the years since the last search. For instance, the last search for an ongoing synthesis was September 2019, and we are revising now in January 2020. Typically, I use a fuzzy filter and just do 2019-2020; this generates some overlap. The R package wosr is an excellent resource for interacting with Web of Science and enables reproducibility. The function ‘query_wos’ is fantastic, and you can specify the timespan by including PY = (2019-2020) in the query (see the sketch after this list).
  4. Use a resource that reproducibly enables matching to explore overlaps between the first set of studies examined and the current updated search. I use the bibliometrix function ‘duplicatedMatching’, and if there is uncertainty, I then manually check via DOI matching in R.
  5. Once you have generated your setdiff, examine the new entries, collect data, and update both the metadata and primary data frames.
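
A minimal sketch of this update in R, assuming Web of Science API access through wosr and a saved copy of the earlier search; the query string and file name below are hypothetical placeholders, not from the original workflow.

```r
library(wosr)

sid <- auth()  # uses the WOS_USERNAME and WOS_PASSWORD environment variables if set

# Same terms as the original search, restricted to the years since the last search.
query <- 'TS = ("ecological restoration") AND PY = (2019-2020)'
query_wos(query, sid = sid)                         # quick count of matching records
new_pubs <- pull_wos(query, sid = sid)$publication  # publication table includes a doi column

# Previous search results saved with a DOI column (hypothetical file name).
old_pubs <- read.csv("synthesis-search-2019-09.csv", stringsAsFactors = FALSE)

# setdiff on DOIs isolates genuinely new records; the fuzzy 2019-2020 window overlaps.
new_dois  <- setdiff(tolower(new_pubs$doi), tolower(old_pubs$doi))
to_screen <- new_pubs[tolower(new_pubs$doi) %in% new_dois, ]

# For records lacking a DOI, bibliometrix::duplicatedMatching() can fuzzy-match titles.
```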

Implications

  1. Science is rapid and evolving; upwards of 1200 publications per month appear in some disciplines.
  2. Consider adding a search date to your data frame. It would be informative to examine the rate at which one can update synthesis research.
  3. Repeat formal syntheses, and test whether outcomes are robust.
  4. Examine cumulative meta-analytical statistics (see the sketch after this list).
  5. Ensure your code/workflow for synthesis is resilient to change and replicable through time – you never know how long reviews will take if you are trying to publish your synthesis.
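
For the cumulative statistics in point 4, here is a minimal sketch using the metafor package and its bundled BCG example dataset (example data, not data from this post), showing how the pooled estimate can be recomputed as studies accumulate by year.

```r
library(metafor)

# Example data bundled with metafor; compute log risk ratios and fit a random-effects model.
dat <- escalc(measure = "RR", ai = tpos, bi = tneg, ci = cpos, di = cneg, data = dat.bcg)
res <- rma(yi, vi, data = dat)

# Cumulative meta-analysis: the pooled effect is re-estimated as each study is added by year.
cm <- cumul(res, order = order(dat$year))
cm
forest(cm)  # visualize how the estimate stabilizes (or shifts) through time
```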

Vision statement for Ecological Applications @ESAApplications journal and ideas for @ESA_org

Philosophy. Know better, do better.

Applied science has an obligation to engender social good. These pathways can include knowledge mobilization, mode-2 scientific production, transparency, addressing the reproducibility crisis in science, promoting diversity and equity through representation, and enabling discovery through both theory and the application of ecological principles. Evidence-based decision making can leverage the work published in Ecological Applications. However, evidence-informed decision making that uses ecological principles and preliminary evidence to springboard ideas and respond more rapidly to global challenges is also needed. Science is not static, and the frame rate of changes and challenges is exceptionally rapid. We cannot always (ever) afford to wait for sufficient, deep evidence, and in ecological applications we need to share what clearly works, what can work, and also what did not work. This is a novel paradigm for publishing in a traditional journal. Innovations in ecology such as more affordable sensor technology, R, citizen science, novel big data streams from the Earth sciences, and team science position us to provide insight-level data and to update data and findings over time. An applied journal need not become fully open access or entirely based on open science practices (although we must strive for these ideals), but it should provide at least some capacity within the journal to interact with policy, decision processes, and dialogue to promote the work published and to advance societal knowledge.

Proposed goals: content

  1. Leverage the ‘communications’ category of publications to hone insights in the field and advance ideas that are currently data limited.
  2. Invite stakeholders and policy practitioners to contribute more significantly through communications that react and respond to evidence and highlight evidence (similar to ‘letters to the editor’) from a constructive, needs-based perspective.
  3. Provide the capacity for authors of article publications to update contributions with a new category of paper entitled ‘application updates’.
  4. Look to other applied science journals such as Cell for insights. This journal for instance includes reviews, perspectives, and primers as contributions. It also has a strong thematic and special issue focus to organize content.
  5. In addition to an Abstract, further develop the public text box model to describe highlights, challenges, and next steps for every article.
  6. Expand the breadth of the ‘open research’ section of contributions to include code, workflows, field methods, photographs, or any other research product that enables reproducibility.
  7. Explore a mechanism to share applications that were unsuccessful or emerging but not soundly confirmed.
  8. Explore a new ‘short synthesis’ contribution format that examines aggregated evidence. This can include short-format reviews, meta-analyses, systematic reviews, evidence maps, descriptions of new evidence sets that support ecological applications or policy, and descriptions of compiled qualitative evidence for a contemporary challenge.

Proposed goals: process

  1. Accelerate handling time (currently, the peer-review process suggests three weeks for referees). Reduce editor review time to two weeks and referee turnaround time to two weeks.
  2. Remove formatting requirements for initial submission.
  3. Remove the cover letter requirement. Instead, include a short form in the ScholarOne submission system with three brief fields wherein the authors explain why a specific contribution is a good fit for this journal and propose its implications.
  4. Allow submission of a single review solicited by the authors. This review must be signed and does not count toward the journal review process, but it can be a brilliant mechanism to inform editor-level review.
  5. Data must be made available at the time of submission. This can be a private link to data or published in a repository with limited access until acceptance. It is so useful to be able to ‘see’ data, literally, in table format to understand how and what was interpreted and presented.
  6. Consider double-blind review.
  7. Develop more anchors or hooks in papers that can be reused and leveraged for policy. This can include specific reporting requirements such as plot/high-level sample sizes (N), total sample sizes of subjects (n), clear reporting of variance, and, where possible, an effect size metric even as simple as the net percent change of the primary intervention or application (see the sketch after this list).
  8. The current offerings are designated by contribution type such as article or letter. However, once viewing a paper, the reader must guess, based on the title, abstract, and keywords, how the paper contributes to application. A system of simple badges could visually signal to readers, and to those who seek to reuse content, what a paper addresses. These badges can be placed above the title alongside the access and licensing designation badges. Categories of badges can include an icon for biome/ecosystem, methods, R or code used, immediately actionable, mode-2 collaboration, and theory.
  9. Expand the SME board further. Consider accept-without-review as a mechanism to fast-track contributions that are critical and most relevant. This would include an editor-only, exceptionally rapid review process.
  10. Engage with ESA, other journals, and community to develop and offer more needs-driven special issues.
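
As a small illustration of the effect-size reporting proposed in point 7, here is a minimal R sketch with hypothetical plot-level summaries (the numbers are invented for illustration), pairing a net percent change with a conventional log response ratio computed with metafor.

```r
library(metafor)

# Hypothetical summaries for one intervention versus control (invented values).
n_plots   <- 20                 # high-level sample size (N) per group
mean_ctrl <- 12.0; sd_ctrl <- 3.1
mean_trt  <- 15.6; sd_trt  <- 3.4

# Net percent change of the primary intervention relative to the control.
pct_change <- 100 * (mean_trt - mean_ctrl) / mean_ctrl   # 30

# Log response ratio and its sampling variance as a conventional effect size.
es <- escalc(measure = "ROM",
             m1i = mean_trt,  sd1i = sd_trt,  n1i = n_plots,
             m2i = mean_ctrl, sd2i = sd_ctrl, n2i = n_plots)
es  # yi = log ratio of means, vi = sampling variance
```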

Landscapes are changing and people are always part of the picture.
Science is an important way of knowing and interacting with natural systems.
Not everything needs fixing.