Innovations for the @ESA_org #ESA2020 annual meeting #FlattenTheCurve #ecology #openscience

The annual meetings of the Ecological Society of America are brilliant. These conventions provide opportunities to network, collaborate, communicate, connect early-career researchers with new collaborators, and, most importantly, to co-learn and connect as a community. We are scientists, citizens, educators, and a group that can effect social good. This year's annual meeting, in Salt Lake City, Utah, is titled ‘Harnessing the ecological data revolution’. However, as citizens, we also have a moral responsibility to #flattenthecurve of a pandemic, and there is uncertainty about the duration of its spread (and of the mitigation strategies proposed), with predictions ranging from six weeks to many months. We have an opportunity to do social good. Even more importantly, we have an incentive to experiment. After all, we are scientists, and large meetings certainly come with benefits but also challenges: navigating a large convention (an app helps), rushing between talks, choosing between concurrent sessions, a high relative carbon footprint and other costs of attendance, and now a potential moral imperative to consider alternatives. The theme of this year's meeting makes it an ideal opportunity to experiment with those alternatives. I propose that we test a new way to meet this year. We can even collect data from all participants on the efficacy of the strategies tested, and thus improve future meetings that reuse some of these approaches, whether face-to-face or distributed. A remote meeting with reduced registration costs, for instance, can provide the following benefits:
a. protect the society from cancellations and the loss of entire fees,
b. increase accessibility for those who could not attend because of net costs or other reasons, and
c. broaden our reach outside our community by making some components of the meeting fully accessible.

Innovations

1. Virtual poster sessions: The Entomological Society of America already does this.
2. Pre-recorded short talks: Reduce all talks to 20 slides and 6 minutes, pre-recorded and posted to an ESA2020 YouTube channel. Allow individuals to post whatever content they are comfortable providing (deck only, deck with audio or notes, deck with video, etc.).
3. Define a common slide deck repository: F1000, Figshare, and others provide a DOI for items. Benefits include citable objects, common tags, and rules for including the session number in the title to enhance discoverability.
4. Each session co-authors a preprint: Identify two leads per contributed oral session/symposium, have all presenters co-author a short preprint for the session, and publish it to EcoEvoRxiv.
5. Provide extemporaneous virtual meetings: It is fantastic to bump into people at meetings. Provide a tool in some format and define mingling times. Every meeting should have this, whether in-person or remote. Health, child care, accessibility, the cost of food, social pressures to meet in bars or other contexts, etc., can all be addressed with this tool. We can be more inclusive.
6. Provide a code of conduct: This needs to be provided for every meeting in every context. Develop one for this meeting, and we can co-learn as a community how to better respect and enable representation and diversity. Consider a volunteer moderator to monitor online discussion and support positive dialogue at all times.
7. Stream live workshops: Use Zoom, Twitch, or any platform to enable and stream live coding, writing, training, and discovery events.
8. Capitalize on social media (appropriately): Use social media more effectively. Pre-designate tags, run an Instagram photo contest drawn from the study work you present, consider a code-snippet-sharing event, and run a cartoon or outreach contest.
9. Hackathons or kaggles: This meeting is about the data revolution. Define and plan a kaggle or two and a few hackathons. We can collaborate and make the presentation process less one-way and fixed, and much more interactive.
10. Publish data now: The ESA can use this meeting to advance the revolution by providing incentives for individuals to share whatever scientific products they can (data, data snippets, code, workflows, syntheses, etc.) at their point in career, degree, or research process.
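The discoverability rule for slide deck titles (item 3 above) could even be checked mechanically before upload. A minimal sketch in shell, where the "MEETING SESSION-NN:" prefix format is a hypothetical example, not ESA policy:

```shell
# Check that a deck title follows an assumed "ESA2020 SESSION-NN: Title" rule.
# The exact prefix convention here is a placeholder the organizers would define.
check_title() {
  if echo "$1" | grep -Eq '^ESA2020 [A-Z]+-[0-9]+: '; then
    echo "OK: $1"
  else
    echo "FIX: $1"
  fi
}

check_title "ESA2020 COS-12: Harnessing the ecological data revolution"
check_title "My talk about drones"
```

A repository form or upload script could run a check like this and prompt presenters to fix titles before a deck goes live.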

Tips for student reports using online conferencing tools

Video and web conferencing tools will be used for upcoming undergraduate and graduate student progress reports at various institutions beginning this week. With my collaborators, graduate students will use Zoom specifically to run their annual progress reports, which comprise a 12-15 minute presentation followed by discussion and questions. Undergraduate thesis researchers will also present their thesis research and field questions to complete their course requirements. Previously, I have tried to understand how best to use these tools, including recordings, to present mini-conference talks, but the goal was primarily rapid, informal communication with limited need for dialogue. Consequently, I have been considering how best to support the team in the next few weeks through more effective use of video and web conferencing. Through trial and error this past week, here are some ideas to consider if you are about to employ similar tools.

Tips

  1. Set up the conferencing session with a buffer of at least 15 additional minutes.
  2. Log in early, and test audio and video. I prefer headphones with mic to avoid reverberated sound. The presenter should also test turning on and off screen sharing.
  3. Check the settings when you set up the meeting. Confirm that you want participants to be able to log in muted and/or with video and in advance of the host. I choose muted entry and allow log in before host just in case this person is late.
  4. Designate a host for a meeting. With Zoom, only the host can record, and this can be handy (at least for the saved chat if not video).
  5. Briefly discuss the rules of conduct for the meeting with all participants (including whether it is recorded or not), and have the host introduce each participant with a brief hello or response to ensure everyone can be seen or heard (and that the settings work).
  6. Mute yourself when not speaking.
  7. Consider turning off video, or at least discuss it, because video can be very distracting to presenters. There is also the possibility that this improves sound quality (apparently participants tolerate poor or no video but not poor audio in meetings).
  8. Use the chat. The host should monitor it throughout the student presentations in case someone needs to flag a connection problem but does not want to interrupt the speaker.
  9. Provide an informal, shortened practice session for students that you host and monitor in advance of the formal process. This is particularly important if the student is being graded.
  10. Test a back-up plan.

@ESA_org #ESA2020 abstract: The even bigger picture to contemporary scientific syntheses

Background/Question/Methods 

Scientific synthesis is a rapidly evolving field of meta-science pivotal to numerous dimensions of the scientific endeavor and to society at large. In science, meta-analyses, systematic reviews, and evidence mapping are powerful explanatory means to aggregate evidence. However, direct compilation of existing primary evidence is also increasingly common as a way to explore the big picture for pattern and process detection, and it is used to augment more common synthesis tools. Meta-analyses of the primary literature can be combined with open data assets reporting the frequency, distribution, and traits of species. Climate, land-use, and other measures of ecosystem-level attributes can also be derived to support literature syntheses. In society, evidence-based decision making is best served through a diversity of synthesis outcomes in addition to meta-analyses and reviews. The hypothesis tested in this meta-science synthesis is that the diversity of tools and evidence used in scientific syntheses has changed in contemporary ecology and environmental sciences to more comprehensively reuse and incorporate evidence for knowledge production.

Results/Conclusions

Case studies and a formal examination of the scope and extent of the literature reporting scientific synthesis as a primary focus in the environmental sciences and ecology were completed. Topically, nearly 700 studies use scientific synthesis in some capacity in these two fields, yet fewer than a dozen formally incorporate disparate evidence to connect related concepts. Meta-analyses and formal systematic reviews number over 5000 publications. Syntheses and aggregations of existing published aggregations are relatively uncommon, at fewer than 10 instances. Reviews, discussions, forums, and notes examining synthesis in these two fields are also frequent, at 2500 offerings. Analyses of contemporary subsets of these publications identified at least three common themes: reuse and reproducibility, effect sizes and strength of evidence, and a comprehensive need for linkages to inform decision making. Specific novel tools used to explore derived data for evidence-based decision making in the environmental sciences and ecology included evidence maps, summaries of lessons, identification of flagship studies that transformed decision making, reporting of sample sizes at many levels to support effect size calculations, and, finally, reporting of a path forward not just for additional research but for application. Collectively, this meta-synthesis demonstrated an increasing capacity for diverse scientific syntheses to inform decision making in the environmental sciences.

Session
Meta-Analysis and Beyond: Applying Big Secondary Data to Environmental Decision-Making

A vision statement describing goals for Ecology @ESAEcology #openscience

Many aspects of the journal Ecology are exceptional. It is a society journal, and that is important. The strength of research, depth of reporting, and scope of primary ecological research that informs and shapes fundamental theory have been profound. None of these benefits need to change. Nonetheless, research that supports the scientific process and engenders discovery can always evolve and must remain fluid. So must the process of scientific communication, including publication through journals. With collaborators and support from NCEAS and a large publishing company, I have participated in meta-science research examining needs and trends in the process of peer review for ecologists and evolutionary biologists, e.g. ‘Behind the shroud: a survey of editors in ecology and evolution’ published in Frontiers in Ecology and the Environment, and work on biases in peer review such as ‘Systematic Variation in Reviewer Practice According to Country and Gender in the Field of Ecology and Evolution’ published in PLOS ONE. In total, we have published 50 peer-reviewed publications describing a path forward for ecology and evolution, in particular with respect to inclusivity, open science, and journal policy. From this work, we have identified at least three salient elements for journals relevant to authors, referees, and editors, and four pillars for the future of scholarly publishing more broadly. The three elements for Ecology specifically are speed, recognition, and fuller, more reproducible reporting. The four pillars are an ecosystem of products, open access, open or better peer review, and recognition for participation in the process.

 

Goals to consider

  1. Rapid peer review with no more than 4 weeks total for first decision.
  2. A 50% editor-driven rejection rate of initial submissions.
  3. Two referees per submission if in agreement (little to no evidence more individuals are required).
  4. Double the 2017 impact factor to ~10 within 2 years and return to a top-10 ranking among the 160 journals listed in the field of ecology.
  5. Further diversify the contributions to address exploration, confirmation, replication, consolidation, & synthesis.
  6. Innovate content offering to encompass more elements of the scientific process including design, schemas, workflows, ideation tools, data models, ontologies, and challenges.
  7. Allow authors to report failure and bias in process and decision making for empirical contributions.
  8. Provide additional novel material from every publication as free content even when behind paywall.
  9. Develop a collaborative reward system for the editorial board that capitalizes on existing expertise and produces novel scientific content, such as editorials, commentaries, and reviews, as outwardly facing products. Include and invite referees to participate in these ‘meta’ papers because reviews are a form of critical and valuable synthesis.
  10. Promote a vision of scientific synthesis in every publication in the Discussion section of reports. Request an effect size measure for reports to provide an anchor for future reuse (i.e. use the criteria proposed in ‘Will your paper be used in a meta‐analysis? Make the reach of your research broader and longer lasting’).
  11. Revise the data policy to require data deposition – at least in some form such as derived data – openly prior to final acceptance but not necessarily for initial submission.
  12. Request access to code and data for review process.
  13. Explore incentives for referees – this is a critical issue for many journals. Associate reviews with Publons or ORCID.
  14. Emulate the PeerJ model of badges and profiles for editors, authors, and referees.
  15. Remove barriers for inclusivity of authors through double-blind review.
  16. Develop an affirmative action and equity statement for existing publications and submissions to promote diversity through elective declaration statements and policy changes.
  17. All editors must complete awareness training for implicit bias. Editors can also be considered for certification awarded by the ESA based on merit of reviewing such as volume, quality of reviews, and service. Recognition and social capital are important incentives.
  18. Develop an internship program for junior scientists to participate in the review and editorial process.
  19. Explore reproducibility through experimental design and workflow registration with the submission process.
  20. Remove cover letters as a requirement for submission.

Outcomes

I value our community and the social good that our collective research, publications, and scientific outcomes provide for society. However, I am also confident that we can do more. Journals and the peer review process can function to illuminate the scientific process, including by addressing issues associated with reproducibility and inclusivity in science. Know better, do better. It is time for scientific journals to evolve, and the journal Ecology can be a flagship for change that benefits humanity at large by informing evidence-based decision making and ecological literacy.

 

Hacking the principles of #openscience #workshops

In a previous post, I discussed the key elements that really stood out for me in recent workshops associated with open science, data science, and ecology. Summer workshop season is upon us, and here are some principles that can be used to hack a workshop. These hacks can be applied a priori as an instructor, or in situ as a participant or an instructor, by engaging with the context from a pragmatic, problem-solving perspective.

Principles

1. Embrace open pedagogy.
2. Use current best practices from traditional teaching contexts.
3. Be learner centered.
4. Speak less, do more.
5. Solve authentic challenges.

Hacks (for each principle)

1. Prepare learning outcomes for every lesson.

2. Identify solve-a-problem opportunities in advance and be open to ones that emerge organically during the workshop.

3. Use no slide decks. This challenges the instructor to engage more directly with the students and participants in the workshop and leaves space for students to shape content and narrative to some extent. Decks lock all of us in; they are appropriate for some contexts, such as conference presentations, but workshops can be more fluid and open.

4. Plan pauses. Prepare your lessons with gaps for contributions.  Prepare a list of questions to offer up for every lesson and provide time for discussion of solutions.

5. Use real evidence/data to answer a compelling question (scale can be limited, approach beta as long as an answer is provided, and the challenge can emerge if teaching is open and space provided for the workshop participants to ideate).

A final hack, which is a more general teaching principle: consider keeping all teaching materials within a single ecosystem that references outwards only as needed. For me, this has become preparing all content in RStudio, knitting it to html, then pushing it to GitHub gh-pages for sharing as a webpage (or site). Participants can then engage with all content, including code, data, and ideas, in one place.
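As a rough sketch of that publishing pipeline (file and directory names like lesson.Rmd and demo-site are placeholders, and the knit step is shown as a comment since it assumes R and rmarkdown are installed):

```shell
set -e
# 1) Render the lesson locally; in practice this is the knit step, e.g.:
#      Rscript -e 'rmarkdown::render("lesson.Rmd", output_file = "index.html")'
#    Here we stand in a placeholder page so the git steps below run anywhere:
mkdir -p demo-site
echo "<html><body>rendered lesson</body></html>" > demo-site/index.html

# 2) Commit the rendered page on a gh-pages branch:
git init -q demo-site
git -C demo-site checkout -q -b gh-pages
git -C demo-site add index.html
git -C demo-site -c user.name=demo -c user.email=demo@example.com \
  commit -qm "publish lesson"

# 3) In a real repository, GitHub then serves the branch as a webpage after:
#      git push origin gh-pages
```

Once the gh-pages branch exists on GitHub, the rendered html is served automatically at the repository's Pages URL, so sharing a lesson is just a render and a push.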