A vision statement describing goals for Ecology @ESAEcology #openscience

Many aspects of the journal Ecology are exceptional.  It is a society journal, and that is important. The strength of research, depth of reporting, and scope of primary ecological research that informs and shapes fundamental theory have been profound.  None of these benefits need to change.  Nonetheless, research that supports the scientific process and engenders discovery can always evolve and must remain fluid.  So must the process of scientific communication, including publication through journals.  With collaborators and support from NCEAS and a large publishing company, I have participated in meta-science research examining needs and trends in the peer-review process for ecologists and evolutionary biologists, e.g. 'Behind the shroud: a survey of editors in ecology and evolution' published in Frontiers in Ecology and the Environment, and work on biases in peer review such as 'Systematic variation in reviewer practice according to country and gender in the field of ecology and evolution' published in PLOS ONE.  In total, we have published 50 peer-reviewed publications describing a path forward for ecology and evolution, particularly with respect to inclusivity, open science, and journal policy.  In doing so, we have identified at least three salient elements for journals relevant to authors, referees, and editors, and four pillars for the future of scholarly publishing more broadly.  The three elements for Ecology specifically are speed, recognition, and fuller, more reproducible reporting.  The four pillars are an ecosystem of products, open access, open or better peer review, and recognition for participation in the process.


Goals to consider

  1. Rapid peer review with no more than 4 weeks total for first decision.
  2. A 50% editor-driven rejection rate of initial submissions.
  3. Two referees per submission if they are in agreement (there is little to no evidence that more individuals are required).
  4. Double the 2017 impact factor to ~10 within 2 years and return to a top-10 ranking among the ~160 journals listed in the field of ecology.
  5. Further diversify the contributions to address exploration, confirmation, replication, consolidation, & synthesis.
  6. Innovate content offering to encompass more elements of the scientific process including design, schemas, workflows, ideation tools, data models, ontologies, and challenges.
  7. Allow authors to report failure and bias in process and decision making for empirical contributions.
  8. Provide additional novel material from every publication as free content, even when the publication itself is behind a paywall.
  9. Develop a collaborative reward system for the editorial board that capitalizes on existing expertise and produces novel scientific content such as editorials, commentaries, and reviews as outward-facing products. Include and invite referees to participate in these ‘meta’ papers because reviews are a form of critical and valuable synthesis.
  10. Promote a vision of scientific synthesis in every publication in the Discussion section of reports. Request an effect size measure for reports to provide an anchor for future reuse (i.e. use the criteria proposed in ‘Will your paper be used in a meta‐analysis? Make the reach of your research broader and longer lasting’).
  11. Revise the data policy to require data deposition – at least in some form such as derived data – openly prior to final acceptance but not necessarily for initial submission.
  12. Request access to code and data for review process.
  13. Explore incentives for referees – this is a critical issue for many journals. Associate reviews with Publons or ORCID.
  14. Emulate the PeerJ model for badges and profiles for editors, authors, and referees.
  15. Remove barriers for inclusivity of authors through double-blind review.
  16. Develop an affirmative action and equity statement for existing publications and submissions to promote diversity through elective declaration statements and policy changes.
  17. All editors must complete awareness training for implicit bias. Editors can also be considered for certification awarded by the ESA based on merit of reviewing such as volume, quality of reviews, and service. Recognition and social capital are important incentives.
  18. Develop an internship program for junior scientists to participate in the review and editorial process.
  19. Explore reproducibility through experimental design and workflow registration with the submission process.
  20. Remove cover letters as a requirement for submission.
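For goal 10, the requested effect size anchor could be as simple as a log response ratio; a minimal sketch in Python, with hypothetical treatment and control means:

```python
import math

def log_response_ratio(mean_treatment, mean_control):
    """Log response ratio (lnRR), a standard effect size in ecological meta-analysis."""
    return math.log(mean_treatment / mean_control)

# Hypothetical example: mean biomass under shrubs vs. in open microsites.
lnrr = log_response_ratio(12.0, 8.0)
print(round(lnrr, 3))  # ln(12/8) = ln(1.5) ≈ 0.405
```

Reporting lnRR (or a standardized mean difference such as Hedges' d) alongside its sample sizes in the Discussion gives future meta-analysts a ready anchor for reuse.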


I value our community and the social good that our collective research, publications, and scientific outcomes provide for society.  However, I am also confident that we can do more.  Journals and the peer-review process can illuminate the scientific process itself, including addressing issues associated with reproducibility and inclusivity in science.  Know better, do better.  It is time for scientific journals to evolve, and the journal Ecology can be a flagship for change that benefits humanity at large by informing evidence-based decision making and ecological literacy.


Ecological network flavors: many-to-many, few-to-many, and few-to-many spatially

Recent conference attendance inspired me to do a quick typology of networks that were presented in various talks. All were done in R using a few different packages.
All were interested in diversity patterns.
None were food webs.


many-to-many: many plant species and many pollinators for instance

few-to-many: mapping the associated set of pollinators to one flowering species

few-to-many spatially: replicated mapping of diversity for one taxon to a single species of another, either nested or spatially contrasted.
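Each flavor corresponds to a different bipartite edge structure. A minimal sketch in Python (the talks used R; all species and site names below are hypothetical) showing how the flavors differ in the number of unique nodes on each side:

```python
# Each network flavor as a bipartite edge list between two sets of taxa.
# All names are hypothetical, for illustration only.

# many-to-many: many plant species and many pollinators
many_to_many = {
    ("plant_A", "bee_1"), ("plant_A", "fly_1"),
    ("plant_B", "bee_1"), ("plant_B", "bee_2"),
}

# few-to-many: one focal flowering species and its associated pollinator set
few_to_many = {("plant_A", p) for p in ["bee_1", "bee_2", "fly_1", "moth_1"]}

# few-to-many spatially: the same focal mapping replicated across sites
few_to_many_spatial = {
    ("site_1", "plant_A", "bee_1"),
    ("site_1", "plant_A", "fly_1"),
    ("site_2", "plant_A", "bee_2"),
}

def flavor(edges):
    """Classify by the number of unique taxa on each side of the edge list."""
    left = {e[-2] for e in edges}
    right = {e[-1] for e in edges}
    return ("few" if len(left) == 1 else "many") + "-to-" + \
           ("few" if len(right) == 1 else "many")

print(flavor(many_to_many))   # many-to-many
print(flavor(few_to_many))    # few-to-many
```

In R, the equivalent structures are typically handled as incidence matrices with packages such as bipartite or igraph.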


Network analyses are amazing. I need to learn more!

Can you also map interactions onto other interactions?



Sharing strategies for #ESA2016 #openscience #scicomm

Meetings are an excellent opportunity to not only communicate your science but secure feedback. I propose the more you give, the more you get.


There are at least five open-science products associated with any contribution (presentation or poster) that you can share with your colleagues and a much wider online audience prior to the meeting.

Open-science products to share for a meeting

  1. The slide deck or poster can be published on SlideShare.
  2. Your data-science workflow, code, and EDA can be published as an R Markdown document on GitHub.
  3. The primary or derived summary data (if you are not ready to go public yet) can be published on figshare (and/or included in GitHub repo).
  4. Most journals accept submissions that have been pre-printed. Consider sharing your draft paper on PeerJ or bioRxiv. Not at that stage? Do a blog post instead.
  5. Record a video abstract to share the main finding of your talk and post to YouTube or Vimeo. This could attract a larger audience to the conference presentation and does an incredibly useful service in communicating science to the public and others that do not attend the conference.


I enjoy the process of science way too much, and I am easily distracted.  How did I do in preparing for my ESA talk this year on microenvironmental change under desert shrubs?  I scored a total of 4 out of 5. Feel free to click on the bolded text below to see materials and provide feedback. Each is absolutely a work in progress, like the experiment itself (we need at least one more year of data). However, I am hoping it is a good time to share ideas now and see if we can do better next year in the field.

Deck on SlideShare

Code on GitHub

Data on GitHub

Video abstract (I went a bit crazy here and did two: field and in-office versions). Very high cheese factor in both (hard to be natural on camera).

Science is a process. Share your steps.





#rstudio #github missing command lines for mac setup @rstudio @github @swcarpentry

Every few months, I try to do a clean install on my machine. I know that OS X Sierra is due out in September, but I elected to do a wipe and clean install now for the remainder of summer.


Wipe, reinstall OS X from USB, a few brief minor hacks/tweaks, then just a few apps including base R and RStudio. I prefer to connect to GitHub without the desktop app and use RStudio directly.

One limitation: I forgot two little things that took forever to get RStudio and GitHub to connect. So, if you are a Mac user too, here is a synopsis.


Most steps are well articulated online.
#open terminal/shell
git config --global user.name "your_username"
git config --global user.email "your_email@example.com"

#missing 1 for macs: tell osx keychain to store password
git config --global credential.helper osxkeychain

#generate SSH RSA key via command line
ssh-keygen -t rsa -C "your_email@example.com"

#alternatively, you can do via rstudio tools/global options/enable version control
#then create RSA key, save, copy, and paste over to your github account online.

#check authentication works
ssh -T git@github.com

#missing 2 for macs: do a command-line push to get the password into osxkeychain
#I tried clone/new repo, make changes, commit, then push, and it failed because no password to push changes to github was stored and rstudio does not talk to the keychain #frustrating
#so make/clone a repo, generate a change, and then do the push from the command line

git push -u origin master

#or, depending on your branch name
git push -u origin gh-pages

#I hope this note-to-self provides you with the missing lines you need to get to your next level too!



The Wardle Test for a #socialmedia #selfie effect in science


‘And I am immortal’ (through social media).
Connor MacLeod (The Highlander).

A recent paper in the journal Ideas in Ecology and Evolution inspired me to rethink and temper my optimism in social media as a panacea for effective scientific communication. The running title of the paper, 'how to tweet your way to honour and glory', by David Wardle captures several primary concerns with altmetrics as a tool to estimate merit, value, or even global reach. We are discussing these ideas at NCEAS today, and as a heuristic, I prepared the following deckumentary (commentary + slide deck). It explores the strengths and limitations of social media as a tool to communicate science and proposes several basic solutions. However, there is an incredible opportunity here to more thoroughly examine how we handle social media as a tool and evaluate its capacity for effective outreach.

One of the proposals in the article that I really enjoyed, and want to emphasize more directly here, is a test of a particular potential limitation: non-independence of outreach from the social-media stream of the producer.  I propose we entitle this test The Wardle Test for a social-media selfie effect in science.

The social-media selfie effect workflow

  1. Select a set of products with different authors but from a common outlet (e.g. a journal).
  2. Structure sampling of products to ensure reproducibility (e.g. regular, random, or stratified-random sampling from the outlet), and ensure author identities are unique in each instance.
  3. Record altmetric scores reported for each product.
  4. Capture twitter-stream for each product.
  5. Assign tweets to product producer (rule: personal twitter account matches first author or organization such as lab) or other (potentially independent twitter account).
  6. Contrast altmetric scores between products tweeted by producers relative to others.
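The six steps above can be sketched end to end on toy data. A minimal Python sketch, where the products, accounts, and scores are all fabricated placeholders (a real version would query the Altmetric API and a Twitter scraper):

```python
from statistics import mean

# Toy records, one per product; all values are hypothetical placeholders.
# 'tweeters' lists the accounts that tweeted the product;
# 'producer' is the first author's (or lab's) account.
products = [
    {"doi": "10.0000/a", "producer": "@lab_a", "score": 40,
     "tweeters": ["@lab_a", "@reader1"]},
    {"doi": "10.0000/b", "producer": "@lab_b", "score": 12,
     "tweeters": ["@reader2"]},
    {"doi": "10.0000/c", "producer": "@lab_c", "score": 35,
     "tweeters": ["@lab_c"]},
]

# Step 5: assign each product to 'self-tweeted' or 'other'.
self_tweeted = [p for p in products if p["producer"] in p["tweeters"]]
other = [p for p in products if p["producer"] not in p["tweeters"]]

# Step 6: contrast mean altmetric scores between the two groups.
selfie_effect = mean(p["score"] for p in self_tweeted) - \
                mean(p["score"] for p in other)
print(selfie_effect)  # 37.5 - 12 = 25.5
```

A positive difference on real data would be consistent with a selfie effect, though a proper test would also need sample sizes large enough to separate self-promotion from simple audience size.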

A fantastic idea as a proxy for the positive and negative ‘echo-chamber’ effect discussed widely online. We need an R script to scrape a larger set of products and associated accounts!

Then, we can calculate not only this social-media selfie effect but also explore some of the contemporary analytical solutions produced online by many ‘influence’ indices, including diversifying the signal analysis, weighting (often by audience), and normalization.

The ‘quickening’ of social media amplification is perhaps not immortal, but it is a challenge and thus opportunity for scientific communicators and critical citizens to better validate and use this effect appropriately.