Scientific progress depends on the accumulation of empirical knowledge via reproducible methodology. Although reproducibility is a central tenet of the scientific method, recent studies have highlighted widespread failures to adhere to this ideal. The goal of this study was to gauge the level of computational reproducibility, or the ability to obtain the same results using the same data and analytic methods as in the original publication, in the field of wildlife science. We randomly selected 80 papers published in the Journal of Wildlife Management and Wildlife Society Bulletin between 1 June 2016 and 1 June 2018. Of those that were suitable for reproducibility review (n = 74), we attempted to obtain study data from online repositories or directly from authors. Forty-two authors did not respond to our requests, and we were unable to obtain data from the authors of 13 additional studies. Of the 19 studies for which we obtained data and completed our analysis, we judged that 13 were mostly or fully reproducible. We conclude that studies with publicly available data, or with data shared upon request, were largely reproducible, but we remain concerned about the difficulty of obtaining data from recently published papers. We recommend increased data sharing, data organization and documentation, communication, and training to advance computational reproducibility in the wildlife sciences.
Bibliographical note
Publisher Copyright:
© 2020 The Authors. The Journal of Wildlife Management published by Wiley Periodicals, Inc. on behalf of The Wildlife Society
- data sharing
- open science
- research methods
- statistical methods