SfN President Steve Hyman
The challenge of scientific rigor is now receiving extensive coverage in scientific and lay media in response to high-profile reports of failures to replicate numerous studies in fields as diverse as cancer biology and social psychology. The rates at which academic studies replicate in industrial settings, especially in preclinical research, have been reported to be far lower than should be expected. This state of affairs has drawn concern from government agencies (including NIH and NSF), journal publishers, and members of Congress. Some colleagues have expressed worries that the attention being paid to scientific rigor carries its own risks of spreading negative views of science; others have pointed to scientific fields that have allegedly spawned “replication vigilantes” who fail to recognize that non-replication is not always a bad sign, but often central to the self-correcting nature of science.
I believe that the Society for Neuroscience, like other major professional organizations, must forthrightly address concerns about scientific rigor and take appropriate steps to support the scientific enterprise and ensure the public’s confidence in our work. To address these concerns effectively, SfN convened a Scientific Rigor Working Group in the fall of 2013 to analyze the situation in our field and make recommendations. This working group is giving the issue the serious attention it deserves and recognizes that its recommendations must be framed constructively. Over the past year and a half, the working group has set in motion several initiatives, including developing a set of research practices for scientific rigor and supporting programming on scientific rigor at SfN’s annual meetings. The actions of SfN, executed in the proactive and constructive manner that the working group has adopted, will strengthen the scientific endeavor and improve public appreciation of science.
Current concerns about the inability to replicate findings must be properly distinguished from willful scientific misconduct. Appropriately, the discussion of rigor within SfN is instead largely focused on improving study design, scientific reporting (not only of data, but also of methods), and the characterization and sharing of reagents. Many concerns about scientific rigor can be addressed by improving the training of our students and postdocs in these areas. However, we also need to address troubling cultural issues, including pressure from academic departments or funding agencies to publish too rapidly, difficulties in obtaining funding for and publishing replication studies, and obstacles to publishing negative findings. Colleagues also report difficulties in obtaining funding, or approval from committees overseeing animal use, for adequately powered studies. These issues will require long-term efforts to address policies that have created complicated webs of “perverse incentives” or otherwise put obstacles in the way of well-designed studies that test worthy hypotheses.
Too often — and obviously with important exceptions — the required courses in responsible conduct of research are taught in a dry manner or treated as unwanted distractions from lab work. In truth, the evolving standards of research and scientific reporting are important and engaging topics that should be central to the education of our students. Trainees should expect — and their programs should provide — appropriate instruction and mentoring in study design and the analysis and interpretation of data.
Individual PIs play a crucial role in this training process by ensuring that their labs model best practices, such as the use of power calculations in designing studies, the use of blinding, and a sophisticated appreciation of the strengths and limitations of statistical tests. While clinical trials and much research involving human subjects require strict attention to study design and significant sophistication in statistics (often with a professional statistician as a member of the team), basic and preclinical research has, as we have been learning, too often treated these matters as secondary.
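To make the first of these practices concrete, here is a minimal sketch of the kind of power calculation a lab might run before committing to a two-group study. It uses the standard normal-approximation formula for a two-sided, two-sample comparison of means; the effect size, alpha, and power values are illustrative assumptions on my part, not recommendations from SfN or the working group.

```python
import math
from scipy.stats import norm

def samples_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample
    comparison of means (normal approximation):

        n = 2 * ((z_{1 - alpha/2} + z_{power}) / d)^2

    where d is the standardized effect size (Cohen's d). Rounds up;
    the exact t-based answer is slightly larger for small samples.
    """
    z_alpha = norm.ppf(1 - alpha / 2)  # critical value for two-sided test
    z_beta = norm.ppf(power)           # quantile for the desired power
    n = 2 * ((z_alpha + z_beta) / effect_size) ** 2
    return math.ceil(n)

# A "medium" standardized effect (d = 0.5) at alpha = 0.05 and 80% power:
print(samples_per_group(0.5))  # → 63 subjects per group (normal approx.)
```

Exact t-based calculators (for example, `TTestIndPower.solve_power` in statsmodels, or G*Power) give a slightly larger answer for the same inputs; the point of the sketch is simply that adequate power for a modest effect often requires far more subjects per group than many preclinical studies enroll.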
NIH and SfN are both developing training programs in experimental design, rigor, and reproducibility. NIH will soon release a series of training modules and is seeking to fund online courses in experimental design. For Neuroscience 2015, the Society is planning a symposium on the topic as well as a new short course.
Training can only accomplish so much, however, if young scientists emerge into an incentive system and culture that push them to hurry their work, skimp on reporting their methods, and publish in a small number of high-profile journals that favor unexpected results — the kind of results least likely to stand the test of time.
The understandable propensity of some scientific journals to emphasize the “exciting” can, when overdone, limit the publication of information critical for replicability, such as methods and detailed information about cell lines and animal models. Also understandable is the challenge many journals face in obtaining expert consultation on design and statistics for every manuscript that needs it. Unless this shortcoming in scientific publishing is addressed, peer reviewers will continue to face pressure to tackle these critically important matters, which may lie well outside their comfort zone.
With the launch of eNeuro, SfN aims to alter some of the troubling patterns in publication. The new SfN open-access journal publishes a wide array of content, explicitly including replication studies and negative studies. In addition, both SfN journals, The Journal of Neuroscience and eNeuro, are committed to full and appropriate reporting of studies. Among their intellectual commitments, both journals have signed on to the NIH-led Principles and Guidelines for Reporting Preclinical Research, which outlines principles to facilitate the “interpretation and repetition” of published experiments.
The global scientific community must also find ways to encourage the earlier and more complete sharing of data and reagents, which is vital for scientific progress. Several efforts are underway in neuroscience to increase data sharing and publishing transparency by disambiguating authors and resources. For example, the Resource Identification Initiative is designed to help researchers sufficiently cite the key resources used to produce scientific findings, and ORCID (Open Researcher and Contributor ID) provides a registry of unique researcher identifiers to link individuals and their research activities. Both SfN journals have joined the ORCID system, and The Journal of Neuroscience participated in a three-month pilot program for the Resource Identification Initiative.
A Matter of Principle
While issues of rigor affect the entire scientific community, we in neuroscience often deal with the most complex systems and datasets. Attention to improving study designs, transparency, and sharing of tools and data, along with shared commitments to alter perverse incentives and change culture for the better, is something we owe to our field and especially to our students.