Effort to repeat key cancer biology experiments reveals challenges and opportunities to improve replicability

Dec. 7, 2021

Charlottesville, VA — A large-scale, systematic investigation to replicate high-impact preclinical cancer biology experiments identified barriers to conducting replications and found weaker evidence for the findings than the original experiments reported. Unnecessary friction in the research process may be slowing the advancement of knowledge, solutions, and treatments.

Today, eLife publishes the final outputs of the Reproducibility Project: Cancer Biology, an 8-year effort to replicate experiments from 53 high-impact papers published between 2010 and 2012. Tim Errington, Director of Research at the Center for Open Science and project leader, said: “The purpose of the project was to transparently assess the extent to which there are challenges for conducting replications and obtaining similar evidence of published findings in cancer biology research.”

Launched in 2013, the Reproducibility Project: Cancer Biology was a collaboration between the Center for Open Science, a nonprofit culture change organization with a mission to improve openness, integrity, and reproducibility of research, and Science Exchange, the world’s first online R&D marketplace, whose mission is to accelerate scientific discovery. With support from Arnold Ventures (formerly the Laura and John Arnold Foundation), the team used a systematic process to select high-impact cancer research papers published between 2010 and 2012. Based on the selection criteria, most of the papers came from high-profile journals such as Nature, Science, and Cell. A total of 193 experiments were selected for replication.

The team designed replication experiments for key findings from each paper by reviewing the methodology and requesting information about protocols and the availability of reagents. Appropriate expertise for conducting the experiments was then sourced through the Science Exchange marketplace.

For each paper, the detailed protocols for the replication experiments were written up as a Registered Report and submitted to eLife for peer review; work on the replication experiments could not begin until the Registered Report had been accepted for publication. The completed replication experiments were then written up as a Replication Study, which was peer reviewed and published in eLife. Two of the papers published today are capstone summaries of the entire project.

The first paper, “Challenges for Assessing Replicability in Preclinical Cancer Biology”, reports on the challenges confronted when preparing and conducting replications of 193 experiments from 53 papers. None of the experiments were described in sufficient detail to design a replication without seeking clarifications from the original authors. Some authors (26%) were extremely helpful and generous with feedback, while others (32%) were not at all helpful or did not respond to requests. During experimentation, about two-thirds of the experiments required some modification to the protocols because, for example, model systems behaved differently than originally reported. Ultimately, 50 replication experiments from 23 papers were completed, about a quarter of those planned. Errington noted: “We had challenges at every stage of the research process to design and conduct the replications. It was hard to understand what was originally done, we could not always get access to the original data or reagents to conduct the experiments, and model systems frequently did not behave as originally reported. The limited transparency and incomplete reporting made the efforts to replicate the findings much harder than was necessary.”

The second paper, “Investigating the Replicability of Preclinical Cancer Biology”, reports a meta-analysis of the results of the 50 replication experiments that were completed. Many of these experiments involved measuring more than one effect (e.g., the influence of an intervention on both tumor burden and overall survival), and the 50 completed experiments included a total of 158 effects. Most of these effects (136) were reported as positive effects in the original papers, with 22 reported as null effects. The meta-analysis also had to take into account that 41 of the effects were reported as images rather than as numerical values in the original papers. The replications provided much weaker evidence for the findings than the original experiments: for original positive results, replication effect sizes were, on average, 85% smaller than the original effect sizes.
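To make the headline comparison concrete, the following minimal sketch shows one way an average reduction in effect size can be computed from paired original and replication effects. The effect-size values are invented for illustration; this is not the project’s data or analysis code, and the published meta-analysis used more sophisticated methods.

```python
# Illustrative sketch only: invented effect-size pairs, not the project's data.
# Each pair is (original effect size, replication effect size) for one effect.
pairs = [
    (1.20, 0.18),
    (0.80, 0.10),
    (0.95, 0.16),
]

# Proportional reduction for each pair: 1 - replication/original.
reductions = [1 - rep / orig for orig, rep in pairs]
average_reduction = sum(reductions) / len(reductions)

print(f"average reduction in effect size: {average_reduction:.0%}")
# -> "average reduction in effect size: 85%" for these invented numbers
```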

The team also used a number of binary criteria to assess whether each replication was successful. A total of 112 effects could be assessed on five of these criteria: 18% succeeded on all five, 15% on four, 13% on three, 21% on two, 13% on one, and 20% failed on all five. Collectively, 46% of the replications succeeded on more criteria than they failed, and 54% failed on more criteria than they succeeded.
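The 46%/54% split follows directly from those counts: a replication succeeded on more criteria than it failed when it met at least three of the five. Here is a minimal sketch of that tally using the percentages reported above; the grouping code is ours, not the project’s.

```python
# Percentage of the 112 assessable effects by number of criteria met (from the text).
pct_by_criteria_met = {5: 18, 4: 15, 3: 13, 2: 21, 1: 13, 0: 20}

# "Succeeded on more criteria than failed" means meeting 3, 4, or 5 of the 5.
mostly_succeeded = sum(pct for met, pct in pct_by_criteria_met.items() if met >= 3)
mostly_failed = sum(pct for met, pct in pct_by_criteria_met.items() if met < 3)

print(mostly_succeeded, mostly_failed)  # -> 46 54
```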

Summarizing, Errington noted: “Of the replication experiments we were able to complete, the evidence was much weaker on average than the original findings, even though all the replications underwent peer review before the experiments were conducted to maximize their quality and rigor. Our findings suggest that there is room to improve replicability in preclinical cancer research.”

Brian Nosek, Executive Director of the Center for Open Science and a co-author, added: “Science is making substantial progress in addressing global health challenges. The evidence from this project suggests that we could be doing even better. There is unnecessary friction in the research process that is interfering with advancing knowledge, solutions, and treatments. Investing in improving transparency, sharing, and rigor of preclinical research could yield huge returns on investment by removing sources of friction and accelerating science. For example, open sharing of data, materials, and code will make it easier to understand, critique, and build upon each other’s work. And, preregistration of experiments and analysis plans will reduce the negative effects of publication bias and distinguish between planned tests and unplanned discoveries.”

These papers identify substantial challenges for cancer research, but they occur amid a reformation in science to address dysfunctional incentives, improve the research culture, increase transparency and sharing, and improve rigor in design and conduct of research. Science is at its best when it confronts itself and identifies ways to improve the quality and credibility of research findings. The Reproducibility Project: Cancer Biology is just one contribution in an ongoing self-examination of research practices and opportunities for improvement.

Supporting Information

Additional supporting information about the project is available on OSF. This includes a fact sheet, background information, a list of independent researchers who have agreed to be listed as possible contacts for interviews, and a guide with links for navigating the content of the RP:CB papers, reviews, and supporting information.

The previously published Registered Reports, Replication Studies, and related commentaries are all available on eLife’s Reproducibility Project: Cancer Biology Collection page, and all data, code, and supporting materials are available in COS’s Reproducibility Project: Cancer Biology Collection. Summary information about the project and links to key resources are also available at cos.io/rpcb.


About Center for Open Science

Founded in 2013, COS is a nonprofit technology and culture change organization with a mission to increase openness, integrity, and reproducibility of scientific research. COS pursues this mission by building communities around open science practices, supporting metascience research, and developing and maintaining free, open source software tools, including the Open Science Framework (OSF). For more information, visit cos.io.

About Science Exchange

Founded in 2011 with the goal of accelerating scientific discovery, Science Exchange is an online marketplace powering scientific outsourcing for the R&D industry – providing companies with instant access to scientific services from a trusted network of contract research organizations. Science Exchange's R&D marketplace simplifies scientific outsourcing and eliminates contracting delays so scientists can access innovation without the administrative burden. Since 2011, Science Exchange has raised more than $70 million from Norwest Venture Partners, Maverick Ventures, Union Square Ventures, Collaborative Fund, Windham Ventures, OATV, the YC Continuity Fund, and others. For more information, visit scienceexchange.com.

About eLife

eLife is a non-profit organisation created by funders and led by researchers. Our mission is to accelerate discovery by operating a platform for research communication that encourages and recognises the most responsible behaviours. We review selected preprints in all areas of biology and medicine, while exploring new ways to improve how research is assessed and published. eLife receives financial support and strategic guidance from the Howard Hughes Medical Institute, the Knut and Alice Wallenberg Foundation, the Max Planck Society and Wellcome. Learn more at elifesciences.org/about.

About Arnold Ventures

Arnold Ventures is a philanthropy dedicated to tackling some of the most pressing problems in the United States. We invest in sustainable change, building it from the ground up based on research, deep thinking, and a strong foundation of evidence. We drive public conversation, craft policy, and inspire action through education and advocacy. For more information, visit arnoldventures.org.


Media Contact Information

  • Tim Errington, tim@cos.io; Project leader, Director of Research at the Center for Open Science. Contact for questions about any aspect of the project, challenges of conducting replications, the meta-analysis of replication outcomes, individual replication experiments and papers, and broader implications of this research for rigor, replicability, and the research culture.
  • Elizabeth Iorns, elizabeth@scienceexchange.com; CEO at Science Exchange. Contact for questions about sourcing of laboratories to conduct experiments, and broader implications of this research for rigor, replicability, and the research culture.
  • Brian Nosek, nosek@cos.io; Executive Director at the Center for Open Science. Contact for questions about challenges of conducting replications, the meta-analysis of replication outcomes, and broader implications of this research for rigor, replicability, and the research culture.
