Assessing Research Practice

Research on research practices to identify where greater openness, rigor, and transparency can strengthen the research lifecycle.

We assess research practices to understand the strengths and weaknesses of the research process and culture, and to identify opportunities for innovation to improve rigor and accelerate discovery via lifecycle open science—research that is more open, transparent, and connected across the research lifecycle.

Featured Projects

Systematizing Confidence in Open Research and Evidence (SCORE)

SCORE assessed the credibility of research claims through large-scale reproducibility, robustness, and replication tests of hundreds of published findings from the social and behavioral sciences, alongside human and machine assessments of replicability. The collection, to be published in Nature in 2026, provides a broad empirical look at how research claims fare under different forms of scrutiny and what scalable assessment methods may contribute to evaluating trustworthiness.

Replicability Project: Health Behavior

A large-scale, crowdsourced project examining the replicability of published findings in health research. The project aims to assess the robustness of findings and promote best practices in methodological rigor, transparency, and reproducibility, building a more trustworthy foundation for health research.

Ongoing Activities

These ongoing activities reflect current areas of work, including active research, pilots, and partnerships designed to generate evidence, test new approaches, and inform future innovation.

Benchmarking LLM Agents Doing Science

Developing a framework for evaluating LLM agents' ability to perform components of the research process, and exploring the opportunities and limitations of LLM agents in supporting rigorous and trustworthy research practices.

Open Scholarship Survey (OSS)

An open, modular survey assessing attitudes, behaviors, perceptions, and barriers related to adopting open science practices. We administer the survey and provide reports for research-supporting organizations to understand the state of practice in their communities and identify opportunities for improving engagement with open science.

OSF as a Data Source for Metascience Research

OSF is a rich source of data on research behavior beyond the published article. Metascientists can use its public API to investigate the research process.
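As an illustration, the sketch below shows one way a metascientist might pull public project metadata through the OSF API. It is a minimal sketch, assuming the v2 nodes endpoint and JSON:API-style pagination; the specific filter and paging parameter names are assumptions and should be checked against the OSF developer documentation.

```python
# Minimal sketch: querying the OSF public API (v2) for public project metadata.
# Endpoint, filter, and paging parameters are assumptions based on the JSON:API
# conventions the OSF API follows; verify against the OSF developer documentation.
import requests

BASE_URL = "https://api.osf.io/v2/nodes/"

def fetch_public_nodes(filter_tag="replication", max_pages=2):
    """Collect basic metadata for public OSF projects, following pagination links."""
    records = []
    url = BASE_URL
    params = {"filter[tags]": filter_tag, "page[size]": 10}  # assumed parameter names
    for _ in range(max_pages):
        resp = requests.get(url, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        for node in payload.get("data", []):
            attrs = node.get("attributes", {})
            records.append({
                "id": node.get("id"),
                "title": attrs.get("title"),
                "date_created": attrs.get("date_created"),
                "public": attrs.get("public"),
            })
        url = payload.get("links", {}).get("next")  # JSON:API pagination link
        params = None  # the "next" link already carries the query string
        if not url:
            break
    return records

if __name__ == "__main__":
    for rec in fetch_public_nodes():
        print(rec["id"], rec["title"])
```

A workflow like this could, for example, feed into analyses of how often projects share data or preregistrations over time.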

Archived Activities

These archived activities reflect a longer portfolio of metascience research examining credibility, transparency, and innovative methods across research fields. While these projects are no longer active, they continue to inform COS's current research, products, and solutions.

Reproducibility Project: Cancer Biology

Replications of 193 experiments from 53 high-impact preclinical cancer biology papers, published in eLife in 2021. The replication meta-analysis (Errington et al., 2021a) and an account of the challenges of conducting replications (Errington et al., 2021b) are accompanied by a journal collection of papers and an OSF collection of data, materials, and code.

Reproducibility Project: Psychology

Crowd-sourced replications of 100 published findings in psychology, published in Science (Open Science Collaboration, 2015). The paper (including an open access version) and the data, materials, and code are available.

Many Labs

Crowd-sourced replications of the same findings across several laboratories to assess heterogeneity in results and potential explanations for variation in replicability. Papers and data are available for each: Many Labs 1 (Klein et al., 2014), Many Labs 2 (Klein et al., 2018), Many Labs 3 (Ebersole et al., 2016), Many Labs 4 (Klein et al., 2022), and Many Labs 5 (Ebersole et al., 2020).

Many Analysts

Twenty-nine independent teams analyzed the same dataset to answer the same research question, assessing heterogeneity in findings due to analytic decisions. Published in Advances in Methods and Practices in Psychological Science (Silberzahn et al., 2018).

Preprints Assessment

Survey of 3,759 respondents from across research fields to identify attitudes, perceptions, and reported actions related to preprints. Published in Royal Society Open Science (Soderberg et al., 2020).