Systematizing Confidence in Open Research and Evidence
The SCORE project is assessing the credibility of published social-behavioral science claims and developing methods to better assess credibility. An overview of the project goals and design is available as a preprint. Started in 2019, the project is in its final phase and recruiting collaborators to conduct reproduction, robustness, and replication studies of social-behavioral science findings.
Visit the Reproduction, Robustness, and Replication tabs now to learn how you can contribute to this project. Contributors are part of one of the largest collaborations in the history of social-behavioral research, receive financial awards for their contributions, and are eligible to be co-authors on the summary reports of these projects.
We are recruiting researchers and data analysts across the social-behavioral sciences for reproduction projects that use existing, original data. You will select an original finding or multiple findings to reanalyze, receive or find original data to test the original claim(s), plan and preregister the reanalysis, submit your preregistration to verify the approach with the project coordinators, and report your findings in a structured format. You will receive $100-$4,800 for each reproduction study, and you will also be eligible for co-authorship on the report of all reproduction studies.
Joining the project is easy!
This spreadsheet contains all of the available projects that do not require additional human subjects review. The tabs correspond to different project categories that are explained below. If you would like to join: review the eligible projects and fill out the linked commitment form for the project(s) of interest.
Thanks so much for your collaboration on SCORE!
More detailed information follows:
(1) Project sign-ups
Individuals or teams will reanalyze original data to reproduce the original findings. Data and code may be obtained by the COS team from the authors of the selected study to facilitate a push-button reproduction (PBR). If the author provides data but not usable code, analysts may conduct an author-data reproduction (ADR). If the dataset used in the original paper is inaccessible, reproduction analysts will acquire data from the original source and reconstruct the dataset to conduct a source-data reproduction (SDR).
Review available studies in this sign-up sheet and then complete a form to confirm interest and timeline feasibility. Sheets with the labels PBR, ADR, SDR, or Repro contain the available reproduction projects. The commitment form corresponding to each project is linked directly in the sign-up spreadsheet. Prospective analysts should ensure that they will have access to the necessary data before submitting a commitment form.
(2) Preregistration design and check-in
Reproduction studies use this preregistration template to define criteria for a successful reproduction. Once the criteria have been defined, reproduction preregistrations will be reviewed by COS. The primary goal of preregistration is to designate criteria for assessing reproduction success in advance of reanalyzing the data. The preregistration check-in is intended to assess coherence among the data, the claim(s) being reproduced, and the proposed success criteria for the claim(s). Once approved, preregistrations will be registered on the OSF before the analysis takes place.
(3) Completing the reproduction
Following preregistration check-in and OSF registration, reproduction teams will reanalyze the original data, documenting their procedure from start to finish using a well-commented analysis script and narrative transparency trail that will also serve as the final report. To promote transparency, teams will upload their final preregistration, materials, data, analysis scripts, output files, and transparency trail documents to a shared OSF project that will be made public at the conclusion of the SCORE program.
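Purely as an illustration of what such a script could contain (this is not an official SCORE template), the sketch below re-runs a hypothetical focal regression on the original data and checks it against the originally reported coefficient; the file name, variable names, reported value, and success thresholds are all invented for the example.

```python
"""Hypothetical reproduction script: re-run the focal analysis on the original
data and compare the result to the originally reported estimate."""
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file and variable names; the real ones come from the original study.
data = pd.read_csv("original_study_data.csv")

# Re-estimate the focal model exactly as described in the original paper.
model = smf.ols("outcome ~ treatment + age + gender", data=data).fit()
reproduced_beta = model.params["treatment"]
reproduced_p = model.pvalues["treatment"]

# Coefficient reported in the original paper (illustrative number).
original_beta = 0.42

# Example preregistered success criteria: same sign, significant at alpha = .05,
# and the reproduced coefficient within 10% of the reported value.
same_sign = (reproduced_beta * original_beta) > 0
significant = reproduced_p < 0.05
close_enough = abs(reproduced_beta - original_beta) <= 0.10 * abs(original_beta)

print(f"Reproduced beta = {reproduced_beta:.3f} (p = {reproduced_p:.4f})")
print("Reproduction criteria met:", same_sign and significant and close_enough)
```

The actual success criteria are whatever the team preregistered for the claim in question; the point of the sketch is only that every analytic step and decision is visible in the script itself.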
(4) Study funds
The incentive for each project is based on project type and priority level. Reproduction analysts will receive between $100 and $4,800 for each reproduction and eligibility for co-authorship on the final report about SCORE reproductions that COS will spearhead. Those interested may review the incentive structure in this spreadsheet. More information regarding each project type, priority designation, and payment is available in the link to the Analyst Agreement in the sign-up sheet or on this OSF page.
(5) Detailed expectations and timeline
For more information regarding the SCORE reproductions workflow, please review this document. Any questions may be directed to scorecoordinator@cos.io.
We are recruiting researchers and data analysts across the social-behavioral sciences for replication projects that use existing data that was not part of the original study. For these projects, you will select an original finding to replicate, receive or find alternative data to test the original claim, plan the preregistration of your analysis, receive peer review of your plan, and report your findings in a structured format. You will receive $3,000-$7,800 for each replication study, and you will also be eligible for co-authorship on the report of all replication studies.
Joining the project is easy!
Review the eligible projects using this form, and fill out the linked commitment form for those projects you are interested in.
Thanks so much for your collaboration on SCORE!
More detailed information follows:
(1) Project sign-ups
Individuals or teams will use existing data not used in the original study to analyze a focal claim or set of focal claims. We call these types of projects “Data Analytic Replications” or “DARs”. Data may be obtained and shared with you by the COS team, or you may need to find relevant data not used in the original paper to test the focal claim(s). Replications may also include generating new analytic code to test the focal claim(s) in the same manner as the original paper.
Review available studies in this sign-up sheet and then complete a form to confirm interest and timeline feasibility. Sheets with the label “DAR” contain available replication projects. The commitment form corresponding to each project is linked directly in the sign-up spreadsheet. Prospective analysts should ensure that they will have access to the necessary data before submitting a commitment form.
(2) Preregistration design, peer review, and check-in
Replication studies use this preregistration template to define the methods and materials used and the criteria for a successful replication. Once complete, the preregistration will be peer reviewed. The primary goal of preregistration is to ensure the project represents a high quality, good faith attempt at replicating the original focal claim(s), and that the proposed data and planned analysis strategy are adequate. The preregistration check-in is intended to assess coherence among the data, the claim(s) being replicated, and the proposed success criteria for the claim(s). Once approved, the preregistration is registered on OSF before starting the replication.
(3) Completing the replication
Replication teams will analyze the new data according to their preregistered plan. Once complete, analysts will fill out a final report template that will include the statistical results and documentation of the materials used. All materials, including data and analytic code, will be stored and shared on the OSF project when possible.
(4) Study funds
The incentive for each project is based on the project type and priority level. Replication analysts will receive between $3,000 and $7,800 for each replication project and eligibility for co-authorship on the final report about SCORE replications that COS will spearhead. Those interested may review the incentive structure in this spreadsheet. More information regarding each project type, priority designation, and payment is available in the link to the Analyst Agreement in the sign-up sheet or on this OSF page.
(5) Detailed expectations and timeline
For more information regarding the SCORE replications workflow, please review this document. Any questions may be directed to scorecoordinator@cos.io.
What is the Multi100 project?
The Multi100 project is a crowdsourced empirical project aiming to estimate how robust published results and conclusions in the social and behavioral sciences are to analysts’ analytical choices. With more than 200 researchers involved around the world, we will investigate whether
- different analysts arrive at the same conclusions as the author of the original study
- different analysts arrive at the same effect estimates as the author of the original study
To answer these questions, 100 empirical studies published in different disciplines of the social and behavioral sciences will be re-analyzed by independent researchers. The acceptability of the re-analyses will be judged in a round of peer evaluations. The results of the analyses will be compared in terms of the direction and magnitude of the effect. More information about the methods is available.
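As an informal sketch of that comparison (not the project's actual scoring procedure), one might check each re-analysis against the original estimate for agreement in sign and rough closeness in magnitude; the effect values and the 0.2 threshold below are assumptions made up for illustration.

```python
# Hypothetical comparison of re-analysis effects with an original effect.
# All effects are assumed to be on a common standardized scale (e.g., Cohen's d).
original_effect = 0.35                          # illustrative value
reanalysis_effects = [0.31, 0.48, -0.05, 0.22]  # illustrative co-analyst estimates

for i, effect in enumerate(reanalysis_effects, start=1):
    same_direction = (effect * original_effect) > 0
    # Illustrative magnitude criterion: within 0.2 standardized units of the original.
    similar_magnitude = abs(effect - original_effect) <= 0.2
    print(f"Re-analysis {i}: same direction = {same_direction}, "
          f"similar magnitude = {similar_magnitude}")
```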
Why should you join?
Beyond advancing our understanding of research practices in the social and behavioral sciences, all co-analysts become co-authors of the final publication if they 1) submit their analyses for two papers before the deadline, 2) have at least one of their analyses pass the peer evaluation, 3) adhere to confidentiality requirements, and 4) review and approve the manuscript in time.
Furthermore, co-analysts and peer evaluators get monetary compensation for each of their submitted analyses and evaluations.
How can you join as a co-analyst and/or a peer evaluator?
Anyone with experience conducting statistical analyses, reporting results, and documenting their work in code can join the project. If there are more eligible volunteers than available co-analyst roles, we will select collaborators on a first come, first served basis.
Please fill out this application form if you are interested in joining the project.
What will be your task?
Co-analysts
Your task will be to choose two papers from a database provided by COS and conduct an independent hypothesis-testing analysis on the data to answer the underlying scientific question of one selected claim from each paper. As a second step, we ask you to provide a single statistical result based on our further instructions. Then, you should upload the re-analysis code to an OSF folder and fill out a post-analysis survey with your results. Finally, you will be asked to review and approve the final manuscript of this project.
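The post-analysis survey and our further instructions specify exactly which statistic to report; purely to illustrate reducing a re-analysis to a single standardized number, the sketch below converts a hypothetical independent-samples t statistic into Cohen's d (all numbers are invented for the example).

```python
import math

# Hypothetical re-analysis output: a two-sample t statistic and the group sizes.
t_stat, n1, n2 = 2.41, 120, 115  # illustrative numbers only

# Cohen's d for an independent-samples comparison, derived from the t statistic.
cohens_d = t_stat * math.sqrt(1 / n1 + 1 / n2)
print(f"Single reported result (Cohen's d): {cohens_d:.3f}")
```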
Peer evaluators
The overarching goal of the peer evaluation phase is to reach a binary determination of whether or not a re-analysis constitutes an acceptable path, within the universe of potential analyses, that allows the claim of interest to be addressed.
Early 2022 - Application for the co-analysts
Apply for the project by filling out this application form.
The project management, data, and infrastructure will be provided by the Center for Open Science. In case you have any questions regarding the project, please contact us via multi100@cos.io.
The project is led by:
Chris Chartier
Director, Psychological Science Accelerator
Associate Professor, Psychology, Ashland University
Nick Fox
Research Scientist, Center for Open Science
Andrew Tyner
Research Scientist, Center for Open Science
Zachary Loomas
Project Coordinator, Center for Open Science
Olivia Miske
Project Coordinator, Center for Open Science
Bri Luis
Project Coordinator, Center for Open Science
Krystal Hahn
Project Coordinator, Center for Open Science
Priya Silverstein
Research Consultant, Independent Researcher
Simon Parsons
Data Manager, Center for Open Science
Titipat Achakulwisut
Lecturer, Mahidol University
Daniel Acuna
Assistant Professor, Information Studies, Syracuse University
Beatrix Arendt
Program Manager, Center for Open Science
Tim Errington
Director of Research, Center for Open Science
Brian Nosek
Executive Director, Center for Open Science
Professor, Psychology, University of Virginia
Center for Open Science
210 Ridge McIntire Road
Suite 500
Charlottesville, VA 22903-5083
Email: contact@cos.io