Don Moore studies overconfidence in its many forms—including when people overestimate their abilities, their knowledge, or how they compare to others. This type of bias can show up everywhere from daily decision-making to published research findings. In the world of academic research, where misplaced certainty can derail progress, Moore is a strong advocate for transparency in conducting and sharing his work.
A professor and the Lorraine Tyson Mitchell Chair in Leadership and Communication at the Haas School of Business at UC Berkeley—and “only occasionally overconfident,” as his professional bio humorously notes—Moore is a longtime proponent of rigorous, transparent research practices. His use of the Open Science Framework (OSF) reflects that commitment, helping Moore and his collaborators share preregistrations, data, materials, and code to make their findings as robust and trustworthy as possible.
A healthy measure of transparency may be the best way to keep our collective overconfidence in check—whether we’re conducting research or just navigating everyday life. In this Q&A, Moore discusses the role of OSF in supporting his work, shares new approaches for preregistering analyses on rich field datasets, and offers advice to researchers just starting out in open science. His thoughtful reflections highlight how platforms like OSF can drive cultural change and improve the quality, reliability, and reach of scholarship across disciplines.
Q: Can you tell us about your research interests and any current projects you’re working on?
A: I am obsessed with the study of overconfidence, in its many forms:
I am working on revising a paper (with Mitchell Chan) that analyzes published estimates of physical constants and finds that those estimates are too precise, implying excess certainty.
I just published a paper (with Sophia Li and Randall Hale) that tests the common intuition that some people are consistently more overconfident than others; that is, that overconfidence operates as a personality trait. Our results do not support this intuition.
A recent publication (with Sandy Campbell) of which I am particularly proud identifies overprecision in the Survey of Professional Forecasters. What makes me especially proud of this project is the development of new approaches for the preregistration of analyses on rich field datasets. Our OSF site was critical for implementing these new approaches.
Q: As a frequent OSF user, which features or tools have been most valuable in supporting your research workflow?
A: The ability to build a public online repository is invaluable. I use OSF to share preregistrations, data, materials, and code. This sort of open science was not really possible before OSF.
Q: Your OSF projects, like this one, demonstrate how preregistrations, study materials, and published findings can be effectively linked in a single hub. How do you approach organizing these components, and are there practices you’d recommend to others?
A: Ordinarily, each study gets its own project designation. That project contains the study’s preregistration and study materials. We aspire to also preregister cleaning and analysis code but are rarely organized enough to do so. When the data come in, we can post the raw (anonymized) data file. Then, once we have analyzed the data and written them up, we can post the draft, the code, and the cleaned data files. We lean toward making projects public immediately, since it often proves useful to share the pieces of a project with members of our research team or with colleagues.
Q: What initially sparked your interest in open science, and why do you believe these practices are important or beneficial?
A: I saw the pernicious effect of the proliferation of false-positive research results in the literature. Our field lost credibility as key findings failed to replicate and we realized that we had built literatures entirely on sand. Doing better required that we, as a community of scholars, hold each other to higher standards of rigor, reporting, and replicability. I am delighted that the field is meeting that challenge, aided by resources like OSF.
Q: What advice would you offer to researchers who are new to OSF or just getting started with open science?
A: Don’t be intimidated by the rich diversity of features on OSF. Figure out how to do what you need.
Q: How do you envision open science practices and platforms like OSF shaping the future of research in your field?
A: I am hopeful that OSF will continue to evolve, with initiatives like the Lifecycle Journal, to bring us closer to scientific utopia. In that utopia, we move past antiquated systems of evaluation and distribution to a better future in which high-quality science is more reliable, more replicable, and more widely accessible.