FWF — Austrian Science Fund
How biased is the literature in Psychological Science?

Jakob Pietschnig (ORCID: 0000-0003-0222-9557)
  • Grant DOI 10.55776/P31002
  • Funding program Principal Investigator Projects
  • Status ended
  • Start February 1, 2018
  • End November 30, 2021
  • Funding amount € 291,987
  • Project website

Disciplines

Other Social Sciences (50%); Psychology (50%)

Keywords

    Open Science, Reproducibility, Dissemination Bias, Scientific Integrity, Decline Effect, Research Synthesis

Abstract

The confidence crisis in psychological science has by now even attracted the attention of the popular media. In particular, scandals involving data fabrication, such as the prominent research fraud case of Diederik Stapel in the Netherlands, have received considerable public attention. Although such cases attract the most attention, other mechanisms in the scientific process pose a considerably stronger threat to the validity of empirical research results. Non-replicable findings, voodoo correlations, and zombie theories permeate the scientific literature and often lead to the adoption of spurious results. This not only harms the scientific process but can also have dramatic real-world implications. For instance, a large-scale meta-analysis of randomized controlled trials on the effectiveness of certain clinical interventions showed that initial effect estimates and early published studies reported stronger average treatment effects than subsequently published studies. Such effect overestimations can be observed even when studies have been preregistered, as is typically the case in clinical intervention research. The consequences of effect inflation are exacerbated by the fact that strong, surprising, hypothesis-conforming, and significant results are published more often, more quickly, and more visibly (i.e., in journals with higher impact factors), and are in turn cited more frequently than smaller (i.e., typically more realistic) effect estimates. As a result, false and inflated effects are often prominently communicated in the literature. The scientific community's increased awareness of these problems has led to considerable efforts, particularly in recent years, to increase the transparency and replicability of empirical studies.
The present project aims to extend these efforts and to contribute to estimating the prevalence and strength of bias in the empirical literature. To this end, we intend to assess, extract, and reanalyze the data of all published meta-analyses from five of the most authoritative journals in psychology. By applying standard and specialized methods of research synthesis, we plan to achieve five goals: (1) assess the average strength of effect inflation in initial publications compared to meta-analytic summary effects; (2) calculate average annual effect declines; (3) assess moderating influences of study characteristics as well as the visibility and authority of the journal in which initial studies were published; (4) estimate the prevalence and strength of effect misrepresentation in the literature; and (5) provide estimates of the prevalence of dissemination bias based on seven modern methods for bias detection, and compare the results with originally reported bias estimates. Our results will inform authors, reviewers, and readers alike about the potential evidential value of initially published novel findings and provide a reasonable estimate of the effect changes to be expected over time.
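The computations behind goals 1 and 2 can be sketched as follows. This is a minimal illustration with invented toy numbers, not the project's actual analysis pipeline: it uses the standard DerSimonian–Laird random-effects estimator for the meta-analytic summary and a precision-weighted regression of effect size on publication year for the annual decline.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects summary effect via the DerSimonian-Laird estimator."""
    w = 1.0 / variances                        # fixed-effect weights
    mu_fe = np.sum(w * effects) / np.sum(w)    # fixed-effect mean
    q = np.sum(w * (effects - mu_fe) ** 2)     # Cochran's Q
    k = len(effects)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)         # between-study variance
    w_re = 1.0 / (variances + tau2)            # random-effects weights
    return np.sum(w_re * effects) / np.sum(w_re)

# Toy data: the first study reports an inflated effect, later ones shrink.
effects = np.array([0.80, 0.45, 0.40, 0.35, 0.30])
variances = np.array([0.04, 0.02, 0.02, 0.01, 0.01])
years = np.array([2000, 2003, 2005, 2008, 2012])

summary = dersimonian_laird(effects, variances)
inflation = effects[0] / summary   # goal 1: initial vs. summary effect

# Goal 2: precision-weighted least-squares slope of effect on year.
w = 1.0 / variances
xm = np.sum(w * years) / np.sum(w)
ym = np.sum(w * effects) / np.sum(w)
slope = np.sum(w * (years - xm) * (effects - ym)) / np.sum(w * (years - xm) ** 2)
print(f"summary={summary:.3f} inflation={inflation:.2f} decline/yr={slope:.4f}")
```

With these toy numbers, the initial effect is roughly double the summary effect and the yearly slope is negative, i.e., a decline effect.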

Final report

Due to their nature, results from empirical research studies may represent a more or less accurate picture of reality. Less representative findings may stem from inappropriate study designs, approaches, or interpretations, but may also be chance effects. While peer review is intended to serve as a failsafe against the publication of suboptimally designed or interpreted studies, chance findings are largely immune to detection by peers. In empirical research, chance findings manifest themselves as inaccurate effect estimates, leading to over- or underestimation of the investigated effects, or to the passing of a statistical significance threshold although in reality there is no effect. This problem is typically dealt with through independent replications of newly established effects, which yield more accurate effect estimates once sufficient data have accumulated. However, a central assumption of the scientific method is that effect over- and underestimations occur about equally often, with neither scenario dominating the other. At the core of the present project lies the idea that this is not the case, because exploratory studies systematically report a disproportionate number of overestimated compared to underestimated effects throughout the psychological literature, regardless of the research question investigated. This is particularly problematic because exploratory studies attract more attention and are cited more often, thereby achieving a status of unfounded authority compared to replications. By examining the results of more than 570 research syntheses with over 51 million participants, published in five flagship journals in psychological science, we showed in our project that decline effects (i.e., decreasing effect sizes over time, indicating overestimation of the initial effect in a given field) are twice as likely to occur in the literature as effect increases.
Moreover, these declines are considerably stronger than the increases. These findings may be attributed to publication-related mechanisms that incentivize the publication of underpowered studies with spectacular (but unlikely) results. This interpretation is corroborated by our observation that the largest (and therefore most spectacular) effects reported in initial studies were the most inaccurate estimates. In other words, the most breathtaking effects were the most likely to be wrong. Although our results are so far based only on studies published in psychology, we expect them to generalize to other empirical disciplines as well. Our findings are a testament to the importance of modern open science practices in empirical research, but they also illustrate the need to move beyond merely voluntary preregistration and data sharing. Reforming editorial policies, incentivizing the publication of accurate rather than spectacular effects, and applying state-of-the-art bias detection methods are necessary to improve confidence in empirical research.
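One classic detection method in this family is Egger's regression test (the report does not enumerate the seven methods the project used, so this is an illustrative stand-in): regress the standardized effects on their precisions and check whether the intercept departs from zero, which signals the funnel-plot asymmetry produced when small, imprecise studies preferentially report large effects. A minimal numpy sketch with invented data:

```python
import numpy as np

def egger_test(effects, se):
    """Egger's regression test: regress standardized effects (y/se)
    on precision (1/se). In an unbiased literature the intercept is
    near zero; a large intercept signals small-study (publication) bias."""
    x, y = 1.0 / se, effects / se
    X = np.column_stack([np.ones_like(x), x])   # columns: intercept, slope
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - 2)       # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t = beta[0] / np.sqrt(cov[0, 0])            # t statistic for the intercept
    return beta[0], t

# Toy data: small (imprecise) studies report larger effects --
# the classic funnel-plot asymmetry pattern.
effects = np.array([0.90, 0.70, 0.50, 0.30, 0.25, 0.20])
se = np.array([0.40, 0.30, 0.20, 0.10, 0.08, 0.05])

intercept, t = egger_test(effects, se)
print(f"Egger intercept = {intercept:.2f} (t = {t:.2f})")
```

Here the intercept comes out clearly positive, flagging the built-in small-study bias in the toy data.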

Research institution(s)
  • Universität Wien - 100%
International project participants
  • Jelte (J.M.) Wicherts, Tilburg University - Netherlands

Research Output

  • 68 Citations
  • 4 Publications
  • 1 Methods & Materials
Publications
  • 2019
    Title Directional and regioselective hole injection of spiropyran photoswitches intercalated into A/T-duplex DNA
    DOI 10.1039/c9cp03398j
    Type Journal Article
    Author Avagliano D
    Journal Physical Chemistry Chemical Physics
    Pages 17971-17977
    Link Publication
  • 2019
    Title Effect Declines Are Systematic, Strong, and Ubiquitous: A Meta-Meta-Analysis of the Decline Effect in Intelligence Research
    DOI 10.3389/fpsyg.2019.02874
    Type Journal Article
    Author Pietschnig J
    Journal Frontiers in Psychology
    Pages 2874
    Link Publication
  • 2020
    Title Times are Changing, Bias isn’t: A Meta-Meta-Analysis on Publication Bias Detection Practices, Prevalence Rates, and Predictors in Industrial/Organizational Psychology
    DOI 10.31234/osf.io/mtv2h
    Type Preprint
    Author Siegel M
    Link Publication
  • 2022
    Title Times Are Changing, Bias Isn’t: A Meta-Meta-Analysis on Publication Bias Detection Practices, Prevalence Rates, and Predictors in Industrial/Organizational Psychology
    DOI 10.1037/apl0000991
    Type Journal Article
    Author Siegel M
    Journal Journal of Applied Psychology
    Pages 2013-2039
    Link Publication
Methods & Materials
  • 2020
    Title Meta-Shine: A one-stop-shop for calculating effect estimates, moderator analyses, and bias detection methods for meta-analysis
    Type Improvements to research infrastructure
    Public Access
    Link

Contact

Austrian Science Fund (FWF)
Georg-Coch-Platz 2
(Entrance Wiesingerstraße 4)
1010 Vienna

office(at)fwf.ac.at
+43 1 505 67 40

© Österreichischer Wissenschaftsfonds FWF