Sciency Words: The Replication Crisis

Hello, friends!  Welcome back to Sciency Words, a special series here on Planet Pailly where we talk about new and interesting scientific terms so we can expand our scientific vocabularies together!  In this week’s episode of Sciency Words, we’re talking about:

The Replication Crisis

There’s a quote that I hate which is frequently misattributed to Albert Einstein: “The definition of insanity is doing the same thing over and over again and expecting different results.”  Why do I hate this quote?  First off, as a matter of historical record, Einstein never said this.  But more importantly, doing the same thing over and over again to see if anything different happens is a surprisingly good definition of science.

Or at least it should be, which brings us to this week’s Sciency Word: the replication crisis.  As this brief introductory article retells it, the replication crisis began with “a series of unhappy events” in 2011.  Certain “questionable research practices” were exposed, along with several cases of outright fraud.  I’m going to focus on just one very noteworthy example: an American Psychological Association journal published a paper titled “Feeling the Future,” which claimed to show statistically significant evidence that human beings have precognitive powers.

When other researchers tried to replicate the “Feeling the Future” experiments, they failed to find this statistically significant evidence.  However, according to this episode of Veritasium, the American Psychological Association had a policy at the time of not publishing replication studies, and so they would not publish any of the research debunking the original “Feeling the Future” paper.  (I do not know if they still have that policy; I would hope that they do not.)
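To see why a single “statistically significant” result can fail to replicate, here is a toy simulation (my own sketch, not from the post or the Veritasium episode): under the conventional p < 0.05 threshold, roughly one in twenty experiments on a completely nonexistent effect will still look “significant” purely by chance.

```python
import random
import statistics

random.seed(42)  # fixed seed so the simulation is repeatable

def one_experiment(n=30):
    """Simulate an experiment where the true effect is zero:
    draw n samples from a standard normal and z-test the mean."""
    sample = [random.gauss(0, 1) for _ in range(n)]
    z = statistics.mean(sample) / (statistics.stdev(sample) / n ** 0.5)
    return abs(z) > 1.96  # roughly p < 0.05, two-sided

trials = 2000
false_positives = sum(one_experiment() for _ in range(trials))
rate = false_positives / trials
print(f"'Significant' results with no real effect: "
      f"{false_positives}/{trials} = {rate:.1%}")
```

The false-positive rate comes out near 5%, as expected. A journal that publishes flashy positive results but refuses replication studies is, in effect, filtering for exactly these flukes.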

The act of repeating experiments to see if anything different happens is a crucial part of how science works.  Or rather, how it should work.  But it seems this is not being done often enough.  And on those rare occasions when replication studies are performed (and published), a shocking amount of high-profile research turns out to be non-replicable.  This article sums up just how bad the replication crisis is:

One 2015 attempt to reproduce 100 psychology studies was able to replicate only 39 of them.  A big international effort in 2018 to reproduce prominent studies found that 14 of the 28 replicated, and an attempt to replicate studies from top journals Nature and Science found that 13 of the 21 results looked at could be reproduced.

That same article calls the replication crisis “an ongoing rot in the scientific process.”
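For concreteness, the replication rates in those quoted figures can be worked out directly (a quick computation of my own, using only the numbers quoted above):

```python
# Replication figures quoted above: (replicated, attempted) per effort.
efforts = {
    "2015 psychology reproduction project": (39, 100),
    "2018 international effort": (14, 28),
    "Nature/Science replication attempt": (13, 21),
}

for name, (replicated, attempted) in efforts.items():
    rate = replicated / attempted
    print(f"{name}: {replicated}/{attempted} = {rate:.0%}")
# → 39%, 50%, and 62% respectively
```

In other words, even in these organized efforts, somewhere between a third and two-thirds of prominent studies held up on a second try.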

But as I’ve been trying to say in several of my recent posts, science is self-correcting.  With the introduction of metascience—the scientific study of science itself—there is some hope that the root causes of the replication crisis can be identified, and perhaps changes can be made to the way the scientific community operates.

8 thoughts on “Sciency Words: The Replication Crisis”

  1. I don’t think I’d ever heard of that “Feeling the Future” study, or the APA refusing to publish counters. Just the claim itself seems like it should have warranted much greater scrutiny prior to publication. There are a lot of hard-nosed scientists in psychology, but the profession tolerates too many quacks.


    1. Yeah, I’m surprised that something like that would slip by peer review, even with all the problems the peer review process apparently has. I’m thinking about reading that paper out of morbid curiosity. When I looked it up, I was amused to see that it’s been cited an enormous number of times, but probably not for the reasons the author intended.


  2. I wasn’t so sure I shared your excitement about doing the same things we already know about over and over again to see if they come out differently. I do know some experiments that have been treated as -BEDROCK OF SCIENCE- that, when I read the original paper, turned out to be shakier than advertised. Definitely some things we are -SURE ABOUT- should get a re-check to make a firm foundation.
    Also, as a kid, I wanted to run a Laboratory of Things We Already Know*. I just didn’t see the hot air going out the open upper window and cooler air coming in the open lower window. I mean, I could measure the temperature of the warmer air that floated higher in the room and the cooler (denser) air lower down, but how did the hot air know to go out the window and make room for the supposedly cooler air coming in the bottom window? It just didn’t seem to happen. We should test that!
    * I could get a grant from the people at Zoolander.


    1. That’s something I read in several different places: it turned out that several papers that were assumed to be bedrock science could not be replicated. A laboratory of things we already know would be great. A journal of things we already know would be great, too.


      1. Thanks for that. Yeah, some of the papers I read for my work were not as definitive as they were purported to be. They weren’t wrong about what they described. They just needed some follow-on work, which no one did since academics like to be known for new work, not re-plowing old ground.


      2. It seems like that is a big part of the problem: not wanting to re-plow old ground. I kind of think there should be people who specialize in doing replication studies. People who make it their mission in life to replicate as many studies as possible and who build a reputation and a career doing just that. If the right incentives were in place, I’m sure some people would do that, and it would be a great service to science.

