Hindsight bias skews our interpretation of events and information, making it seem like they were predictable or just not that surprising. This bias can cause some real problems, but the good news is, once you are aware of it, there are some things you can do to reduce its effects.

Hosted by: Hank Green
In 1949, a group of sociologists published a book called The American Soldier. In it, they shared the results of years of research they'd done on soldiers in World War II.

It was based on a bunch of surveys and questionnaires, and they learned a lot about life at war. Like, one survey found that soldiers from rural areas were generally happier than soldiers from the city. Another found that men were more eager to go home during times of fighting than after the war.

But the book didn't go over so well. People slammed the researchers for going to so much trouble to state the obvious. And maybe you've rolled your eyes at studies like that before — studies that don't seem to say anything surprising.

Like, why would someone pay researchers and take space in a prestigious journal just to say things that are perfectly intuitive? The thing is, that's actually a pretty common response to psychology studies, but not because so many of them are intuitive. The real reason is that we're often guilty of something called hindsight bias.

That's the tendency to think information is less surprising once you already know it. And it's just human nature. But the problem is, it doesn't just apply to science.

It can skew how much credit or blame you give people — including yourself — for things that happen. The good news is, once you know about it, there are things you can do to reduce that bias. Scientists first identified hindsight bias in 1975, decades after the backlash against The American Soldier.

And researchers believe it happens because, once you learn something, your brain can't help but draw connections between that new information and all the other things you already know. So you suddenly recognize patterns that make that new information seem unsurprising.

Like, when you hear that soldiers from rural areas tend to be in better spirits than other soldiers, you might think, well, that's pretty obvious, because someone who grew up in the country might be more rugged — they would've adjusted better to the conditions of war. But what if the study had found the opposite?

What if city people were the happier ones overall? In that case, you might reason that city people had been exposed to more of the world, so they had an easier time overseas than a country boy who'd never left home. Either way, that's your hindsight bias speaking.

Things seem obvious… once you know them. That's a pretty harmless example, but in other situations, hindsight bias can cause real problems. Imagine you're a stockbroker, for instance.

Every day, you have to make decisions about the best places to invest money — and that can be hard to predict, no matter how good you are at your job. But if you happen to make a bunch of investments that turn a large profit, you might think you earned that success — because you're just that good! And now that you're all confident, you might start making risky decisions, because hindsight bias has you thinking you know more than you do.

But imagine if the same investments had turned out poorly. Now you might be down on yourself for messing things up when, really, you couldn't have known better. So, whether an outcome is positive or negative, hindsight bias just tends to make you think it was more obvious than it actually was.

And that often ends up with someone getting too much credit or too much blame. That kind of thing can have really serious consequences. Like, if you're a doctor and someone thinks you should have caught a tumor sooner than you did, you can be sued for malpractice.

Or if your employee gets hurt on the job, someone could argue that you should have been able to prevent it. In both cases, a court may be responsible for deciding if you should have known better, and since the jurors know how the story ends, they're susceptible to hindsight bias. Accounting for this bias in legal situations is super complicated.

But on an individual level, once you're aware of it, there are ways to reduce it. One good strategy is called consider-the-opposite. All you have to do is stop for a second before you give credit or place blame, and imagine if the exact opposite scenario had happened.

Like, say you work at a company where an employee was caught stealing. And you have to decide if the employee's manager should be held responsible — after all, there were all sorts of warning signs. The employee was always coming in to work before or after hours, they spent an unusual amount of time visiting coworkers' offices, and they were obsessed with true crime shows.

Surely the manager should have known something was up! Except, consider the opposite situation. If the employee had been making a lot of money — instead of stealing from people's desks — those same “warning signs” might not have seemed suspicious at all.

In fact, you might think it was obvious that this was your star employee, especially because of all that extra time they spent connecting with their coworkers and working after hours. It turns out, if you just think about the fact that other outcomes were possible, you're less likely to assume that any single outcome was obvious. Hindsight bias is the price we have to pay for the fact that we have hindsight at all, which is generally a good thing.

Being able to look back and connect the dots between things that have happened helps us make better decisions and learn from our mistakes. Basically, it's a great tool for planning for the future. But when it comes to making sense of the past, it's worth remembering that hindsight isn't always reliable.

That bias is part of us, though, so the better we can understand it, the better we can work around it. Thanks for watching this episode of SciShow Psych! And thanks especially to our patrons on Patreon for your support.

If you like the show, they're the reason it exists! It takes a lot of different people to make a SciShow episode, and we couldn't do it without your help. If you'd like to learn more about how you can support us, head over to Patreon. {♫Outro♫}