YouTube: https://youtube.com/watch?v=d2r7Bk1NlgU
Previous: Family, Power, and Legacy: Crash Course Latin American Literature #9
Next: Statistical Thinking in Science: Crash Course Scientific Thinking #2

Categories

Statistics

View count: 691,083
Likes: 29,947
Comments: 340
Duration: 11:49
Uploaded: 2026-01-20
Last sync: 2026-04-14 00:30

Citation

Citation formatting is not guaranteed to be accurate.
MLA Full: "Introduction to Cognitive Bias: Crash Course Scientific Thinking #1." YouTube, uploaded by CrashCourse, 20 January 2026, www.youtube.com/watch?v=d2r7Bk1NlgU.
MLA Inline: (CrashCourse, 2026)
APA Full: CrashCourse. (2026, January 20). Introduction to Cognitive Bias: Crash Course Scientific Thinking #1 [Video]. YouTube. https://youtube.com/watch?v=d2r7Bk1NlgU
APA Inline: (CrashCourse, 2026)
Chicago Full: CrashCourse, "Introduction to Cognitive Bias: Crash Course Scientific Thinking #1," January 20, 2026, YouTube, 11:49, https://youtube.com/watch?v=d2r7Bk1NlgU.
Ever fallen for fake news on social media because it felt real? Or found yourself afraid to go in the ocean after watching too many videos about sharks? In this episode of Crash Course Scientific Thinking, we explore how cognitive bias shapes the way we think about the world, and how scientific thinking can help us overcome it.

Introduction: Earth as the Center 0:00
Recognizing Patterns 0:44
Common Cognitive Biases 2:44
Overcoming Bias Through Science 6:02
How to Combat Cognitive Bias 8:53
Review & Credits 10:18

Sources: https://docs.google.com/document/d/1Kv2_mFsDxQURuFDxpflD45dy-pCxlqnM9iPSNYxigmQ/edit?tab=t.0#heading=h.vk5i2fgqyumq

***

Support us for $5/month on Patreon to keep Crash Course free for everyone forever! https://www.patreon.com/crashcourse

Or support us directly: https://complexly.com/support

Join our Crash Course email list to get the latest news and highlights: https://mailchi.mp/crashcourse/email

Get our special Crash Course Educators newsletter: http://eepurl.com/iBgMhY

Thanks to the following patrons for their generous monthly contributions that help keep Crash Course free for everyone forever:

DexcilaDou, Martin G. Diller, Johnathan Williams, Allison Wood, EllenBryn, Katrix , Jason Terpstra, Evan Nelson, Jennifer Wiggins-Lyndall, SpaceRangerWes, Dalton Williams, Chelsea S, Thomas Sully, Matthew Fredericksen, AThirstyPhilosopher ., Michael Maher, Mitch Gresko, Gina Mancuso, Roger Harms, Shruti S, Quinn Harden, Reed Spilmann, Brandon Thomas, Emily Beazley, Rie Ohta, oranjeez, UwU, Elizabeth LaBelle, Leah H., David Fanska, Andrew Woods, Katie Hoban, Kevin Knupp, Barbara Pettersen, Ken Davidian, Stephen Akuffo, Toni Miles, Steve Segreto, Kyle & Katherine Callahan, Laurel Stevens, Tanner Hedrick, Kristina D Knight, Samantha, Krystle Young, Perry Joyce, Scott Harrison, Alan Bridgeman, Breanna Bosso, Matt Curls, Liz Wdow, Jennifer Killen, Duncan W Moore IV, Sarah & Nathan Catchings, team dorsey, Bernardo Garza, Trevin Beattie, Pietro Gagliardi, John Lee, Eric Koslow, Indija-ka Siriwardena, Jason Rostoker, Siobhán, Ken Penttinen, Nathan Taylor, Barrett, Les Aker, ClareG, Rizwan Kassim, Constance Urist, Alex Hackman, Triad Terrace, Katie Dean, Jason Buster, Emily T, Stephen McCandless, Thomas, Joseph Ruf, Wai Jack Sin, Ian Dundore, Erminio Di Lodovico, Evol Hong, Tandy Ratliff, Caleb Weeks, Luke Sluder

__

Want to find Crash Course elsewhere on the internet?

Instagram - https://www.instagram.com/thecrashcourse/

Facebook - http://www.facebook.com/YouTubeCrashCourse

Bluesky - https://bsky.app/profile/thecrashcourse.bsky.social

CC Kids: http://www.youtube.com/crashcoursekids
[0:00] Hank Green: Two thousand years ago, people looked up at the sky, and they saw that everything up there seemed to move. So, naturally, they concluded that the Earth was staying still and everything else was rotating around us. It was a story that just made sense to millions of people. And it stuck around well into the 16th century.

If I’d been alive then, I would have believed this story. I mean, even today I feel like I must be at the center of something. That idea, of course, was wrong.

But we didn't believe it forever. We found ways to step out of our old stories, and find something much more interesting… Hi, I’m Hank Green, and this is Crash Course Scientific Thinking.

[0:39] [THEME MUSIC]

[0:44] Hank Green: Science! It is a never-ending quest for knowledge. A way of interrogating our universe to figure out how it works, a tool to guide us when our intuition isn’t enough. And also, it can be quite fun.

Sometimes you get to blow stuff up! In the years since Copernicus put forth the theory that the Earth revolves around the Sun, we’ve learned that some questions are just too big, too complex, or too bizarre to trust our gut with. When we rely on intuition alone to answer those big, complicated questions, our brains fall prey to cognitive biases, predictable weaknesses in the way we’ve evolved to think.

Our brains are very good at finding patterns. We’ve evolved this skill because it’s super helpful for survival. It helped our ancestors spot the tell-tale signs of predators and recognize when certain plants might be poisonous.

We have always paid attention to and learned from our world. Those pattern recognition skills have also been linked to some very special human qualities, like our ability to imagine and invent. Like, it’s what made me notice that “Hank” and anglerfish sound vaguely similar so that I could invent something called the Hanklerfish.

Bad puns are still good pattern recognition. It’s also why we are so good at telling stories, because, really, that's all a story is: a recognizable pattern of information. And more importantly, our highly evolved pattern recognition skills allow our brains to apply mental shortcuts, or heuristics, that help us solve simple problems quickly and make life livable.

They are the brain’s way of copy-pasting stories we already have onto new information so that we don’t expend a bunch of brainpower in every direction all the time. They’re why I don’t have to stop and think “What will happen if I touch this hot stovetop?” My brain picked up the pattern of “touching hot things bad” long ago, and it keeps resurfacing it whenever I need it to keep me safe. And that’s all well and good for avoiding hot things.

But those same mental shortcuts also open us up to cognitive bias. Now, cognitive bias isn’t inherently bad. And that’s good because everybody’s got it.

And I'm not talking about explicit bias here, where someone is consciously aware that they are discriminating against a person. Cognitive biases happen unconsciously. They are implicit biases, when our decision-making is influenced by beliefs and patterns we aren’t even consciously aware of.

Consider this modern-day example of a cognitive bias skewing many people’s perception of risk. In early 2025, after a midair collision between a commercial airplane and a military helicopter, people started paying a lot of attention to every near miss and airport mishap. And it felt like planes were crashing every day.

The media ran with this, and the algorithms amplified it. At the time, 65% of Americans said they felt more anxious about flying. But when we took a look at the actual data, the number of accidents was about the same as over the same period in 2024.

And flying remained an incredibly safe form of travel. So why did so many of us feel like it wasn’t? Well, for efficiency, our brains often put more weight on the most readily "available" information around.

We call this, wait for it, availability bias, when people make judgments based on the information that’s easily available. The truth is, you’re far more likely to be in a car crash than a plane crash. But when plane crashes do happen, we hear about them a lot.

Especially in today’s algorithm-driven news cycle. So, that information is way easier to call to mind than the fact that over 120 people in the U.S. die in car accidents every day.

Availability bias is a big way our brains mislead us. But probably the biggest is confirmation bias, that’s our brain’s tendency to accept information that agrees with things we already believe, and filter out stuff that contradicts it. Like, if someone already believed that flying was dangerous, then the news stories about the 2025 crash likely reinforced that belief.

Or here’s an example you might be familiar with. Have you ever been told that you are a visual learner? Or maybe that you learn best by listening to other people?

In a study published in the Journal of Educational Psychology, more than 90% of participants said that people learn better when they’re taught using the “learning style” that best suits them. A similar survey of 39 colleges in the US revealed that 29 of them teach “learning style theory” as part of their guidance for teachers. But here’s the kicker: there is no scientific evidence to support the idea of personalized learning styles.

So why is this myth so prevalent? Researchers have pointed out that it persists, at least in part, thanks to confirmation bias. As one researcher put it, “People are obviously different and learning styles appear to offer educators a way to accommodate individual learner differences.” So someone who believes they’ve seen these methods have a positive impact might reject evidence against them.

Just like I might, and sometimes do, reject evidence that the butt is not part of the legs. And these are just a few of the cognitive biases we all have. We haven’t even gotten into how we cling to first impressions (that’s anchoring bias), or how we tend to believe the events of the past were predictable (that's hindsight bias).

Ultimately, we all want the world to make sense, but when we rely on these simple shortcuts for the wrong things, they can keep us from being open to evidence. So what do we do about this? Well, over the last few centuries, humans have developed a new way of looking at the world...one that doesn't just explain what feels right, but tests what is right.

This is what people are usually talking about when they say the word “Science.” Not the body of knowledge, but the systems used to interrogate the universe. From astronomy to zoology, science is a way of building knowledge that’s durable, communal, and long-lasting. And to help us learn more, I think we need a little Sage Advice.

[6:35] [funky retro music plays]

[6:43] SAGE: Let’s give it up for science, everyone!

[6:45] HANK: Hi Sage! Everybody, this is Sage.

[6:47] SAGE: Hello! I'm Sage the Bad Naturalist, I'm a dork, a painter, and creator of the YouTube Channel... Sage the Bad Naturalist! I make videos about fungi, plants, research papers, and learning something new, even when science goes wrong.

And I’m here to help Hank spread some… Sage advice. I love that we’re talking about cognitive bias today, Hank. Because the process of science is actually designed to overcome biases—from methods, to reliance on evidence, and especially the fact that science is communal.

It gets vetted by a whole community, not just one guy in a bathtub shouting “Eureka!”

[7:23] HANK: No shade to Archimedes, of course. I personally love peer review, which we’ll talk about in a later episode. But Sage, what is your favorite bias-busting science method?

[7:32] SAGE: For me, it has to be randomized controlled trials. It’s a multi-step process for research that’s used a lot in testing new medicine, and each step was designed to reduce the chance of bias. Say scientists want to test a new diabetes medicine. Well, there’s a lot of potential for bias in that process.

Like, they might accidentally influence the results of the trial in their selection of participants.

[7:54] HANK: That’s why we say everyone has cognitive biases. Even scientists.

[7:58] SAGE: Exactly! So to avoid those biases, scientists select the participants at random. And when they’re testing a new drug, there's all kinds of potential for confirmation bias. So they sort some of the participants into a control group that gets either no treatment, a placebo—which looks like the real drug but doesn’t actually do anything—or an older, proven drug.

That way, they can compare the results.

[8:21] HANK: I love a randomized controlled trial because it is such a good example of the ways scientists have recognized their own potential for bias and designed their research to reduce it as much as possible. Like how sometimes, when researchers need to eliminate bias even further, they do double-blind studies, where neither the participants nor the scientists know which participants are in the treatment group and which are in the control group.
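[The random assignment and coded groups described here can be sketched in a few lines of Python. This is an illustrative toy, not anything from the episode: the function name, group codes, and participant labels are all hypothetical.]

```python
import random

def randomize_trial(participants, seed=None):
    """Randomly split participants into two coded groups (a toy RCT sketch)."""
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)                   # random order guards against selection bias
    half = len(shuffled) // 2
    assignments = {p: "A" for p in shuffled[:half]}        # coded group "A"
    assignments.update({p: "B" for p in shuffled[half:]})  # coded group "B"
    # In a double-blind trial, only this sealed key reveals which code means
    # the real drug; participants and data-collectors see only "A" or "B".
    sealed_key = {"A": "new drug", "B": "placebo"}
    return assignments, sealed_key

assignments, sealed_key = randomize_trial([f"patient-{i}" for i in range(10)], seed=7)
```

[Because only the sealed key maps the codes back to treatments, the people running the trial can record outcomes for groups "A" and "B" without knowing which group received the placebo.]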

[8:42] SAGE: Right?! Scientific thinking at work! We’ve come so far from thinking we were the center of the solar system.

[8:47] BOTH: Science hi-five!

[8:49] SAGE: And that’s been your Sage Advice!

[8:51] HANK: Thanks, Sage! And here’s the coolest part: you can watch out for cognitive bias in your own thinking, too. One of the easiest and most important things you can do to fight back against bias is just understanding that it’s real, and accepting that you have it. Being aware of bias gives you the chance to look out for its influence on your decisions.

Anybody who says they don’t have any biases is just waving a huge red flag. Another way is to interact with lots of people, especially people who are different from you. Bias likes to tell us that our experience is the only reality.

But to really understand the world, we need community. Scientists do this, too. They are endlessly testing and vetting each other's claims.

Without expertise from a diversity of scientists, the scientific process would fail. And by that same token, you can’t overcome your biases on your own, either. Unlike those gut feelings we get, science requires evidence.

So whenever we consume science news that goes against an idea or experience we believe to be true, it's good to remember that our cognitive bias might be working against us. Which ties into another big bias-buster, cognitive flexibility, your ability to imagine options or explanations beyond your gut reaction. In other words, being able to say, “You know what?

Maybe I was wrong.” I know, right? In this economy? On this Internet?

You can do it, though. It’s good for you. Throughout this series, we’re gonna talk a lot more about how the scientific process works, so that when we see news stories about science, we’ll better understand what’s going on behind the scenes.

And that knowledge is gonna help us all respond better to the science on our social media feeds, in our group chats, and at our dinner tables. And by the way, isn’t it just extremely wild that our brains can think of ways to outsmart the ways they think? Remember, we all have cognitive biases, they’re not something to be ashamed of.

They’re just our brains’ way of solving problems faster. But the world is very weird, so often, our mental shortcuts don’t work. That’s where scientific thinking comes in.

Science relies on evidence evaluated by a community of experts, and it has systems that are designed to reduce bias. It’s not perfect. But it’s one of the best tools we have.

Next time, we’re gonna explore the wild world of statistics. I’ll see you then. This episode of Crash Course Scientific Thinking was produced in partnership with HHMI BioInteractive, bringing real science stories to thousands of high school and undergrad life science classrooms.

If you’re a teacher, visit their website for resources that explore the topics we discussed in today's video. Thanks for watching this episode of Crash Course Scientific Thinking, which was filmed in Missoula, Montana, and was made with the help of all these nice people. If you want to help keep Crash Course free for everyone, forever, you can join our community on Patreon.