YouTube: https://youtube.com/watch?v=dRiuhK4a9xE
Previous: Why You Think You Look Better in Selfies
Next: Why YOU Should Take a Break to Watch This Video

Statistics

View count: 230,308
Likes: 4,308
Comments: 252
Duration: 31:05
Uploaded: 2021-06-10
Last sync: 2024-04-17 07:45
You have a lot of choices if you’re looking to learn a new language, from Spanish to coding, or even whistling! And there are some broad similarities and patterns in the ways our brains process these different forms of communication.

Hosted by: Brit Garner

Original Episodes:
Is Coding a Math Skill or a Language Skill? Neither? Both?
https://youtu.be/xPecMsFmEm4
What Whistled Speech Tells Us About How the Brain Interprets Language
https://youtu.be/7gvfDB5_cz4
Will Learning Another Language Make You Smarter?
https://youtu.be/wf_M5l2y3sM
Want to Speak a Foreign Language Better? Have a Drink
https://youtu.be/mDxSU9LJxOc
What We Often Get Wrong About the Brain's "Language Centers"
https://youtu.be/kjwIgKa560s
----------
Support SciShow Psych by becoming a patron on Patreon: https://www.patreon.com/SciShowPsych

SciShow has a spinoff podcast! It's called SciShow Tangents. Check it out at https://www.scishowtangents.org
----------
Become a Patron and have your name featured in the description of every SciShow Psych episode! https://www.patreon.com/SciShowPsych
----------
Looking for SciShow elsewhere on the internet?
Facebook: http://www.facebook.com/scishow
Twitter: http://www.twitter.com/scishow
Tumblr: http://scishow.tumblr.com
Instagram: http://instagram.com/thescishow
----------
Images:
https://www.istockphoto.com/photo/woman-talking-letters-in-head-coming-out-of-open-mouth-gm486962714-73857403
https://www.storyblocks.com/video/stock/computer-hacking-in-process-cyber-security-concept-b-rnsuel-j2n7ypg4
https://www.storyblocks.com/video/stock/medium-long-shot-of-teacher-explaining-to-pupils-in-a-chinese-language-class-budfdclbsjy0qcgg3

 (00:00) to (02:00)



[INTRO MUSIC]

BRIT GARNER:
You have a lot of choices if you're looking to learn a new language, from Spanish to coding to even whistling. See, there are some broad similarities and patterns in the ways our brains process these different forms of communication. Let's have a look.

We'll kick things off with computer code, which sounds more like math than a traditional language. But since many folks might end up learning to code at some point, researchers are interested in understanding exactly how our brain interprets it.

Does it see code as a language? And can that help us learn to code more effectively? Well, let me explain.

Scientists generally understand what parts of the brain are involved in talking, writing, and solving math problems. But when it comes to reading and writing computer code, the brain is more of a mystery.

Programming languages like Python and C++ are kind of like languages because they contain words and abbreviations that follow grammar-like rules. But they're also full of symbols and variables, which might make them seem more like algebra.
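To see both flavors side by side, consider a tiny Python sketch (an illustrative example, not one from the episode): the keywords read like terse English, while the return line is basically algebra.

    # Illustrative only: English-like keywords ("def", "return") follow
    # grammar-like rules, while the expression works like an algebra formula.
    def fahrenheit_to_celsius(temp_f):
        return (temp_f - 32) * 5 / 9

    print(fahrenheit_to_celsius(212))  # -> 100.0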

Many people have wondered: Does the brain process code as a language or as math?

The question may seem abstract, and the answer isn't black and white. But understanding how the brain interprets code can help us teach it better. And it can also give us unique insights into how our brains work.

High-level programming languages like Python and C++ provide instructions to computers in a way that both computers and people can understand. They were developed as a happy medium between human language, which is hard for computers to process, and computers' native language of binary code, which is really hard for people to comprehend.
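For a concrete peek at that gap, here's a small sketch using Python's standard-library dis module (illustrative only): it reveals the lower-level instructions that a human-readable line is compiled into before the machine runs it.

    # Disassemble a tiny function to see the low-level instructions
    # hiding beneath the human-readable source.
    import dis

    def add_one(x):
        return x + 1

    dis.dis(add_one)  # prints instructions such as LOAD_FAST and RETURN_VALUE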

And understanding how we understand programming has the potential to reveal how the brain learns novel complex skills.

Humans didn't evolve with special neural regions dedicated to computer programming. After all, C++, Java, and Python didn't even exist 50 years ago.

But now, many people are fluent in coding languages. So studying code cognition can give us clues to how the brain repurposes its circuitry to process weird, novel things it never encountered over the course of evolution, which is really cool.

 (02:00) to (04:00)



Plus, it's practical. We may be headed for a future where lots of people need to learn code—at least to some extent. So it's essential to discover how we can teach coding effectively.

But to accomplish that, we have to figure out how our brains interpret code. And here, there have been two main schools of thought.

One hypothesis is that your brain categorizes code like it would Swahili or Spanish. That is, it activates the language system: a network of regions involved in linguistic tasks.

The other idea is that your brain co-opts the neural circuitry we use for some kinds of math, including symbolic math like algebra.

The math-centric parts of the brain are located in the multiple demand system, a larger network of regions involved in all sorts of logic and problem solving tasks.

Researchers have been debating all of this for a while, without a clear answer. Then, in a 2020 study, neuroscientists from the Massachusetts Institute of Technology investigated the question using brain imaging.

They recruited 43 proficient coders, put them in an fMRI machine, and had them solve problems shown either as a sentence or using snippets of code.

About half the participants were presented with Python, a popular text-based programming language. The others used ScratchJr, a visual programming language designed for kids, which was expected to have less effect on the language system.
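The study's actual stimuli aren't reproduced here, but a hypothetical problem in the same spirit might ask a participant to read a snippet like this and predict its output:

    # Hypothetical example of a code-reading problem (not from the study itself):
    # how many words are longer than four letters?
    words = ["brain", "code", "language"]
    total = 0
    for w in words:
        if len(w) > 4:
            total += 1
    print(total)  # -> 2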

And based on the fMRI results, it turns out that neither the language camp nor the math camp has the whole story right.

The study found that when most participants read and built their code, the language processing area of their brains wasn't tapped much at all. And the multiple demand system was strongly activated, but not in the regions you'd expect for a purely math-related task.

See, many math tasks involve certain regions of the left hemisphere—at least, for right-handed people, like those in the study. But coding recruited large parts of the multiple demand system in both the right and left hemispheres that are involved in logic and all kinds of complex cognitive tasks.

Essentially, when we read and comprehend code, we use the same parts of the brain we use when we think really hard about solving a challenging problem, not a basic math exercise.

This means your brain doesn't see coding as strictly language or math.

 (04:00) to (06:00)



It’s an uncategorized thing that we process in a unique way, which is pretty neat.

But it makes life more difficult for instructors teaching coding. Because if brains saw code just as words, we could teach it like we teach foreign languages. And if brains saw code just as math, we could teach it like we teach algebra or trigonometry.

But since it interprets code as neither of those, computer science educators will have to get creative and teach coding like it’s something totally new. Which, of course, it is.

Except it gets more complicated than that because it might also be a matter of timing.

The MIT researchers hypothesize that people taught coding as kids might use their language system more compared to adults. That’s because in kids, the language system is still developing, and what their brain sees as language isn’t set in stone yet.

In contrast, the multiple demand system may remain more flexible throughout a person’s life, so it might be more activated when adults learn code.

That means that when we teach coding might make a big difference in how we should teach it.

In addition, learning coding might actually rewire the brain. When we repeatedly perform an action, whether it’s reading books or playing the guitar, the brain forms new and stronger connections between the neurons involved.

The MIT researchers think that people who’ve been coding for decades could form specialized regions in their multiple demand system that are dedicated to Python, or C++, or whatever programming languages we invent in the future. That means we have more to learn if we want to harness that flexibility when we teach.

But it’s also a pretty amazing example of what our brains can do when we invent brand new situations for them to deal with.

But what happens when instead of a string of code, you just have sound? And I'm not talking about the sound of my beautiful voice. No, I'm really not. I'm not talking about that.

I'm talking about whistling. Which, fun fact, I actually can't do.

So, how does our brain tackle that one? Here's Anthony with more.

ANTHONY BROWN:
On every continent besides Antarctica, you can find groups of people who communicate by whistling. And not just like, to get someone's attention or say, "I'm over here!"

 (06:00) to (08:00)



These are full-blown languages.

For a long time, all sorts of researchers have been interested in this form of communication, which seems to blur the lines between language and music. And recently, research by neuroscientists has shown that the way language works in the brain is not exactly what we thought.

Although whistled languages emerged separately all over the world, they tend to have some key features in common. Like, they usually exist among groups of people living in spread-out communities who need to communicate over long distances.

Because conveniently, the high pitch and volume of a whistle can travel a lot farther than regular speech—up to 10 kilometers away, in some cases.

And not only do whistles travel farther, they also stand out more against the sounds of rivers, wind, and animal calls, so they can be a great communication tool in rugged environments. Like, in the mountains of Oaxaca, Mexico, police actually used their whistled language for years before they switched to walkie-talkies.

But these languages don't just replace speech. In many cases, whistled languages are adaptations of the local spoken language.

And that's one of the things that makes them so interesting to neuroscientists. Because, oddly, even though they're two versions of the same language, the brain doesn't process them the same way.

Two whistled languages that have been studied in depth are Silbo, which is based on Spanish and found in the Canary Islands, and a Turkish-based language found in the mountains of Turkey.

Here's a sample of Silbo [sample of Silbo plays]. And here's a sample of the Turkish-based language [sample of Turkish-based whistled language plays].

They both imitate the intonations and stresses of speech, which can easily be reproduced in whistles. Meanwhile, the things that can't be imitated, like vowels and consonants, get translated into cues like rising and falling pitches or different tone lengths.
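As a toy sketch of that translation step (a deliberate oversimplification, not a faithful model of any real whistled language), imagine mapping each vowel to a whistle pitch so a word becomes a short melody:

    # Toy mapping from vowels to whistle pitches in hertz (made-up values).
    # Real whistled languages also encode consonants as pitch transitions
    # and glides, and the contours are continuous rather than discrete.
    VOWEL_PITCH_HZ = {"i": 2800, "e": 2400, "a": 2000, "o": 1600, "u": 1300}

    def whistle_contour(word):
        return [VOWEL_PITCH_HZ[ch] for ch in word.lower() if ch in VOWEL_PITCH_HZ]

    print(whistle_contour("ballena"))  # -> [2000, 2400, 2000]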

And the fact that they're derived from speech means that these languages are just as rich and complex as spoken language, so speakers can have fluent, nuanced conversations in either one.

 (08:00) to (10:00)



But internally, the brain is actually doing something different depending on which language is being used.

In general, the left hemisphere of the brain does a lot of the heavy lifting when it comes to processing speech, so spoken language has traditionally been thought of as a left-brained skill.

But whistled languages paint a more complicated picture because they involve the right hemisphere, too.

Researchers learned this during a 2015 experiment that studied the comprehension of spoken and whistled language in the people of northeastern Turkey. They used a technique called dichotic listening, which is a way of identifying asymmetries in the brain, based on how people register sounds played simultaneously in each ear.

In general, sound signals that have to travel farther within the brain to reach the area where they get processed register slightly later than others.

So even though the sounds are played simultaneously, there will be a small difference in the arrival time between the sound from one ear and the sound from the other. But since the signals come so close together, people typically only register the sound that arrives first.

It gets a little confusing here because the brain isn't wired how you might expect: signals from the left side of your body often go to the right side of the brain, and vice versa, including sounds.

So, when a dichotic listening task uses speech sounds, people usually hear what's played on the right side, since the right ear has more of a direct path to the language areas on the left side.
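As a toy model of that logic (with hypothetical travel times, purely for illustration), the ear whose signal takes the shorter neural route is the one whose sound gets reported:

    # Both ears receive a sound at the same moment; the listener reports
    # whichever signal reaches the processing area first.
    def reported_ear(left_route_ms, right_route_ms):
        return "left" if left_route_ms < right_route_ms else "right"

    # For speech, the right ear has the more direct route to the
    # left-hemisphere language areas (hypothetical numbers).
    print(reported_ear(left_route_ms=22.0, right_route_ms=18.0))  # -> right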

And that's exactly what the researchers found in this study too. As 31 whistle-speakers listened to sounds from spoken Turkish, they identified the ones from their right ear more frequently.

But that wasn't the case with whistled language. When the same participants listened to whistled speech, they identified the sounds from their left and right ears about equally.

And that result suggests that language processing may not be inherently asymmetrical.

 (10:00) to (12:00)



It's just that, by focusing only on one kind of language for so long—the spoken kind—we seem to have formed ideas about language processing that were overly simple.

In reality, the reason language has traditionally been labeled as left-brained may have more to do with what we're hearing than the way we're wired.

And it makes sense: The left side of the brain processes information that's received on relatively short time scales. And in spoken language, lots of cues come in on short time scales, like the rhythm of syllables or the transitions between sounds, so it's mostly processed on the left side.

But in whistled speech, a lot of that information gets coded into musical cues like pitch and melody, which have a longer cadence. And the left side of the brain doesn't deal with that—the right side does. That likely explains why whistle-speakers use both sides of their brain to process whistled language.

Not only does this tell us that language comprehension isn't strictly a left-brained activity, it highlights the fact that spoken language isn't just a string of syllables. It consists of all sorts of different cues, and certain parts of the brain may be more or less involved in parsing them, depending on what those cues are. It also highlights the incredibly diverse ways that we use language to transmit ideas between people.

BRIT GARNER:
Cool! So you could learn coding or whistling, but does learning them make you any smarter?

Well, there are some claims out there that knowing more than one language gives your brain a bit of a leg up. And there's even some research to back up the idea. But, as usual, the truth isn't that simple. Here's Hank with more.

HANK GREEN:
Parents were once warned against raising children to speak more than one language. It's bad for kids' cognitive development, they were told, and will result in bad grades and a lower IQ!

And that ridiculous claim is still sometimes repeated, especially here in the United States. But times have mostly changed.

Now, if you believe the headlines, being bilingual makes you smarter and more creative.

 (12:00) to (14:00)



And those headlines don't come from nowhere. There is research which suggests that bilingualism provides some specific cognitive advantages.

And you can hardly blame the press for covering these studies, because it's such an appealing idea—teach your child French and you get a better child! More creativity, multitasking, and academic performance in other subjects all for free!

But if that sounds too good to be true, that's because it is. You see, there are also studies that don't find an advantage. Those don't get the same excited coverage.

In this case, though, the media aren't really the ones to blame. When it comes to the effects of bilingualism on the brain, there's confusion and bias on the scientific side, too. And it all goes to show just how hard it can be to understand what really goes on in our heads.

Learning another language definitely has benefits that no one can argue with. Like, for example, you'll know another language! And it even makes sense that it could benefit your brain in other ways.

The main benefit is thought to be to executive functions—the processes that control complex cognitive tasks like attention, problem solving, planning, and so on.

And that hypothesis isn't unreasonable. It's thought that these processes are kind of like muscles: the more you use them, the better you get at them.

And research has found that all sorts of cognitively challenging activities improve executive functions. Like, playing video games can make you better at assessing risks and placing bets. And music training can improve your ability to focus on specific tasks.

Since juggling two or more languages in your brain is cognitively challenging in a lot of ways, it could have similar positive effects. Constantly switching between vocabularies could help you be a better multitasker, for example, if it made you generally better at quickly shifting your brain from one thing to another.

But more than one analysis of the research has found that the evidence for such benefits is weak and inconsistent.

For example, a 2015 review in the journal Cortex concluded that over 80% of the tests conducted across four years of studies didn't show a bilingual advantage.

 (14:00) to (16:00)



Those that did had serious problems with their methodology—like, they had small sample sizes or inadequate controls.

But there's a more foundational problem with the published research on bilingualism: it doesn't tell the whole story.

This was pointed out by a study published in 2014 in the journal Psychological Science. The researchers started by looking at the research presented at conferences from 1999 to 2013. Roughly half these presentations found some advantage for bilingualism and half didn't.

Then, they looked at which ended up getting published in journals, and found something striking: 68% of the positive studies got published, while only 29% of the negative ones did.

The published and unpublished studies didn't consistently differ based on sample size, experimental tests used, or statistical power. A study simply had a better chance of getting published if it supported the idea that bilingualism gives people a cognitive boost, and a worse chance of getting published if it showed the opposite, regardless of the quality of the work.

This is a phenomenon known as publication bias, and it's not unique to this situation or even to psychology. It's a pervasive issue scientists from all fields are grappling with because it can undermine the research that is published.
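Here's a minimal simulation of that effect (illustrative only; it just reuses the publication rates mentioned above): even when the true effect is zero, filtering results by direction makes the published record look positive.

    import random

    random.seed(42)
    # 10,000 simulated study outcomes drawn around a true effect of zero.
    studies = [random.gauss(0.0, 1.0) for _ in range(10_000)]

    # Suppose positive findings get published 68% of the time, negative ones 29%.
    published = [e for e in studies
                 if random.random() < (0.68 if e > 0 else 0.29)]

    print(round(sum(studies) / len(studies), 3))      # ~0.0: the unbiased truth
    print(round(sum(published) / len(published), 3))  # clearly > 0: the published record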

For example, a 2018 meta-analysis of over 150 studies on adults did find bilinguals were slightly better at some executive functions. But those advantages disappeared when the researchers corrected for publication bias.

Now, it's important to point out that none of this amounts to proof that there are no cognitive advantages to bilingualism. But it's clearly going to take a lot more work to figure out if there are, and if so, whether any of them are unique, or if studying Japanese is basically the same as playing Minecraft, from your brain's perspective.

This also applies to another often-repeated claim about bilingualism: that it can delay the onset of dementia.

 (16:00) to (18:00)



Again, this idea seems reasonable at first glance, as other complex cognitive activities do seem to prevent or delay dementia.

But, a 2015 review of the literature found that the effects of bilingualism on dementia are very inconsistent. And that's not all. There were some suspicious patterns in the research methods.

You see, prospective studies—the ones that enroll people before they show symptoms and then test them as they age—tended not to show an effect of bilingualism. Positive results were mostly found in retrospective studies, which look at people after they've been diagnosed. Subjects in that kind of study may not be representative of the whole population, and it's harder to pick good controls.

That all suggests that the researchers might have been seeing what they wanted in the data, and having their judgement biased by their expectations.

So not only do studies on bilingualism have issues with publication bias, there may be straight-up bias in many of them. And this all means we really don't know if learning a second language can give you some kind of subtle cognitive advantage or keep your brain healthy as you age.

Still, we can say that learning a language does make you smarter. No matter what, you're going to know something you didn't know before.

So in that sense, of course it makes you smarter. And it's not going to hurt you, like they thought in the old days. Not only that, with your new fluency, you can experience whole new bodies of literature and arts, travel to interesting places, and talk to more people.

So yeah, being able to speak multiple languages has lots of benefits, even if it's not boosting your brain indirectly.

BRIT GARNER:

Although learning more than one language might not give you an edge brain-wise, you are still learning something that you didn't know before. And if you've got your heart set on learning another language, well, you didn't hear it from me, but having a drink might actually help you on your way. Sort of.

Here's Anthony again with more.

ANTHONY BROWN:

We all know that having a little bit to drink can make you a little loose lipped. Like, you're having a few beers while watching the Packers game with your buddies, and before you know it, you're screaming "Tom Brady is no Aaron Rodgers" at the top of your lungs. Whoops.

 (18:00) to (20:00)



Well, we all know that last statement is true. Even if some people don't want to accept it. And it turns out a bit of booze might actually improve your language skills more generally. That is, if you're speaking a foreign tongue.

Drunk people aren't exactly known for their language skills, since, you know, slurring doesn't count as articulate speech. But, strangely enough, studies do suggest that drinking might help you master the complexities of speaking that new language you've been trying to pick up.

For instance, take a study first published online in 2017 in the Journal of Psychopharmacology, which looked at the language abilities of 50 students in the Netherlands. All of the participants natively spoke German but were trying to learn Dutch, and they had to pass an exam showing they could speak the language to get into their course. As part of the study, they were asked to rate how good they thought their Dutch language skills were and to complete the Rosenberg Self-Esteem Scale, designed to measure, you guessed it, self-esteem.

Then, they either got a healthy glass of water, or a vodka with bitter lemon; enough booze to give them a blood alcohol concentration of around 0.04%. Once served, they had 10 minutes to finish the drink. Then 15 minutes later, when the alcohol had started to make its way into their bloodstream, the language test began. They were told to verbally argue for or against animal testing, in Dutch, for two minutes.

And it turned out that the group that drank spoke Dutch better. Now I know what you're thinking. But no, they didn't just think they did better because they were buzzed, like your friend thinks they're so much better at karaoke after 3 tequila shots. They rated their own language skills about the same as the sober group did, and the self-esteem scores weren't significantly different, either.

It was actually other, native Dutch speakers who said they spoke the language better, and in particular, those judges noted the tipsy participants sounded more natively Dutch. And though there isn't a ton of work on this topic specifically, other research does seem to confirm this idea that pronunciation of a foreign language improves with a little buzz, or, at least, it doesn't get worse, like pronunciation in your native tongue does.

 (20:00) to (22:00)



That might sound kind of ridiculous, but when we think about some of the relaxing qualities of alcohol, it kinda makes sense. You see, you went off about Aaron Rodgers being better than Tom Brady after a few drinks (still no argument there) because booze acts as a multi-purpose wet blanket in the brain.

Specifically, it increases the effectiveness of GABA, a neurotransmitter that generally quiets the chatter between neurons. That includes the neurons you need to activate to stop yourself from saying something you'll regret. But such signal dampening can also lessen anxiety. For example, a 2008 study found that self-reported ratings of intoxication negatively correlated with the activation of brain areas associated with fear response; namely, the limbic system.

Many language learners suffer from foreign language anxiety, which is a feeling of tension or apprehension associated with second language contexts. Basically, they think they're terrible, so they get discouraged and subsequently do worse. In severe cases, they might even shy away from speaking the language entirely.

And it seems like a bit of booze can dull your fears about making mistakes, which might mean you make fewer of them. Or, there might be some other, as-yet-unidentified reason drinking helps you sound more natural. But that will take more research.

Alcohol might be able to help you learn the language, too. Of course, conventional wisdom would say drinking isn't great for learning for, well, a lot of reasons. But perhaps the biggest is its reputation for impairing memory.

I mean, there are whole Hollywood blockbusters with plots centered around binge drinking-induced blackouts. And research has shown that even at lower levels, alcohol can mess with memory storage. But that's not the whole story.

Some research suggests that a drink might actually help you remember things; specifically, things you did before you started to drink. A surprising study published in 2017 found that people who drank to their hearts' content after a memorizing task remembered more than those who stayed sober.

 (22:00) to (24:00)



And that seemed to be because of alcohol's memory-tampering abilities, or what psychologists refer to as a period of reduced memory encoding.

The basic gist is that because you remember less when you're wasted, your brain does an extra good job locking in the stuff that happened before alcohol started interfering with your memory. Which is basically an argument for following your study sessions with a couple of well deserved beers.

But, it's important to note that even if a little drinking can improve your pronunciation or even help you remember what was on those flashcards, this is definitely not a case where more is better. These studies all used pretty low doses of alcohol. So, if you're hoping to get a linguistic boost, just a drink or two will do. Besides, you should always drink responsibly!

BRIT GARNER:
But where in our brains does all this language learning actually take place? Well, about 150 years ago, scientists thought it was in two specific parts of the brain. But it turns out they might not have gotten the whole picture. Let Hank give you the rundown.

HANK GREEN:
Scientists have been studying human language for centuries, and for good reason. Our language abilities seem to far exceed those of any other living thing, so understanding how we communicate is essential to understanding what makes us so powerful a species.

Plus, there are lots of people who have specific disabilities with language—what doctors call aphasias. And understanding the seat of language in the brain could help doctors better treat those conditions.

Of course, if you cracked open a Psych 101 textbook before this decade, you might think we already know what that seat is. About 150 years ago, scientists identified Broca's and Wernicke's areas: two main language centers in the brain. They're often described as the regions responsible for language production and comprehension, respectively.

But at best, that glosses over what these parts actually do. And lingering misconceptions about them not only impair our understanding of how our brains work, they also hamper our ability to effectively treat people who struggle to communicate.

 (24:00) to (26:00)



Way back in the 1850s, one of the biggest debates in psychology was whether the brain had specialized regions. Researchers had recently shown that there was some degree of this—like, that the brainstem did different stuff than the rest of the brain. But many thought that within the larger regions, all of the tissue was equally important for everything—kind of like how there isn't just one part of your liver that breaks down alcohol.

The research of Pierre Paul Broca and Carl Wernicke played a huge part in changing that. It all started in 1861 when Broca, a French physician, met a patient named Louis Victor Leborgne. Though, everyone called him "Tan" because that was the only word he could say.

Before he was 30, he communicated normally. Then, seemingly out of the blue, it was just "tan." He'd usually say it twice, "Tan tan." So he was kind of a real life Hodor. Though probably, his speech impediment wasn't from some messed-up magical thing.

And what Broca found really interesting was that it seemed like Tan could still understand what other people said and retained other intellectual abilities. So when Tan died, Broca performed an autopsy. And he noted that a small part of Tan's brain—a spot in front of the left ear—was damaged. A few months later, he had another patient with similar language issues and similar brain damage.

That led Broca to conclude that this region was the part of the brain responsible for producing speech. He called it the center for articulated language, but it became known as Broca's area.

Then, about a decade later, Carl Wernicke seemed to discover the part of the brain responsible for comprehending language. His patients also struggled to communicate, but in a very different way—they basically spoke gibberish and couldn't parse spoken or written language, and they did not have damage to Broca's area.

Instead, their troubles appeared to be caused by damage to a region a bit further back, where the temporal lobe meets the parietal lobe—what we now call Wernicke's area.

 (26:00) to (28:00)



Broca and Wernicke's findings together cemented the idea that there were specific regions for certain brain functions. And over the past century, the research of these two pioneers has formed the foundation of the neuroscience of language. Even today, it's often said that Broca's area is where language is produced, while Wernicke's area is where language is understood. But that's just kinda wrong.

Broca's area doesn't seem to play a big role in physically producing speech. Broca got that idea because his patients had such severe language deficits without appearing to lose the ability to understand what was said to them. But it turns out he missed something important when he examined their brains, because he didn't dissect them to look deeper.

Luckily, though, he did preserve those brains, so researchers were able to re-examine them with high-resolution MRI scans in 2007. That's awesome!

What those scans revealed was a lot more damage, and it was likely the damage to areas outside of Broca's area that caused their severe and lasting speech deficits. More recent research on Broca's area suggests that it's specialized for one specific part of language, namely syntax: how words are arranged to form coherent phrases and sentences.

In healthy brains, Broca's area seems to monitor for syntax errors, like it lights up when people read sentences with poor grammar but not ones with spelling mistakes. Damage to that area alone doesn't seem to permanently impair the ability to speak words. People with such damage can generally say a few of the words that they're aiming for, they just struggle to put together a full complex sentence.

Also, because Broca's area is involved in syntax, it is involved in language comprehension. For example, studies have found that stimulating it electrically, and therefore throwing off the natural firing of neurons, can make it harder for people to understand complex verbal instructions, or to work out what to do when what they're told clashes with written instructions.


 (28:00) to (30:00)



Similarly, it turns out Wernicke's area isn't the be-all and end-all of language comprehension. Wernicke didn't actually make that claim, mind you. He proposed that it was specialized for the sounds of language, and that there was another region where concepts were processed. And that's actually not far off.

See, people with damage to Wernicke's area generally string together nonsense words, and they expect everyone to understand them. And that's probably because they mix up sounds. Now unless you're a professional orator, you've probably done this at some point. Like I've probably done this while recording this episode, like "can you mask me the pill?"

But you also probably caught yourself as you said it and realized the wrong thing came out. When this happens my son points at me and he laughs and he says, "You said the wrong word!" 

That same error-catching ability also runs as a kind of simulator just before you speak, planning the sounds so they come out right. And that's what Wernicke's area seems to actually do: it processes the sounds of words, whether the person is listening, reading, or speaking.

That's why, unlike Broca's area, Wernicke's area becomes active when you read spelling errors, and people with damage there often swap in the wrong sounds or even whole words when speaking. Though they can't tell they're doing it, so they only get upset once other people seem confused.

It's not that they can't understand language; in fact, the ability to understand the meaning of words seems to have little to do with Wernicke's area. Instead, issues with word comprehension are tied to damage that occurs in the front of the temporal lobe and in both hemispheres.

But looking back, of course, it's hard to blame people for thinking that Broca's and Wernicke's areas were the seats of language in the brain. They didn't have the techniques we do now for examining brains in living people, and connecting specific aphasia symptoms to brain damage is tough because patients tend to have damage to multiple areas and present with a number of overlapping symptoms.

Now scientists are putting together a more accurate map of the parts of the brain involved in language.

 (30:00) to (31:05)



There are still some open questions about how these brain regions work, and how they work with each other and other parts of the brain, but we do know that there's way more to language than just these two areas.

The upside to all this is that even when there's damage to these so-called "language centers", there still may be therapies that can help. Ultimately, the more we understand about the neuroscience of language, the closer we will get to effectively treating all sorts of aphasias.

BRIT GARNER:
I mean, languages can be pretty complex, so it's no surprise that more parts of the brain are involved in that process. Which goes to show how impressive that gray matter inside our heads really is, and we have episodes showcasing just that.

We have episodes on hacking our brain to make food taste better and how babies might actually change your brain, and a bunch of others if you're looking for something to watch next. And if you'd like to help us keep making those videos, you can head over to patreon.com/scishowpsych

[OUTRO MUSIC]