YouTube: https://youtube.com/watch?v=dRiuhK4a9xE
Previous: Why You Think You Look Better in Selfies
Next: (None)

Categories

Statistics

View count: 896
Likes: 128
Dislikes: 1
Comments: 8
Duration: 31:05
Uploaded: 2021-06-10
Last sync: 2021-06-10 19:30
You have a lot of choices if you’re looking to learn a new language, from Spanish to coding, or even whistling! And there are some broad similarities and patterns in the ways our brains process these different forms of communication.

Hosted by: Brit Garner

Original Episodes:
Is Coding a Math Skill or a Language Skill? Neither? Both?
https://youtu.be/xPecMsFmEm4
What Whistled Speech Tells Us About How the Brain Interprets Language
https://youtu.be/7gvfDB5_cz4
Will Learning Another Language Make You Smarter?
https://youtu.be/wf_M5l2y3sM
Want to Speak a Foreign Language Better? Have a Drink
https://youtu.be/mDxSU9LJxOc
What We Often Get Wrong About the Brain's "Language Centers"
https://youtu.be/kjwIgKa560s
----------
Support SciShow Psych by becoming a patron on Patreon: https://www.patreon.com/SciShowPsych

SciShow has a spinoff podcast! It's called SciShow Tangents. Check it out at https://www.scishowtangents.org
----------
Become a Patron and have your name featured in the description of every SciShow Psych episode! https://www.patreon.com/SciShowPsych
----------
Looking for SciShow elsewhere on the internet?
Facebook: http://www.facebook.com/scishow
Twitter: http://www.twitter.com/scishow
Tumblr: http://scishow.tumblr.com
Instagram: http://instagram.com/thescishow
----------
Images:
https://www.istockphoto.com/photo/woman-talking-letters-in-head-coming-out-of-open-mouth-gm486962714-73857403
https://www.storyblocks.com/video/stock/computer-hacking-in-process-cyber-security-concept-b-rnsuel-j2n7ypg4
https://www.storyblocks.com/video/stock/medium-long-shot-of-teacher-explaining-to-pupils-in-a-chinese-language-class-budfdclbsjy0qcgg3

 (00:00) to (02:00)


[INTRO]

BRIT GARNER:
You have a lot of choices if you're looking to learn a new language, from Spanish to coding to even whistling. See, there are some broad similarities and patterns in the ways our brains process these different forms of communication. Let's have a look.

We'll kick things off with computer code, which sounds more like math than a traditional language. But since many folks might end up learning to code at some point, researchers are interested in understanding exactly how our brains interpret it.

Does it see code as a language? And can that help us learn to code more effectively? Well, let me explain.

Scientists generally understand what parts of the brain are involved in talking, writing, and solving math problems. But when it comes to reading and writing computer code, the brain is more of a mystery.

Programming languages like Python and C++ are kind of like languages because they contain words and abbreviations that follow grammar-like rules. But they're also full of symbols and variables, which might make them seem more similar to algebra.
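To make that concrete, here's a minimal Python sketch (our own illustration, not something from the episode) where both faces show up in a few lines: English-like keywords arranged by syntax rules, and algebra-like symbols and variables.

    # English-like keywords ("def", "if", "return") follow grammar-like rules,
    # while x, **, and > read more like algebra.
    def describe(x):
        if x ** 2 > 100:
            return "big"
        return "small"

    print(describe(12))  # -> big   (144 > 100)
    print(describe(3))   # -> small (9 is not > 100)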

Many people have wondered: Does the brain process code as a language or as math?

The question may seem abstract, and the answer isn't black and white. But understanding how the brain interprets code can help us teach it better. And it can also give us unique insights into how our brains work.

High-level programming languages like Python and C++ provide instructions to computers in a way that both computers and people can understand. They were developed as a happy medium between human language, which is hard for computers to process, and computers' native language of binary code, which is really hard for people to comprehend.
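One way to glimpse that layering for yourself (an illustrative sketch, not something from the episode): Python's standard dis module prints the lower-level instructions the interpreter actually runs. That's bytecode rather than raw binary, but the gap between a readable line of code and machine-oriented instructions is already visible.

    import dis

    def add(a, b):
        return a + b

    # Prints the interpreter-level instruction stream behind this
    # one readable line of Python.
    dis.dis(add)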

And understanding how we understand programming has the potential to reveal how the brain learns novel complex skills.

Humans didn't evolve with special neural regions dedicated to computer programming. After all, C++, Java, and Python didn't even exist 50 years ago.

But now, many people are fluent in coding languages. So studying code cognition can give us clues about how the brain repurposes its circuitry to process weird, novel things it's never seen over the course of evolution, which is really cool.

 (02:00) to (04:00)


Plus, it's practical. We may be headed for a future where lots of people need to learn code—at least to some extent. So it's essential to discover how we can teach coding effectively.

But to accomplish that, we have to figure out how our brains interpret code. And here, there have been two main schools of thought.

One hypothesis is that your brain categorizes code like it would Swahili or Spanish. That is, it activates the language system: a network of regions involved in linguistic tasks.

The other idea is that your brain co-opts the neural circuitry we use for some kinds of math, including symbolic math like algebra.

The math-centric parts of the brain are located in the multiple demand system, a larger network of regions involved in all sorts of logic and problem solving tasks.

Researchers have been debating all of this for a while, without a clear answer. Then, in a 2020 study, neuroscientists from the Massachusetts Institute of Technology investigated the question using brain imaging.

They recruited 43 proficient coders, put them in an fMRI machine, and had them solve problems shown either as a sentence or using snippets of code.

About half the participants were presented with Python, a popular text-based programming language. The others used ScratchJr, a visual programming language designed for kids, which was expected to have less effect on the language system.
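To picture the paradigm, here's a hypothetical stimulus pair in the spirit of that design (our illustration; the study's actual materials may have differed). The sentence version might read, "Take the list 3, 1, 4 and add one to each number." The code version would express the same problem in Python:

    numbers = [3, 1, 4]
    # Add one to each element, as the sentence version instructs
    result = [n + 1 for n in numbers]
    print(result)  # [4, 2, 5]

The idea is that participants work out the answer either way while the scanner records which brain networks light up.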

And based on the fMRI results, it turns out that neither the language nor the math camps have the whole story right.

The study found that when most participants read and built their code, the language processing area of their brains wasn't tapped much at all. And the multiple demand system was strongly activated, but not in the regions you'd expect for a purely math-related task.

See, many math tasks involve certain regions of the left hemisphere—at least, for right-handed people, like those in the study. But coding recruited large parts of the multiple demand system in both the right and left hemispheres that are involved in logic and all kinds of complex cognitive tasks.

Essentially, when we read and comprehend code, we use the same parts of the brain we use when we think really hard about solving a challenging problem, not a basic math exercise.

This means your brain doesn't see coding as strictly language or math.

 (04:00) to (06:00)


It’s an uncategorized thing that we process in a unique way, which is pretty neat.

But it makes life more difficult for instructors teaching coding. Because if brains saw code just as words, we could teach it like we teach foreign languages. And if brains saw code just as math, we could teach it like we teach algebra or trigonometry.

But since it interprets code as neither of those, computer science educators will have to get creative and teach coding like it’s something totally new. Which, of course, it is.

Except it gets more complicated than that because it might also be a matter of timing.

The MIT researchers hypothesize that people taught coding as kids might use their language system more compared to adults. That’s because in kids, the language system is still developing, and what their brain sees as language isn’t set in stone yet.

In contrast, the multiple demand system may remain more flexible throughout a person’s life, so it might be more activated when adults learn code.

That means that when we teach coding might make a big difference in how we should teach it.

In addition, learning coding might actually rewire the brain. When we repeatedly perform an action, whether it’s reading books or playing the guitar, the brain forms new and stronger connections between the neurons involved.

The MIT researchers think that people who’ve been coding for decades could form specialized regions in their multiple demand system that are dedicated to Python, or C++, or whatever programming languages we invent in the future. That means we have more to learn if we want to harness that flexibility when we teach.

But it’s also a pretty amazing example of what our brains can do when we invent brand new situations for them to deal with.

But what happens when instead of a string of code, you just have sound? And I'm not talking about the sound of my beautiful voice. No, I'm really not. I'm not talking about that.

I'm talking about whistling. Which, fun fact, I actually can't do.

So, how does our brain tackle that one? Here's Anthony with more.

ANTHONY BROWN:
On every continent besides Antarctica, you can find groups of people who communicate by whistling. And not just like, to get someone's attention or say, "I'm over here!"

 (06:00) to (08:00)


These are full-blown languages.

For a long time, all sorts of researchers have been interested in this form of communication, which seems to blur the lines between language and music. And recently, research by neuroscientists has shown that the way language works in the brain is not exactly what we thought.

Although whistled languages emerged separately all over the world, they tend to have some key features in common. Like, they usually exist among groups of people living in spread-out communities who need to communicate over long distances.

Because conveniently, the high pitch and volume of a whistle can travel a lot farther than regular speech—up to 10 kilometers away, in some cases.

And not only do whistles travel farther, they also stand out more against the sounds of rivers, wind, and animal calls, so they can be a great communication tool in rugged environments. Like, in the mountains of Oaxaca, Mexico, police actually used their whistled language for years before they switched to walkie-talkies.

But these languages don't just replace speech. In many cases, whistled languages are adaptations of the local spoken language.

And that's one of the things that makes them so interesting to neuroscientists. Because, oddly, even though they're two versions of the same language, the brain doesn't process them the same way.

Two whistled languages that have been studied in depth are Silbo, which is based on Spanish and found in the Canary Islands, and a Turkish-based language found in the mountains of Turkey.

Here's a sample of Silbo [sample of Silbo plays]. And here's a sample of the Turkish-based language [sample of Turkish-based whistled language plays].

They both imitate the intonations and stresses of speech, which can easily be reproduced in whistles. Meanwhile, the things that can't be imitated, like vowels and consonants, get translated into cues like rising and falling pitches or different tone lengths.
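As a toy illustration of that kind of translation (ours alone, not the actual rules of Silbo or the Turkish whistled language), you could imagine mapping vowels to pitch levels and consonants to tone lengths:

    # Toy encoder: NOT a real whistled-language rule set, just a sketch
    # of turning speech sounds into pitch and duration cues.
    PITCH = {"a": "low", "e": "mid", "i": "high", "o": "low", "u": "mid"}

    def encode(word):
        cues = []
        for ch in word.lower():
            if ch in PITCH:
                cues.append(PITCH[ch] + " pitch")  # vowels become pitch levels
            elif ch.isalpha():
                cues.append("short tone")          # consonants become tone lengths
        return cues

    print(encode("hola"))
    # ['short tone', 'low pitch', 'short tone', 'low pitch']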

And the fact that they're derived from speech means that these languages are just as rich and complex as spoken language, so speakers can have fluent, nuanced conversations in either one.

 (08:00) to (10:00)


But internally, the brain is actually doing something different depending on which language is being used.

In general, the left hemisphere of the brain does a lot of the heavy lifting when it comes to processing speech, so spoken language has traditionally been thought of as a left-brained skill.

But whistled languages paint a more complicated picture because they involve the right hemisphere, too.

Researchers learned this during a 2015 experiment that studied the comprehension of spoken and whistled language in the people of northeastern Turkey. They used a technique called dichotic listening, which is a way of identifying asymmetries in the brain, based on how people register sounds played simultaneously in each ear.

In general, sound signals that have to travel farther within the brain to reach the area where they get processed register slightly slower than others.

So even though the sounds are played simultaneously, there will be a small difference in the arrival time between the sound from one ear and the sound from the other. But since the signals come so close together, people typically only register the sound that arrives first.

It gets a little confusing here because the brain isn't wired how you might expect. Signals from the left side of your body often go to the right side of the brain, and vice versa. Including sounds.

So, when a dichotic listening task uses speech sounds, people usually hear what's played on the right side, since the right ear has more of a direct path to the language areas on the left side.
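Here's a tiny sketch of that "first arrival wins" logic, with made-up travel times purely for illustration; only the ordering matters, not the numbers.

    # Hypothetical arrival times (ms) at the left-hemisphere language areas.
    right_ear = 10  # contralateral, more direct path to the left hemisphere
    left_ear = 13   # longer route, crossing over via the right hemisphere

    # Listeners typically report only the sound that arrives first.
    print("right ear" if right_ear < left_ear else "left ear")  # right ear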

And that's exactly what the researchers found in this study too. As 31 whistle-speakers listened to sounds from spoken Turkish, they identified the ones from their right ear more frequently.

But that wasn't the case with whistled language. When the same participants listened to whistled speech, they identified the sounds from their left and right ears about equally.

And that result suggests that language processing may not be inherently asymmetrical.

 (10:00) to (12:00)


It's just that, by focusing only on one kind of language for so long—the spoken kind—we seem to have formed ideas about language processing that were overly simple.

In reality, the reason language has traditionally been labeled as left-brained may have more to do with what we're hearing than the way we're wired.

And it makes sense: The left side of the brain processes information that's received on relatively short time scales. And in spoken language, lots of cues come in on short time scales, like the rhythm of syllables or the transitions between sounds, so it's mostly processed on the left side.

But in whistled speech, a lot of that information gets coded into musical cues like pitch and melody, which have a longer cadence. And the left side of the brain doesn't deal with that—the right side does. That likely explains why whistle-speakers use both sides of their brain to process whistled language.

Not only does this tell us that language comprehension isn't strictly a left-brained activity, it highlights the fact that spoken language isn't just a string of syllables. It consists of all sorts of different cues, and certain parts of the brain may be more or less involved in parsing them, depending on what those cues are. It also highlights the incredibly diverse ways that we use language to transmit ideas between people.

BRIT GARNER:
Cool! So you could learn coding or whistling, but does learning them make you any smarter?

Well, there are some claims out there that knowing more than one language gives your brain a bit of a leg up. And there's even some research to back up the idea. But, as usual, the truth isn't that simple. Here's Hank with more.

HANK GREEN:
Parents were once warned against raising children to speak more than one language. It's bad for kids' cognitive development, they were told, and will result in bad grades and a lower IQ!

And that ridiculous claim is still sometimes repeated, especially here in the United States. But times have mostly changed.

Now, if you believe the headlines, being bilingual makes you smarter and more creative.

 (12:00) to (14:00)


And those headlines don't come from nowhere. There is research which suggests that bilingualism provides some specific cognitive advantages.

And you can hardly blame the press for covering these studies, because it's such an appealing idea—teach your child French and you get a better child! More creativity, multitasking, and academic performance in other subjects all for free!

But if that sounds too good to be true, that's because it is. You see, there are also studies that don't find an advantage. Those don't get the same excited coverage.

In this case, though, the media aren't really the ones to blame. When it comes to the effects of bilingualism on the brain, there's confusion and bias on the scientific side, too. And it all goes to show just how hard it can be to understand what really goes on in our heads.

Learning another language definitely has benefits that no one can argue with. Like, for example, you'll know another language! And it even makes sense that it could benefit your brain in other ways.

The main benefit is thought to be to executive functions—the processes that control complex cognitive tasks like attention, problem solving, planning, and so on.

And that hypothesis isn't unreasonable. It's thought that these processes are kind of like muscles: the more you use them, the better you get at them.

And research has found that all sorts of cognitively challenging activities improve executive functions. Like, playing video games can make you better at assessing risks and placing bets. And music training can improve your ability to focus on specific tasks.

Since juggling two or more languages in your brain is cognitively challenging in a lot of ways, it could have similar positive effects. Constantly switching between vocabularies could help you be a better multitasker, for example, if it made you generally better at quickly shifting your brain from one thing to another.

But more than one analysis of the research has found that the evidence for such benefits is weak and inconsistent.

For example, a 2015 review in the journal Cortex concluded that more than 80% of the tests conducted across four years of studies didn't show a bilingual advantage.

 (14:00) to (16:00)


 (16:00) to (18:00)


 (18:00) to (20:00)


 (20:00) to (22:00)


 (22:00) to (24:00)


 (24:00) to (26:00)


 (26:00) to (28:00)


 (28:00) to (30:00)


 (30:00) to (31:05)