crashcourse
The Computer and Turing: Crash Course History of Science #36
YouTube: | https://youtube.com/watch?v=3xdmEwTIsd0 |
Previous: | The Birth of Off Broadway: Crash Course Theater #47 |
Next: | Evaluating Evidence: Crash Course Navigating Digital Information #6 |
Categories
Statistics
View count: | 202,424 |
Likes: | 5,368 |
Comments: | 241 |
Duration: | 11:54 |
Uploaded: | 2019-02-11 |
Last sync: | 2024-10-06 17:00 |
Citation
Citation formatting is not guaranteed to be accurate.
MLA Full: | "The Computer and Turing: Crash Course History of Science #36." YouTube, uploaded by CrashCourse, 11 February 2019, www.youtube.com/watch?v=3xdmEwTIsd0. |
MLA Inline: | (CrashCourse, 2019) |
APA Full: | CrashCourse. (2019, February 11). The Computer and Turing: Crash Course History of Science #36 [Video]. YouTube. https://youtube.com/watch?v=3xdmEwTIsd0 |
APA Inline: | (CrashCourse, 2019) |
Chicago Full: | CrashCourse, "The Computer and Turing: Crash Course History of Science #36," February 11, 2019, YouTube, 11:54, https://youtube.com/watch?v=3xdmEwTIsd0. |
Computers and computing have changed a lot over the history of science, but ESPECIALLY over the last 100 years. In this episode of Crash Course History of Science, we take a look at that history around World War Two and how that conflict forced changes in computing.
***
Crash Course is on Patreon! You can support us directly by signing up at http://www.patreon.com/crashcourse
Thanks to the following Patrons for their generous monthly contributions that help keep Crash Course free for everyone forever:
Eric Prestemon, Sam Buck, Mark Brouwer, Bob Doye, Jennifer Killen, Naman Goel, Patrick Wiener II, Nathan Catchings, Efrain R. Pedroza, Brandon Westmoreland, dorsey, Indika Siriwardena, James Hughes, Kenneth F Penttinen, Trevin Beattie, Satya Ridhima Parvathaneni, Erika & Alexa Saur, Glenn Elliott, Justin Zingsheim, Jessica Wode, Kathrin Benoit, Tom Trval, Jason Saslow, Nathan Taylor, Brian Thomas Gossett, Khaled El Shalakany, SR Foxley, Sam Ferguson, Yasenia Cruz, Eric Koslow, Caleb Weeks, Tim Curwick, D.A. Noe, Shawn Arnold, Malcolm Callis, Advait Shinde, William McGraw, Andrei Krishkevich, Rachel Bright, Jirat, Ian Dundore
--
Want to find Crash Course elsewhere on the internet?
Facebook - http://www.facebook.com/YouTubeCrashCourse
Twitter - http://www.twitter.com/TheCrashCourse
Tumblr - http://thecrashcourse.tumblr.com
Support Crash Course on Patreon: http://patreon.com/crashcourse
CC Kids: http://www.youtube.com/crashcoursekids
The history of computer science is heckin' cool. It features the upending of basic questions, like: what is information? It has biographical oomph; only this story's war-hero scientist, Alan Turing, was punished, not celebrated, after helping the Allies win World War Two. And the history of computer science raises profound questions about technology and society, like: how do we know that our big complex beautiful brains aren't really just big complex... computing machines? And, if we can one day build machines that think as fast as humans, will we have to grant them human rights?
[Intro music plays]
Questions about thinking machines are relatively recent in history. But all kinds of doing machines are not, and some of this doing involves solving mathematical problems and other high-level functions. Some time before 60 BCE, the Greeks constructed an analog computer now called the Antikythera mechanism. Using many gears, the mechanism may have been used to predict eclipses or other astronomical events. But the mechanism appears to have been a one-off. So historians often give credit for the first mechanical computer to the Artuqid-Turkman engineer, Al-Jazarī, who died in CE 1206. We met him way back in episode seven when dude built a robotic musical band and a robot toilet helper! And, Al-Jazarī built an astronomical clock that showed the signs of the zodiac and could be reprogrammed to compensate for changing lengths of the day.
Then, in 1642, French mathematician Blaise Pascal invented a mechanical adding machine that used a collection of rotating numbered wheels, similar to a car's odometer. Our friend from episode seventeen, German mathematician Gottfried Leibniz, built commercial mechanical calculators in the late 1600s. And in 1801, in the early days of the Industrial Revolution, French merchant Joseph Marie Jacquard incorporated the punch card into a textile loom to control patterns--arguably the first industrial use of computing!
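To make the wheel-and-carry idea concrete, here's a minimal Python sketch of odometer-style addition, assuming a little three-wheel machine; the function and the wheel layout are ours for illustration, not a model of Pascal's actual gearing.

```python
# Odometer-style addition: each wheel holds one decimal digit, and a full
# turn of a wheel carries into the next one. (Illustrative sketch only.)

def add_on_wheels(wheels, amount):
    """wheels[0] is the ones wheel; carrying ripples into the higher wheels."""
    wheels = wheels[:]                  # copy, so the original "machine" is untouched
    carry, position = amount, 0
    while carry and position < len(wheels):
        total = wheels[position] + carry
        wheels[position] = total % 10   # where this wheel ends up
        carry = total // 10             # full turns spill into the next wheel
        position += 1
    return wheels                       # any leftover carry falls off the top wheel

wheels = [7, 9, 0]                      # reads 097
print(add_on_wheels(wheels, 5))         # [2, 0, 1] -> reads 102
```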
But devices like calculators and looms are pretty far from the computers we rely on today. So then the question becomes, like... what is a computer? Well, that word has changed a lot over the years. In fact, up until the 1950s, a "computer" was a person who computes--usually a woman. The basic idea today is that a "computer" is a machine that can be programmed to perform tasks--like math problems--automatically.
For many historians, the dream of a somewhat recognizable modern computer that can be programmed to perform all sorts of calculations without continuous human number-punching dates back to 1837. That's when British mathematician Charles Babbage fully conceived a digital, programmable--but mechanical--computer called the Analytical Engine. This was a general purpose information processor: it wasn't just for a single task, but for solving general logic problems.
Babbage started working on it, but never finished due to cost overruns and fights with his machinist. But we have his notes and those of his chronicler, British mathematician Ada Lovelace, who wrote the first algorithm intended for processing by a computer--basically, the first computer program!--in 1843. Fun fact: Lovelace was the daughter of Romantic poet Lord Byron!
Another early computer was actually made and put into use in the United States. A young mathematician-inventor named Herman Hollerith combined the old technology of punch cards with the new technology of electrical circuits to produce a sorting and tabulating machine. With his machine, the 1890 census was finished in weeks instead of years. Hollerith went on to found the Tabulating Machine Company. And it's still in business today--as the International Business Machines Corporation, or IBM.
But neither Babbage and Lovelace's way-ahead-of-their-time designs nor Hollerith's super-sorter established computing as a science. Some important developments happened in the years before World War Two. For example, starting in the late 1920s, influential American engineer Vannevar Bush created an analog computer called a differential analyzer, which could solve calculus problems with as many as eighteen different variables.
But the war shoved computer science into the scientific limelight. In the 1930s, British mathematician, linguist, cryptographer, philosopher, and all-around smarty pants Alan Turing laid the foundation for a mathematical science of computing. ThoughtBubble, introduce us:
Turing proposed the aptly named Turing machine--a thought experiment to figure out the limitations of mechanical computation. A Turing machine can theoretically perform an algorithm, or a programmed operation. It's a universal computer. Turing couldn't actually build such an abstract, perfect computer, but he could lay out how the logic of writing and reading programs should work, and how a relatively simple device could, given enough memory, accomplish any logical operation.
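Here is a minimal sketch, in Python, of the kind of device Turing described: a tape, a read/write head, and a table of rules. The specific states and rules below are invented for this demo (they just flip every bit on the tape); the point is that the same simple machinery can run any rule table you hand it.

```python
# A tiny Turing machine simulator: a tape of symbols, a head position,
# a current state, and a rule table mapping (state, symbol) -> action.

def run_turing_machine(tape, rules, state="start", blank=" "):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)           # the tape grows as needed
        head += 1 if move == "R" else -1
    return "".join(tape)

# Rules: (state, symbol read) -> (symbol to write, move L/R, next state).
# This particular table just inverts a string of bits, then halts.
invert_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run_turing_machine("10110", invert_bits))   # "01001" plus a trailing blank
```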
During the war, Turing went to work in the super-secret "Ultra" program at Bletchley Park, which was an estate for British codebreakers. Turing wasn't the only computer innovator at Bletchley. For one thing, eight thousand women worked there! Also, an engineer named Tommy Flowers designed some for-the-time hyper-advanced computers called the Colossus series, which also helped the Allies a lot. And they were kept secret until the 1970s! But Turing's job, leading Ultra Hut Number Eight, was to decipher encrypted messages about German naval movements.
The Germans used a device called an Enigma machine to create supposedly unbreakable ciphers, or ways of encoding messages so that only someone with the same cipher could read the message. But Turing broke through, using a computer he built called the bombe, based on an earlier Polish machine. These wartime computers weren't super fast or sophisticated. They were smart ways of automating a lot of dumb tasks. Thanks, ThoughtBubble.
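For a feel of what a cipher is, here's a deliberately tiny, toy rotor-style substitution in Python. It is nothing like the real Enigma's rotors, reflector, and plugboard; it only shows the core idea that sender and receiver need the same settings to read the message.

```python
# A toy "stepping" substitution cipher: shift each letter, and advance the
# shift after every letter, loosely echoing a rotor that steps as you type.
import string

ALPHABET = string.ascii_uppercase

def rotor_encipher(message, shift):
    out = []
    for ch in message.upper():
        if ch in ALPHABET:
            out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
            shift += 1                      # the "rotor" steps forward
        else:
            out.append(ch)                  # leave spaces/punctuation alone
    return "".join(out)

def rotor_decipher(ciphertext, shift):
    out = []
    for ch in ciphertext.upper():
        if ch in ALPHABET:
            out.append(ALPHABET[(ALPHABET.index(ch) - shift) % 26])
            shift += 1                      # step in lockstep with the sender
        else:
            out.append(ch)
    return "".join(out)

secret = rotor_encipher("ATTACK AT DAWN", shift=3)
print(secret)                               # enciphered gibberish
print(rotor_decipher(secret, shift=3))      # "ATTACK AT DAWN" -- same settings needed
```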
After the war, Turing kept working on computers. His 1948 essay "Intelligent Machinery" gave more details on the Turing machine. Then, in 1950, he published "Computing Machinery and Intelligence" in the journal Mind. Go read it, it holds up! Basically, this article became a foundational text in artificial intelligence, or AI. Turing famously stated that the appearance of intelligence is proof of it. Turing arrived at this idea by thinking about a limit case: consider a computer that appears truly intelligent, like a human. How would we know whether or not it is intelligent? Turing proposed a game to test the computer: talk to it like a person! This game is called the Turing Test and is still used as a challenge in AI: a human asks questions of both a computer and another human, through a terminal, and tries to guess which is which from their responses. The Turing Test was based on an old party game, in which you did the same thing via written notes and tried to guess which of two unknown people was a man and which was a woman.
The Turing Test led to the Church-Turing Hypothesis: computation power is computation power. It doesn't matter if that power comes from electrical circuitry or a human brain, or how fast the individual parts of the machine are. So any machine of sufficient power should be able to do any computation that a brain can do. So... a sufficiently complex machine would be as intelligent as a brain--or more. The only limit to computational power is memory. But in real life no computer--whether brain or series of electrical circuits--has infinite memory. Even more ahead of his time, in his 1950 paper, Turing suggested that--instead of trying to straight-up build a computer as intelligent as an adult human--it would be smarter to build a child's mind and then teach it how to learn on its own. BAM, machine learning!
So what recognition did Turing get for all of his hard work? In 1952, in the course of a police investigation of a burglary at his home, officials became aware that he was in a relationship with another man, and the British government pressed charges. Turing was convicted of "gross indecency" and sentenced to take libido-lowering hormones. He died in 1954, possibly of suicide by cyanide-poisoned apple, possibly by inhalation of cyanide while working. Either way, one of the greatest minds to ever live died at age forty-one. He was not officially pardoned until 2013.
But before Turing died, he met with some important folks in the United States... Hungarian-American physicist John von Neumann met Turing in the 1930s and worked on foundational aspects of computer science and AI. Von Neumann proposed the idea of storing a computer program in the computer's memory, so instructions could be stored externally, instead of having to be fixed permanently in a given machine. Turing also met with an American mathematician named Claude Shannon during the 1930s, sharing his ideas about the Turing Machine. Shannon invented the word "bit" and founded digital computing and circuit design theory while still a graduate student at MIT, and he conducted some Turing-like cryptography work during World War Two.
But he's best known for publishing a series of papers after the war that founded information theory, which examines how information is encoded, stored, and communicated. We could do a whole episode on information theory, but some of the effects of Shannon's work were to help transition computers, televisions, and other systems for moving information around from analog to digital. And information theory led to the internet! And over at Harvard, American physicist Howard Aiken worked with the military and IBM to design and build a computer, the Harvard Mark I, in 1944. This device was used by von Neumann to run a program to help design the atomic bomb.
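Coming back to Shannon's "bit": information theory measures how much information a source carries in bits per symbol, using the entropy formula H = -sum(p * log2(p)) over the symbol probabilities p. The formula is Shannon's; the short Python sketch and the demo strings below are just ours.

```python
# Shannon entropy: the average number of bits per symbol needed to encode a source.
from collections import Counter
from math import log2

def entropy_bits_per_symbol(text):
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(entropy_bits_per_symbol("AAAAAAAA"))   # 0.0 -> perfectly predictable, no information
print(entropy_bits_per_symbol("ABABABAB"))   # 1.0 -> one bit per symbol
print(entropy_bits_per_symbol("ABCDABCD"))   # 2.0 -> two bits per symbol
```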
One of the other first programmers of the Mark I was American computer scientist and rear admiral Grace Hopper, who invented one of the first compiler tools to translate programming language into machine code. She then worked on machine-independent programming languages, developing the early programming language COBOL. Computers after World War Two quickly became bigger, faster, and more complex--like the US Army-sponsored Electronic Numerical Integrator and Computer, or ENIAC, in 1946, which filled up a large room, and UNIVAC in 1951, which was commercially mass-produced. These general-purpose computers were based on the principles laid out by theorists like Turing, von Neumann, and Shannon, and they used the languages developed by programmers like Hopper. These computers were built using a digital code--binary, with values of only "one" or "zero."
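To get a feel for what a compiler does in Hopper's sense--translating human-readable source into low-level instructions--here's a toy Python sketch. It compiles a tiny arithmetic expression, evaluated strictly left to right, into instructions for an imaginary stack machine; nothing here resembles her actual compiler or COBOL, and the instruction names are made up.

```python
# A toy "compiler" and "machine": source text in, low-level instructions out.

def compile_expr(source):
    """Compile '<num> <op> <num> <op> ...' (ops: + and *), left to right."""
    tokens = source.split()
    code = [("PUSH", int(tokens[0]))]
    for i in range(1, len(tokens), 2):
        op, operand = tokens[i], int(tokens[i + 1])
        code.append(("PUSH", operand))
        code.append(("ADD",) if op == "+" else ("MUL",))
    return code

def run(code):
    """A tiny stack machine that executes the compiled instructions."""
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if instr[0] == "ADD" else a * b)
    return stack[0]

program = compile_expr("2 + 3 * 4")
print(program)      # [('PUSH', 2), ('PUSH', 3), ('ADD',), ('PUSH', 4), ('MUL',)]
print(run(program)) # 20 -- left-to-right, no operator precedence
```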
And real-world computing really took off after the 1947 invention of the solid-state transistor at Bell Laboratories--by John Bardeen, Walter Brattain, and William Shockley--and the arrival of room-filling "mainframe" computers for businesses. In a later episode, we'll get back to computers--and introduce one of our very best friends in the history of technology, the Internet. But for now, let's remember that, up until the 1950s, a computer was a person, usually a woman, who was a number cruncher--that is, someone who computes, using a machine.
One of those "computers" who became an engineer who used a computer was African-American rocket scientist Annie Easley. In the era of Jim Crow laws, Easley left Alabama and went to work for NASA in Ohio. She developed computer code for NASA missions for decades. Next time--humans finally get to play golf on the moon. It's the birth of air and space travel!
Crash Course History of Science is filmed in the Dr. Cheryl C. Kinney studio in Missoula, Montana and it's made with the help of all these nice people. And our animation team is Thought Cafe. Crash Course is a Complexly production. If you wanna keep imagining the world complexly with us, you can check out some of our other channels like SciShow, Eons, and Sexplanations. And, if you'd like to keep Crash Course free for everybody, forever, you can support the series at Patreon, a crowdfunding platform that allows you to support the content you love. Thank you to all of our patrons for making Crash Course possible with their continued support.