
Hi, I'm Carrie Anne, and welcome to Crash Course Computer Science. Earlier in this series, we covered computing history from roughly the dawn of civilization up to the birth of electronic general purpose computers in the mid 1940s.

A lot of the material discussed over the past 23 episodes, like programming languages and compilers, algorithms and integrated circuits, floppy disks and operating systems, teletypes and screens, all emerged over roughly a 30-year period: from the mid 1940s, up to the mid 1970s.

This is the era of computing before companies like Apple and Microsoft existed, and long before anyone Tweeted, Googled, or Ubered. It was a formative period, setting the stage for personal computers, the world wide web, self driving cars, virtual reality, and many other topics we'll get to in the second half of this series.

Today, we're going to step back from circuits and algorithms, and review this influential period. We'll pay special attention to the historical backdrop of the Cold War, the space race, and the rise of globalization and consumerism.

[Theme Plays]

Pretty much immediately after World War II concluded in 1945, there was tension between the world's two new superpowers: the United States, and the USSR. The Cold War had begun, and with it, massive government spending on science and engineering. 

Computing, which had already demonstrated its value in wartime efforts like the Manhattan Project and breaking coded Nazi communications, was lavished with government funding. This funding enabled huge, ambitious computing projects to be undertaken, like ENIAC, EDVAC, ATLAS, and WHIRLWIND, all mentioned in previous episodes.

This spurred rapid advances that simply weren't possible in the commercial sector alone, where projects were generally expected to recoup development costs through sales. This began to change in the early 1950s, especially with Eckert and Mauchly's UNIVAC I, the first commercially successful computer. Unlike ENIAC or ATLAS, this wasn't just one single computer; it was a model of computer. In total, more than forty were built.

Most of these UNIVACs went to government offices or large companies, part of the growing military-industrial complex in the United States, with pockets deep enough to afford the cutting edge. Famously, a UNIVAC I built for the US Atomic Energy Commission was used by CBS to predict the results of the 1952 US Presidential Election. With just one percent of the vote counted, the computer correctly predicted an Eisenhower landslide, while pundits favoured Stevenson. It was a media event that helped propel computing to the forefront of the public's imagination.

Computing was unlike machines of the past, which generally augmented human physical abilities. Trucks allowed us to carry more, automatic looms wove faster, machine saws were more precise, and so on for a bunch of contraptions that typified the industrial revolution. Computers, on the other hand, could augment human intellect.

This potential wasn't lost on Vannevar Bush who in 1945 published an article on the hypothetical computing device he called the MEMEX. This was "a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement for his memory." He also predicted that "wholly new forms of encyclopedia will appear, ready-made with a mesh of associative trails running through them". Sound familiar?

MEMEX directly inspired several subsequent game-changing systems, like Ivan Sutherland's Sketchpad, which we discussed last episode, and Doug Engelbart's oN-Line System, which we will cover soon.

Vannevar Bush was the head of the US Office of Scientific Research and Development, which was responsible for funding and coordinating scientific research during World War II. With the Cold War brewing, Bush lobbied for the creation of a peacetime equivalent, the National Science Foundation, formed in 1950. To this day, the NSF provides federal funding for scientific research in the United States, and it is a major reason the US has continued to be a leader in the technology sector.

It was also in the 1950s that consumers started to buy transistor-powered gadgets. Notable among them was the transistor radio, which was small, durable, battery-powered, and portable, unlike the vacuum-tube-based radio sets from the 1940s and before. It was a runaway success; the Furby or iPhone of its day.

The Japanese government, looking for industrial opportunities to bolster their post-war economy, soon got in on the action, licensing the rights to transistors from Bell Labs in 1952, helping launch the Japanese semiconductor and electronics industry.

In 1955 the first Sony product was released, the TR-55 transistor radio. Concentrating on quality and price, Japanese companies captured half the US market for portable radios in just five years. This planted the first seeds of a major industrial rivalry in the decades to come.

In 1953, there were only around 100 computers on the entire planet, and at this point the USSR was only a few years behind the West in computing technology, having completed their first programmable computer in 1950. But the Soviets were way ahead in the burgeoning space race. Let's go to the Thought Bubble.

The Soviets launched the world's first satellite into orbit, Sputnik 1, in 1957, and a few years later, in 1961, Soviet cosmonaut Yuri Gagarin became the first human in space. This didn't sit well with the American public, and a month after Gagarin's mission it prompted President Kennedy to challenge the nation to land a man on the moon within the decade. And it was expensive: NASA's budget grew almost ten-fold, peaking in 1966 at roughly 4.5% of the US Federal Budget. Today it's around half a percent.

NASA used this funding to tackle a huge array of enormous challenges. This culminated in the Apollo program, which, at its peak, employed roughly 400,000 people, further supported by over 20,000 universities and companies.

One of these huge challenges was navigating in space. NASA needed a computer to process complex trajectories and issue guidance commands to the spacecraft. For this, they built the Apollo Guidance Computer. There were three significant requirements. First, the computer had to be fast, no surprise there. Second, it had to be small and lightweight: there was not much room in the spacecraft, and every ounce is precious when you're flying a quarter million miles to the moon. And finally, it had to be really, really, ridiculously reliable. This is super important in a spacecraft, where there's lots of vibration, radiation, and temperature change, and there's no running to Best Buy if something breaks. The technology of the era, vacuum tubes and discrete transistors, just wasn't up to the task.

So NASA turned to a brand new technology, integrated circuits, which we discussed a few episodes ago. The Apollo Guidance Computer was the first computer to use them, a huge paradigm shift. NASA was also the only place that could afford them. Initially, each chip cost around $50, and the guidance computer needed thousands of them. But, by paying that price the Americans were able to beat the Soviets to the moon.

Thanks, Thought Bubble!

Although the Apollo Guidance Computer is credited with spurring the development and adoption of integrated circuits, it was a low-volume product; there were only 17 Apollo missions, after all. It was actually military applications, especially the Minuteman and Polaris nuclear missile systems, that allowed integrated circuits to become a mass-produced item. This rapid advancement was further accelerated by the US building and buying huge, powerful computers, often called supercomputers because they were frequently ten times faster than any other computer on the planet upon their release.

But these machines, built by companies like Control Data Corporation and IBM, were also super in cost, and soon pretty much only governments could afford to buy them. In the US, these machines went to government agencies like the NSA, and government research labs like Lawrence Livermore and Los Alamos National Laboratories.

Initially, the US semiconductor industry boomed, buoyed by high-profit government contracts. However, this meant that most US companies overlooked the consumer market, where profit margins were small.

The Japanese semiconductor industry came to dominate this niche. Having to operate with lean profit margins in the 1950s and 60s, Japanese companies invested heavily in manufacturing capacity to achieve economies of scale, in research to improve quality and yield, and in automation to keep manufacturing costs low.

In the 1970s, with the space race and Cold War subsiding, previously juicy defence contracts began to dry up, and American semiconductor and electronics companies found it harder to compete. It didn't help that many computing components had been commoditized. DRAM was DRAM, so why buy expensive Intel memory when you could buy the same chip for less from Hitachi?

Throughout the 1970s, US companies began to downsize, consolidate, or outright fail. Intel had to lay off a third of its workforce in 1974, and even the storied Fairchild Semiconductor was acquired in 1979 after near bankruptcy. To survive, many of these companies began to outsource their manufacturing in a bid to reduce costs. Intel withdrew from its main product category, memory ICs, and refocused on processors, which ultimately saved the company.

This lull in the US electronics industry allowed Japanese companies like Sharp and Casio to dominate the breakout computing product of the 1970s: hand-held electronic calculators. By using integrated circuits, these could be made small and cheap, and they replaced the expensive desktop adding machines you'd find in offices. For most people, it was the first time they didn't have to do math on paper or with a slide rule. They were an instant hit, selling by the millions.

This further drove down the cost of integrated circuits and led to the development and widespread use of microprocessors, like the Intel 4004 we've discussed previously. That chip was built by Intel in 1971 at the request of the Japanese calculator company Busicom. Soon Japanese electronics were everywhere, from televisions and VCRs to digital wristwatches and Walkmans.

The availability of inexpensive microprocessors spawned entirely new products, like video arcades. The world got Pong in 1972 and Breakout in 1976. As costs continued to plummet, it soon became possible for regular people to afford computing devices.

During this time we see the emergence of the first successful home computers, like the 1975 ALTAIR 8800 and also the first home gaming consoles like the Atari 2600 in 1977.

Home. Let me repeat that: home. That seems like a small thing today, but this was the dawn of a whole new era in computing. In just three decades, computers had evolved from machines where you could literally walk inside of the CPU, assuming you had government clearance, to the point where a child could play with a handheld toy containing a microprocessor many times faster.

Critically, this dramatic evolution would not have been possible without two powerful forces at play: governments and consumers. Government funding, like that provided by the United States during the Cold War, enabled the early adoption of many nascent computing technologies. This funding helped float entire industries related to computing long enough for the technology to mature and become commercially feasible. Then businesses, and ultimately consumers, provided the demand to take it mainstream.

The Cold War may be over, but this relationship continues today. Governments are still funding science research, intelligence agencies are still buying supercomputers, humans are still being launched into space, and you're still buying TVs, Xboxes, PlayStations, laptops, and smartphones. For these reasons, computing continues to advance at a lightning pace.

I'll see you next week.

[theme music]

Crash Course Computer Science is produced in association with PBS Digital Studios. At their channel, you can check out a playlist of shows like Physics Girl, Deep Look, and PBS Space Time. This episode was filmed at the Chad and Stacy Emigholz Studio in Indianapolis, Indiana, and it was made with the help of all of these nice people, and our wonderful graphics team, Thought Café. That's where we're gonna have to halt and catch fire. See you next week.

[theme music]