The Singularity, Skynet, and the Future of Computing: Crash Course Computer Science #40
YouTube: | https://youtube.com/watch?v=5TNAz1HYg18 |
Previous: | The Parable of the Sower: Crash Course Literature 406 |
Next: | Serpents and Dragons: Crash Course World Mythology #38 |
Statistics
View count: | 331,404 |
Likes: | 10,775 |
Comments: | 753 |
Duration: | 12:30 |
Uploaded: | 2017-12-21 |
Last sync: | 2024-11-27 19:00 |
Citation
Citation formatting is not guaranteed to be accurate.
MLA Full: | "The Singularity, Skynet, and the Future of Computing: Crash Course Computer Science #40." YouTube, uploaded by CrashCourse, 21 December 2017, www.youtube.com/watch?v=5TNAz1HYg18. |
MLA Inline: | (CrashCourse, 2017) |
APA Full: | CrashCourse. (2017, December 21). The Singularity, Skynet, and the Future of Computing: Crash Course Computer Science #40 [Video]. YouTube. https://youtube.com/watch?v=5TNAz1HYg18 |
APA Inline: | (CrashCourse, 2017) |
Chicago Full: | CrashCourse, "The Singularity, Skynet, and the Future of Computing: Crash Course Computer Science #40," December 21, 2017, YouTube, 12:30, https://youtube.com/watch?v=5TNAz1HYg18.
In our SERIES FINALE of Crash Course Computer Science, we take a look towards the future! In the past 70 years, electronic computing has fundamentally changed how we live our lives, and we believe it's just getting started. From ubiquitous computing, artificial intelligence, and self-driving cars to brain-computer interfaces, wearable computers, and maybe even the singularity, there is so much amazing potential on the horizon. Of course, there is also room for peril, with the rise of artificial intelligence and the more immediate displacement of much of the workforce through automation. It's tough to predict how it will all shake out, but it's our hope that this series has inspired you to take part in shaping that future. Thank you so much for watching.
Produced in collaboration with PBS Digital Studios: http://youtube.com/pbsdigitalstudios
Want to know more about Carrie Anne?
https://about.me/carrieannephilbin
The Latest from PBS Digital Studios: https://www.youtube.com/playlist?list=PL1mtdjDVOoOqJzeaJAV15Tq0tZ1vKj7ZV
Want to find Crash Course elsewhere on the internet?
Facebook - https://www.facebook.com/YouTubeCrash...
Twitter - http://www.twitter.com/TheCrashCourse
Tumblr - http://thecrashcourse.tumblr.com
Support Crash Course on Patreon: http://patreon.com/crashcourse
CC Kids: http://www.youtube.com/crashcoursekids
Hi, I'm Carrie Anne, and welcome to Crash Course Computer Science. We're here: the final episode! If you've watched the whole series, hopefully you've developed a newfound appreciation for the incredible breadth of computing applications and topics. It's hard to believe we've worked up from mere transistors and logic gates, all the way to computer vision, machine learning, robotics, and beyond. We've stood on the shoulders of giants like Babbage and Lovelace, Hollerith and Turing, Eckert and Hopper, Sutherland and Engelbart, Bush and Berners-Lee, Gates and the Woz, and many other computing pioneers. My biggest hope is that these episodes have inspired you to learn more about how these subjects affect your life. Maybe you'll even pick up programming, or choose a career in computing. It's awesome! It's also a skill of the future.
I said in the very first episode that computer science isn't magic, but it sort of is; knowing how to use and program computers is sorcery of the 21st century. Instead of incantations and spells, it's scripts and code. Those who know how to wield that tremendous power will be able to craft great things, not just to improve their own lives, but also their communities and humanity at large. Computing is also going to be literally everywhere: not just the computers we see today sitting on desks and countertops, or carried in pockets and bags, but inside every object imaginable. Inside all your kitchen appliances, embedded in your walls, nano-tagged in your food, woven into your clothes, and floating around inside your body. This is the vision of the field of ubiquitous computing. In some ways it's already here, and in other ways we've got many decades to go. Some might view this eventuality as dystopian, with computers everywhere surveilling us and competing for our attention. But the late Mark Weiser, who articulated this idea in the 1990s, saw the potential very differently: "For [fifty] years, most interface design, and most computer design, has been headed down the path of the dramatic machine. Its highest ideal is to make a computer so exciting, so wonderful, so interesting, that we never want to be without it. A less-traveled path I call the 'invisible'; its highest ideal is to make a computer so embedded, so fitting, so natural, that we use it without even thinking about it ... The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it." That doesn't describe computing of today, where people sit for hours on end in front of computer monitors, and social media notifications interrupt us at dinner. But it could describe computing of the future, our final topic.
[Intro Music]
When people think of computing in the future, they often jump right to artificial intelligence. No doubt there will be tremendous strides made in AI in the coming years, but not everything will be, or need to be, AI-powered. Your car might have an AI to self-drive, but the door locks might continue to be powered by what are essentially if-statements. AI technology is just as likely to enhance existing devices, like cars, as it is to open up entirely new product categories. The exact same thing happened with the advent of electrical power: lightbulbs replaced candles, but electrification also led to the creation of hundreds of new electrically-powered gadgets. And of course, we still have candles today.
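To make the contrast concrete, here's a minimal sketch in Python of the kind of plain conditional logic a door lock might run; the function name and codes are hypothetical, invented just to show that this is an if-statement, not AI:

```python
# A hypothetical smart-lock controller: plain if-statements, no AI required.
# All names and codes here are made up for illustration.

AUTHORIZED_CODES = {"4812", "9370"}

def handle_code(entered_code: str, is_locked: bool) -> bool:
    """Return the lock's new state after a code is entered."""
    if entered_code in AUTHORIZED_CODES:
        return not is_locked  # valid code toggles the lock
    return is_locked          # invalid code leaves the state unchanged

# Entering a valid code on a locked door unlocks it.
print(handle_code("4812", is_locked=True))  # False (unlocked)
print(handle_code("0000", is_locked=True))  # True (still locked)
```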
It's most likely that AI will be yet another tool that computer scientists can draw upon to tackle problems. What really gets people thinking, and sometimes sweating, is whether artificial intelligence will surpass human intelligence. This is a really tricky question for a multitude of reasons, the most immediate being: what is intelligence? On one hand, we have computers that can drive cars, recognize songs with only a few seconds of audio, translate dozens of languages, and totally dominate at games like chess, Jeopardy, and Go. That sounds pretty smart! But on the other hand, computers fail at some basic tasks, like walking up steps, folding laundry, understanding speech at a cocktail party, and feeding themselves. We're a long way from artificial intelligence that's as general-purpose and capable as a human.
With intelligence being somewhat hard to quantify, people often characterize computers and creatures by their processing power instead, though that's a pretty computing-centric view of intelligence. Nonetheless, if we do this exercise, plotting the computers and processors we've talked about in this series, we find that computing today is very roughly equivalent in calculating power to the brain of a mouse; which, to be fair, also can't fold laundry, although that would be super cute! Human calculating power is up here, another 10 to the 5, or 100,000, times more powerful than today's computers. That sounds like a big gap, but given the rate of change in computing technologies, we might close it in as little as a decade, even though processor speeds are no longer following Moore's Law, as we discussed in episode 17. If this trend continues, computers would have more processing power, and by this measure more intelligence, than the sum total of all human brains combined before the end of this century. And this could snowball as such systems need less human input, with artificial super-intelligence designing and training new versions of itself.
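A quick arithmetic aside (a back-of-the-envelope check we're adding here, not a figure from the episode): a factor-of-100,000 gap corresponds to about

$$\log_2(10^5) = 5\,\log_2(10) \approx 16.6 \text{ doublings},$$

so closing it within a decade would require computing power to grow by roughly 10^(5/10), about 3.2 times per year, noticeably faster than a classic Moore's Law doubling every year or two. That's why the decade estimate rests on broader computing trends, not processor speeds alone.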
This runaway technological growth, especially with respect to an intelligence explosion, is called the singularity. The term was first used by our old friend from episode 10, John von Neumann, who said, "the accelerating progress of technology and changes in the mode of human life give the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue." And von Neumann suggested this back in the 1950s, when computers were trillions of times slower than they are today. Sixty years later, though, the singularity is still just a possibility on the horizon. Some experts believe this progress is going to level off and be more of an S-curve than an exponential one, where, as complexity increases, it becomes more difficult to make additional progress. Microsoft co-founder Paul Allen called it a "complexity brake."
But as a thought experiment, let's just say that super-intelligent computers will emerge. What that would mean for humanity is a hotly debated topic. There are people who eagerly await it, and those who are already working to stop it from happening. Probably the most immediate effect would be technological unemployment, where workers in many job sectors are rendered obsolete by computers, like AIs and robots, that can do their work better and for less pay. Although computers are new, this effect is not. Remember Jacquard's loom from episode 10? That automated the work of skilled textile workers back in the 1800s, which led to riots. Also, back then, most of the population of the US and Europe were farmers. That's dropped to under 5% today, due to advances like synthetic fertilizers and tractors. More modern examples include telephone switchboard operators being replaced by automatic switchboards in the 1960s, and robotic arms replacing human painters in car factories in the 1980s. And the list goes on and on. On one hand, these were jobs lost to automation; on the other hand, clothes, food, bicycles, toys, and a myriad of other products are all plentiful today because they can be cheaply produced thanks to computing. But experts argue that AI, robots, and computing technologies in general are going to be even more disruptive than these historical examples.
At a very high level, jobs can be summarized along two dimensions. First, jobs can be either more manual, like assembling toys, or more cognitive, like picking stocks. These jobs can also be routine, the same tasks over and over again, or non-routine, where tasks vary and workers need to problem-solve and be creative. We already know that routine manual jobs can be automated by machines; it has already happened for some jobs, and is happening right now for others. What's getting people worried is that non-routine manual jobs, like cooks, waiters, and security guards, may get automated too. And the same goes for routine cognitive work, like customer service agents, cashiers, bank tellers, and office assistants. That leaves us with just one quadrant that might be safe, at least for a little while: non-routine cognitive work, which includes professions like teachers and artists, novelists and lawyers, and doctors and scientists. These types of jobs encompass roughly 40% of the US workforce, which leaves the other 60% of jobs vulnerable to automation. Some people argue that technological unemployment at this scale would be unprecedented and catastrophic, with most people losing their jobs. Others argue that this will be great, freeing people from less interesting jobs to pursue better ones, all the while enjoying a high standard of living with the bounty of food and products that will result from computers and robots doing most of the hard work. No one really knows how this is going to shake out, but if history is any guide, it'll probably be OK in the long run. After all, no one is advocating that 90% of people go back to farming and weaving textiles by hand. The tough question, which politicians are now discussing, is how to handle the hopefully short-term economic disruption for the millions of people who might suddenly be out of a job.
Beyond the workplace, computers are also very likely to change our bodies. For example, futurist Ray Kurzweil believes that "the singularity will allow us to transcend [the] limitations of our biological bodies and brains. We will gain power over our fates. ...We will be able to live as long as we want. We will fully understand human thinking and will vastly extend and expand its reach." Transhumanists see this happening in the form of cyborgs, where humans and technology merge, enhancing our intellect and physiology. There are already brain-computer interfaces in use today, and wearable computers, like Google Glass and Microsoft HoloLens, are starting to blur the line, too.
There are also people who foresee digital ascension, which, in the words of Jaron Lanier, "would involve people dying in the flesh and being uploaded into a computer and remaining conscious." This transition from biological to digital beings might end up being our next evolutionary step, and a new level of abstraction.
[Music]
Others predict humans staying largely human, but with super-intelligent computers emerging as a benevolent force, a caretaker for humanity: running all the farms, curing diseases, directing robots to pick up trash, building new homes, and many other functions. This would allow us to simply enjoy our time on this lovely pale blue dot. Still others view AI with more suspicion: why would a super-intelligent AI waste its time taking care of us? It's not like we've taken on the role of benevolent caretaker of ants. So maybe this will play out like so many sci-fi movies, where we're at war with computers, our own creation having turned on us. It's impossible to know what the future holds, but it's great that this discussion and debate is already happening, so that as these technologies emerge, we can plan and react intelligently. What's much more likely, regardless of whether you see computers as future friend or foe, is that they will outlive humanity. Many futurists and science fiction writers have speculated that computers will head out into space and colonize the galaxy, indifferent to time scales, radiation, and all that other stuff that makes long-distance space travel difficult for us humans. And when the sun has burned up and the Earth is space dust, maybe our technological children will be hard at work exploring every nook and cranny of the universe, hopefully in honor of their parents' tradition: to build knowledge, improve the state of the universe, and to boldly go where no one has gone before!
In the meantime, computers have a long way to go, and computer scientists are hard at work advancing all the topics we've talked about over the past forty episodes. In the next decade or so, we'll likely see technologies like virtual and augmented reality, self-driving vehicles, drones, wearable computers, and service robots go mainstream. The internet will continue to evolve new services, stream new media, and connect people in different ways. New programming languages and paradigms will be developed to facilitate the creation of new and amazing software. And new hardware will make complex operations, like neural networks and 3D graphics, blazingly fast. Personal computers are also ripe for innovation, perhaps shedding their forty-year-old desktop metaphor and being reborn as omnipresent, lifelong virtual assistants. And there's so much we didn't get to talk about in this series, like cryptocurrencies, wireless communication, 3D printing, bioinformatics, and quantum computing. We're in a golden age of computing, and there's so much going on that it's impossible to summarize. But most importantly, you can be a part of this amazing transformation and challenge by learning about computing and taking what's arguably humanity's greatest invention to make the world a better place. Thanks for watching.
Crash Course Computer Science is produced in association with PBS Digital Studios. At their channel, you can check out a playlist of shows like BrainCraft, Coma Niddy, and PBS Infinite Series. This episode was filmed at the Chad & Stacey Emigholz Studio in Indianapolis, and it was made with the help of all these nice people and our wonderful graphics team, Thought Cafe. Thanks for watching; I'll CPU later.
[Outro Music]