crashcourse
How Engineering Robots Works: Crash Course Engineering #33
Statistics
View count: 297,211
Likes: 5,249
Comments: 122
Duration: 11:02
Uploaded: 2019-01-24
Last sync: 2024-12-23 10:45
Citation
Citation formatting is not guaranteed to be accurate.
MLA Full: "How Engineering Robots Works: Crash Course Engineering #33." YouTube, uploaded by CrashCourse, 24 January 2019, www.youtube.com/watch?v=uNfUAJBuZ0s.
MLA Inline: (CrashCourse, 2019)
APA Full: CrashCourse. (2019, January 24). How Engineering Robots Works: Crash Course Engineering #33 [Video]. YouTube. https://youtube.com/watch?v=uNfUAJBuZ0s
APA Inline: (CrashCourse, 2019)
Chicago Full: CrashCourse, "How Engineering Robots Works: Crash Course Engineering #33," January 24, 2019, YouTube, 11:02, https://youtube.com/watch?v=uNfUAJBuZ0s.
In this episode, we looked at robots and the engineering principles behind them. We learned how robots use sensors to interpret their environment, how actuators and effectors allow a robot to manipulate the objects around it to accomplish a task, and how computers coordinate the efforts of the two.
Crash Course Engineering is produced in association with PBS Digital Studios: https://www.youtube.com/playlist?list=PL1mtdjDVOoOqJzeaJAV15Tq0tZ1vKj7ZV
***
RESOURCES:
https://www.wired.com/story/what-is-a-robot/
https://www.superdroidrobots.com/shop/custom.aspx/mining-robots/65/
https://www.pbs.org/newshour/science/new-robot-surgeon-sews-up-pig-intestines
http://stm.sciencemag.org/content/8/337/337ra64
https://www.motoman.com/robotic-welding
https://blog.robotiq.com/bid/72927/What-is-Included-in-Robotic-Welding-Systems
https://www.cnet.com/news/be-afraid-darpa-unveils-terminator-like-atlas-robot/
https://www.theregister.co.uk/2015/01/23/atlas_unplugged_darpas_unterminator_robot_cuts_the_power_cable/
https://www.intorobotics.com/fruit-harvesting-robots/
https://techcrunch.com/2017/06/16/object-detection-api/?guccounter=1
http://www.sci.brooklyn.cuny.edu/~sklar/teaching/boston-college/s01/mc375/mc375-effectors.pdf
https://www.nature.com/articles/d41586-018-05093-1
https://www.universal-robots.com/industries/automotive-and-subcontractors/
***
Crash Course is on Patreon! You can support us directly by signing up at http://www.patreon.com/crashcourse
Thanks to the following Patrons for their generous monthly contributions that help keep Crash Course free for everyone forever:
Eric Prestemon, Sam Buck, Mark Brouwer, Naman Goel, Patrick Wiener II, Nathan Catchings, Efrain R. Pedroza, Brandon Westmoreland, dorsey, Indika Siriwardena, James Hughes, Kenneth F Penttinen, Trevin Beattie, Satya Ridhima Parvathaneni, Erika & Alexa Saur, Glenn Elliott, Justin Zingsheim, Jessica Wode, Kathrin Benoit, Tom Trval, Jason Saslow, Nathan Taylor, Brian Thomas Gossett, Khaled El Shalakany, SR Foxley, Yasenia Cruz, Eric Koslow, Caleb Weeks, Tim Curwick, D.A. Noe, Shawn Arnold, Malcolm Callis, Advait Shinde, William McGraw, Andrei Krishkevich, Rachel Bright, Jirat, Ian Dundore
--
Want to find Crash Course elsewhere on the internet?
Facebook - http://www.facebook.com/YouTubeCrashCourse
Twitter - http://www.twitter.com/TheCrashCourse
Tumblr - http://thecrashcourse.tumblr.com
Support Crash Course on Patreon: http://patreon.com/crashcourse
CC Kids: http://www.youtube.com/crashcoursekids
Someday in the near future, you might be wandering around town at night when out of the blue, you'll stumble across a robot. If that happens, maybe you shouldn't be too surprised. Not so long ago, robots were mostly found in the realms of science fiction. Today, they're vacuuming floors, building cars, and even roaming around the surface of Mars.
Robots are still pretty far from having human-level intelligence, or even human-level dexterity for that matter, but they're already a big part of many branches of engineering. In the US, over a hundred thousand robots have been added to factory floors since 2010. Knowing exactly what robots are, what they can do, and how they work is more important than you might think, because soon, they might be living among us.
[Crash Course Engineering intro]
The classic picture of a robot is something with human-like intelligence, and maybe even a humanoid appearance. But like so much else, real-life robots aren't much like what you usually see on TV. Robots come in all shapes and sizes, and they can be built very differently depending on what they're used for.
For example, some of the robots used in mining are made from a camera mounted on a small chassis with wheels. That allows them to enter and inspect mine shafts and even retrieve leftover material from places it's hard to get people in and out of. Meanwhile, in medicine, specialized robotic arms, with a bit of human assistance, can perform precise surgeries through the tiniest incisions.
There's also a difference between robotics and artificial intelligence, or AI, which people sometimes confuse. While some of the concepts are similar, robotics deals with a specific set of ideas, although it does borrow a few from AI. AI deals broadly with the goal of automating decision-making for complex tasks—the kind you can't write a simple set of rules for. That could be everything from playing chess to driving cars. Right now, AI systems tend to have very narrow goals, but the holy grail of AI is to develop a system that can make intelligent decisions about any sort of task using different sources of information.
Our focus will be on robots, which are designed for more specific purposes in the physical world. In engineering terms, a robot is a machine designed to interact with its environment, make an appropriate decision based on those surroundings, and then carry out the jobs related to its goal, all automatically. Which means a true robot doesn't require a human controlling exactly what it does. In general, the field of robotics also deals with machines that do all of the same things, but might require a human operator to carry out some tasks.
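To make that sense-decide-act cycle concrete, here's a minimal control-loop sketch in Python. The sensor and motor functions are hypothetical stubs standing in for a real robot's drivers; the structure is the point: sense the environment, decide, act, and repeat.

```python
import time
from itertools import count

_ticks = count()

def read_distance_sensor() -> float:
    """Stub sensor: pretend the robot closes in on an obstacle over time."""
    return max(0.0, 2.0 - 0.1 * next(_ticks))

def set_wheel_speed(left: float, right: float) -> None:
    """Stub actuator: command the wheel motors, in meters per second."""
    print(f"wheels: left={left:.2f} m/s, right={right:.2f} m/s")

def control_loop(stop_distance: float = 0.5) -> None:
    """The sense-decide-act cycle: drive forward until an obstacle is near."""
    while True:
        distance = read_distance_sensor()   # sense the surroundings
        if distance < stop_distance:        # decide based on the reading
            set_wheel_speed(0.0, 0.0)       # act: stop before a collision
            break
        set_wheel_speed(0.8, 0.8)           # act: keep driving forward
        time.sleep(0.05)                    # repeat at roughly 20 Hz

control_loop()
```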
Whether fully automatic or not, virtually all robots have a few features in common. First, robots are machines made from materials that occupy physical space, so they're more than just lines of computer code. That's one of the key features that distinguishes them from an AI, although computer engineering is important for robots as well.
Second, most robots have some way of sensing features of their environment, usually by measuring light, sound, or force feedback. They can then use that information to make a decision about what to do next, depending on what they're designed for.
The tasks robots are built for tend to be complex, requiring a sequence of different motions. So an automatic door that opens or shuts in response to whether you stand in front of a sensor wouldn't be considered a robot. The tasks that robots handle are more sophisticated, like welding two intricately shaped pieces of metal together.
Third, to interpret the signals they receive from their environment and coordinate some sort of response, robots have computers built somewhere into their design. These work just like any computer, taking inputs and delivering outputs. But unlike most ordinary software, the software on a robot's computer generates electrical signals that are directly passed on to the robot's hardware, instead of just changing information in a file or on a screen.
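On a hobbyist-scale robot built around something like a Raspberry Pi, that software-to-hardware step can be as direct as toggling a general-purpose I/O pin. A sketch assuming the RPi.GPIO library, with a motor driver wired to pin 18 (the wiring and pin number are illustrative assumptions):

```python
import time
import RPi.GPIO as GPIO  # Raspberry Pi GPIO library; requires Pi hardware

MOTOR_PIN = 18  # hypothetical pin wired to a motor driver's enable input

GPIO.setmode(GPIO.BCM)           # address pins by their Broadcom numbers
GPIO.setup(MOTOR_PIN, GPIO.OUT)  # configure the pin as an output

try:
    GPIO.output(MOTOR_PIN, GPIO.HIGH)  # pin goes high: motor spins
    time.sleep(2.0)                    # let it run for two seconds
    GPIO.output(MOTOR_PIN, GPIO.LOW)   # pin goes low: motor stops
finally:
    GPIO.cleanup()                     # release the pins on exit
```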
That also requires a power source. One option is to physically hook up to the power grid using wires and a socket. For example, the 2013 version of the Atlas robot developed by robotics company Boston Dynamics had a humanoid design that could walk and carry objects, much like humans do. To operate, Atlas relied on a cable that tethered it to a power supply.
But tethering a robot like this limits its range and mobility. So to let Atlas move around more freely, engineers installed a battery on the robot's structure so it could power itself. Unfortunately, the materials in a battery tend to be rather heavy, which posed its own challenges.
If we don't want robots like these toppling over all the time, it takes a bit of mechanical engineering know-how to work out where to position the battery with respect to the robot's center of mass. You might need some chemical engineering in the design of the battery, too. In the case of the Atlas, engineers gave it a hydraulic pump in its torso to help it support the extra weight of its lithium-ion battery. Now, only a few years later, it can do backflips and parkour.
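The battery-placement problem boils down to keeping the combined center of mass over the robot's base of support. The combined position is just a mass-weighted average, which a few lines of Python can check (all the masses, offsets, and the footprint size below are made-up numbers for illustration):

```python
def center_of_mass(parts):
    """Mass-weighted average of positions: x_com = sum(m * x) / sum(m)."""
    total_mass = sum(m for m, _ in parts)
    return sum(m * x for m, x in parts) / total_mass

# (mass in kg, horizontal offset in meters from the center of the feet)
parts = [
    (60.0, 0.00),   # torso, arms, and legs, roughly centered
    (15.0, -0.12),  # lithium-ion battery mounted toward the back
]

x_com = center_of_mass(parts)
print(f"combined center of mass: {x_com:+.3f} m")

# Standing is stable only while the center of mass stays over the feet,
# taken here as a +/- 0.10 m footprint.
print("stable" if abs(x_com) <= 0.10 else "tipping risk")
```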
More generally, some electrical engineering also has to go into wiring the signals from the computer program to the robot's physical parts. Which brings us to another common feature of robots: they have mechanical parts, like grips or wheels, for carrying out whatever physical tasks they need to in their environments. That might be turning a lever, picking up an object, or just moving somewhere else.
And there are some unique challenges to designing those parts. Consider a robot designed to pick fruit from trees. From an engineering perspective, it needs some fundamental abilities: to recognize fruits and distinguish them from the rest of the plant, to navigate its environment toward the fruits that need picking, and then to pick them and put them into a container.
Let's start with how it moves, a fairly basic requirement for lots of robots. Like with the mining robot, you could simply put some standard axles with wheels on the bottom of your robot and attach them to a motor, like on a car. That would be fine if the robot was going to be operating mostly on smooth, even surfaces like roads or factory floors. But most fruits are grown outdoors, sometimes in rough terrain and difficult environments. So you might need to design adjustable wheels that change height independently, or add treads to overcome small bumps, like on a tank. The problem with wheels is that they're not very good at overcoming large obstacles. If there's a fallen branch or a boulder blocking the way, the robot needs to be able to climb over it. Giving the robot legs could allow it to jump, but that has its own problems. Robots with legs tend to fall down a lot.
As the team at Boston Dynamics found when designing Atlas, programming a robot's computer to interpret its environment while handling the dynamics of all those mechanical parts is trickier than it seems. It really makes you appreciate what a good job your brain is doing.
Of course, to actually make sense of its environment and find fruits, the robot will need sensors, devices that measure physical characteristics and translate them into a signal. To find apples, for example, the robot might have an array of light-sensitive semiconductors, like the kind that make up light-capturing pixels in a digital camera, to scan an orchard. But the information sent by the camera sensor is interpreted by the computer as an array of colored pixels that don't mean an awful lot on their own. The average person can take one glance at a curvy shape of reddish pixels and instantly recognize it as an apple. For a computer, that requires a fairly sophisticated visual algorithm.
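One of the simplest versions of that visual algorithm is a color threshold: mark every pixel whose hue falls in an "apple red" band, then group the marked pixels into blobs and keep the big ones. A minimal sketch using OpenCV and NumPy; the file name and threshold values are illustrative, and a production picker would need something far more robust:

```python
import cv2          # OpenCV for image processing
import numpy as np

image = cv2.imread("orchard.jpg")             # hypothetical camera frame
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)  # hue is easier to threshold

# Keep pixels whose hue looks "apple red" (illustrative bounds; true red
# also wraps around to the top of the hue range).
mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))

# Group marked pixels into connected blobs and drop the tiny ones.
# (OpenCV 4 returns the pair (contours, hierarchy).)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
apples = [c for c in contours if cv2.contourArea(c) > 500]
print(f"found {len(apples)} apple-sized red blobs")
```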
What's more, people are good at seeing where the edges of objects are and interpreting the relationship they have to their environment and how far away they are. You know, an apple that looks very small is most likely one that's very far away, but even a relatively smart computer that can recognize apples might not know whether it's a tiny apple only a few centimeters away or an enormous apple a few kilometers away. Which would affect whether the computer's programming tells it to pick the apple or not.
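That size-versus-distance ambiguity is usually broken with an assumption. Under a pinhole camera model, apparent size in pixels is proportional to real size divided by distance, so if the software assumes a typical apple diameter it can invert the relationship to estimate range. A sketch with illustrative numbers:

```python
def estimate_distance(real_diameter_m: float,
                      pixel_diameter: float,
                      focal_length_px: float) -> float:
    """Pinhole model: pixels = focal_length * real_size / distance,
    so distance = focal_length * real_size / pixels."""
    return focal_length_px * real_diameter_m / pixel_diameter

APPLE_DIAMETER_M = 0.08  # assume a typical 8 cm apple
FOCAL_LENGTH_PX = 800.0  # focal length in pixel units (illustrative)

# A 40-pixel-wide blob could be a tiny apple up close or a huge one far
# away; assuming a normal-sized apple pins the distance down.
print(f"{estimate_distance(APPLE_DIAMETER_M, 40.0, FOCAL_LENGTH_PX):.2f} m")
# -> 1.60 m
```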
All of these issues are what are known as computer vision problems. Computer vision deals with how to train software to take the input data from images or video, like the kind that are delivered from digital cameras, and interpret it the way a human would. Even once it's found an apple and moved itself close to it, the fruit picker is going to need mechanical parts to actually pick the fruits.
The main mechanical parts used by most industrial robots are called actuators and effectors. Actuators are like a robot's muscles: they convert stored energy into movement. One popular type is the electrical actuator, an electric motor that turns wheels or gears to rotate the robot's connected parts with respect to one another.
These are the sorts of mechanisms that would extend the robot's arm toward or away from a particular branch. Linear actuators can achieve this by using a motor to extend the part up and down a thread like a nut on a bolt. They can also use compressed fluids like air or oil to extend a part outwards, then use a motor to compress the fluid and bring the part back when needed.
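For the lead-screw style of linear actuator (the nut-on-a-bolt arrangement above), the geometry is simple enough to state in code: linear travel is motor revolutions times thread pitch, and inverting that tells the controller how many turns a given move needs. The numbers below are illustrative:

```python
def travel_mm(revolutions: float, pitch_mm: float) -> float:
    """Linear travel of the nut: revolutions times thread pitch."""
    return revolutions * pitch_mm

def revolutions_for(travel: float, pitch_mm: float) -> float:
    """Motor turns needed to move the part a given distance."""
    return travel / pitch_mm

# A 2 mm-pitch screw turned 150 times extends the arm 300 mm.
print(f"{travel_mm(150, 2.0):.0f} mm of travel")
print(f"{revolutions_for(120.0, 2.0):.0f} revolutions for a 120 mm move")
```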
Effectors, meanwhile, are the parts that actually have an effect on the robot's environment: basically, the robot's hands. To deal with the irregular shapes of different fruits, you could use what's called a vacuum grip, which can suck up large objects and hold them in place. But most of the focus in building effectors has been on mechanical ones, the kind that rely on tactile feedback and manipulation. In other words, they give the robot an artificial sense of touch, perhaps with force-sensitive electrodes on the effector's surface.
Having a sense of feedback is important for applying the right amount of pressure. Otherwise, the apple might slip out of the robot's grip or be crushed into a pulp. To achieve this, the effector might be a simple two-part claw, or something more sophisticated with many parts, modeled on a human hand.
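In code, that pressure regulation could be as simple as a proportional feedback step: squeeze harder when the sensed force is below target, ease off when it's above. The gripper model below is a toy and all the numbers are made up, but the feedback structure is the core idea real force controllers build on:

```python
def grip_step(measured_force: float, target_force: float,
              current_command: float, gain: float = 0.05) -> float:
    """One step of proportional force control: nudge the grip command
    toward the target. Too little force and the apple slips; too much
    and it's pulp."""
    error = target_force - measured_force
    # Clamp the command to the gripper's physical range [0, 1].
    return min(1.0, max(0.0, current_command + gain * error))

# Simulated grasp: the command tightens until the force settles at 2 N.
command, force = 0.0, 0.0
for _ in range(30):
    force = 10.0 * command  # toy gripper: force grows with closure
    command = grip_step(force, target_force=2.0, current_command=command)
print(f"final command = {command:.2f}, force = {force:.1f} N")
```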
Picking fruit is the kind of job that robots could accomplish at scale much more easily than humans, freeing them to work on other aspects of farming. But robotics isn't just aimed at saving labor. Robots can also be used in environments that are far too dangerous to send humans into. Bomb disposal robots, which are actually more like drones, are operated by humans to find explosive devices and disarm them from a safe distance. In the future, fully automated robots might find uses in other harsh environments, like the deep sea and space.
But it's likely that the place robots will have the most impact won't be in the jobs they do instead of humans, but in the ones they do alongside them. Features of robotics are already making their way into healthcare, like in the development of prosthetic limbs. But in situations like surgery or disaster rescue operations, a combination of human smarts and purpose-built robotic strength could create safer, more efficient, and totally new ways of doing things. So, like many engineering tools, robots will work best when they're woven into our existing methods, working alongside us to accomplish our goals. Robots might be the future, but that future is a far cry from The Terminator.
In this episode, we looked at robots and the engineering principles behind them. We learned how robots use sensors to interpret their environment, how actuators and effectors allow a robot to manipulate the objects around it to accomplish a task, and how computers coordinate the efforts of the two.
Crash Course Engineering is produced in association with PBS Digital Studios, which also produces It's Okay to Be Smart, a show all about our curious universe and the science that makes it possible, hosted by Dr. Joe Hanson. Check it out at the link in the description.
Crash Course is a Complexly production, and this episode was filmed in the Dr. Cheryl C. Kinney Studio with the help of these wonderful people. Our amazing graphics team is Thought Café.