YouTube: https://youtube.com/watch?v=9vGYSE4fVhQ

Statistics

View count: 435,507
Likes: 13,749
Comments: 1,703
Duration: 05:20
Uploaded: 2016-07-08
Last sync: 2024-02-13 16:30

Citation

Citation formatting is not guaranteed to be accurate.
MLA Full: "Are Self-Driving Cars Safe?" YouTube, uploaded by SciShow, 8 July 2016, www.youtube.com/watch?v=9vGYSE4fVhQ.
MLA Inline: (SciShow, 2016)
APA Full: SciShow. (2016, July 8). Are Self-Driving Cars Safe? [Video]. YouTube. https://youtube.com/watch?v=9vGYSE4fVhQ
APA Inline: (SciShow, 2016)
Chicago Full: SciShow, "Are Self-Driving Cars Safe?" July 8, 2016, YouTube, 05:20, https://youtube.com/watch?v=9vGYSE4fVhQ.
Tesla's Autopilot system is the most advanced available right now, but it has limitations, and some of those limitations might be us.

Hosted by: Hank Green
----------
Support SciShow by becoming a patron on Patreon: https://www.patreon.com/scishow
----------
Dooblydoo thanks go to the following Patreon supporters -- we couldn't make SciShow without them! Shout out to Kathy & Tim Philip, Kevin Bealer, Andreas Heydeck, Thomas J., Accalia Elementia, Will and Sonja Marple, James Harshaw, Justin Lentz, Chris Peters, Bader AlGhamdi, Benny, Tim Curwick, Philippe von Bergen, Patrick Merrithew, Fatima Iqbal, Mark Terrio-Cameron, Patrick D. Ashmore, and charles george.
----------
Like SciShow? Want to help support us, and also get things to put on your walls, cover your torso and hold your liquids? Check out our awesome products over at DFTBA Records: http://dftba.com/scishow
----------
Looking for SciShow elsewhere on the internet?
Facebook: http://www.facebook.com/scishow
Twitter: http://www.twitter.com/scishow
Tumblr: http://scishow.tumblr.com
Instagram: http://instagram.com/thescishow
----------
Sources:
https://www.teslamotors.com/blog/tragic-loss
http://www.techinsider.io/how-teslas-autopilot-works-2016-7
http://www.nytimes.com/2016/07/02/business/international/bmw-tesla-self-driving-car-mobileye-intel.html
http://www.nytimes.com/2016/07/04/your-money/as-self-driving-cars-hit-the-road-innovation-is-outpacing-insurance.html
http://www.nytimes.com/2016/07/02/business/joshua-brown-technology-enthusiast-tested-the-limits-of-his-tesla.html
http://www.nytimes.com/2016/07/02/business/a-fatality-forces-tesla-to-confront-its-limits.html
http://www.nytimes.com/2016/07/01/business/self-driving-tesla-fatal-crash-investigation.html
http://fortune.com/2016/07/03/teslas-fatal-crash-implications/
http://fortune.com/2016/06/30/regulators-examine-a-fatal-crash-with-teslas-autopilot/
http://www.slate.com/blogs/moneybox/2016/07/01/tesla_autopilot_crash_victim_joshua_brown_was_watching_a_movie_when_he_died.html
http://gizmodo.com/it-looks-like-the-tesla-driver-was-watching-harry-potte-1782980532
https://www.theguardian.com/technology/2016/jul/01/tesla-driver-killed-autopilot-self-driving-car-harry-potter
http://www.usatoday.com/story/tech/nation-now/2016/07/01/navy-seal-vet-killed-using-teslas-autopilot-posted-close-call-video-month-ago/86592458/

Images
https://www.teslamotors.com/about
Thinkstock.com
[SciShow intro plays]

Last week, a federal agency in the US announced that they were investigating a death that was the first of its kind: A driver was killed while his car was doing most of the driving for him -- in this case, a Tesla Model S with its Autopilot mode enabled.

The fact that there haven’t been any fatalities until now is a testament to the technology behind self-driving cars. But this crash is a reminder that this technology has its limitations. Investigators from the National Highway Traffic Safety Administration are still piecing together the details of the accident, but they do know the basics: The driver, Joshua Brown, had his car’s Autopilot mode activated while driving down a highway in Florida, when a tractor-trailer made a left turn in front of the car. The car didn’t stop -- it went under the trailer, then hit a fence and a power pole. Brown was killed, and Autopilot didn’t save him.

When humans drive cars, we’re following the road, keeping track of people, bikes, and other cars, and looking out for any sudden changes that might mean we have to swerve or stop to avoid an accident. Tesla’s Autopilot feature, which is mainly meant for highway driving, is the most advanced computer-controlled driving system available to consumers right now. Like other autonomous driving systems, it combines a bunch of different pieces of technology to let the car collect and process information kinda like a human would, and drive itself. A GPS system gives the car information about things like where it is and the speed limit.

A camera on the front of the rear-view mirror and a radar sensor on the front grille can scan about 160 meters of road ahead of the car. There are also 12 ultrasonic sensors that send out pulses and measure how they echo -- kinda like how bats and dolphins navigate -- to map out the 5 meters around the whole car. The car’s computer sorts through all the data from the camera and sensors, keeping track of the information it needs to navigate the road -- like lane markers and roadside barriers -- and the movement patterns of other vehicles.
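As an aside, the echo-ranging idea those sensors rely on is simple enough to sketch in a few lines of Python. This is just an illustration of the underlying physics, with made-up numbers, not anything from Tesla's actual software:

```python
# Echo ranging, the idea behind the ultrasonic sensors described above:
# time how long a pulse takes to bounce back, multiply by the speed of
# sound, and halve it because the pulse travels out and back.
# (Illustrative values only.)

SPEED_OF_SOUND = 343.0  # meters per second in air at about 20 degrees C

def echo_distance(round_trip_seconds: float) -> float:
    """Distance to an obstacle from a pulse's round-trip time."""
    return SPEED_OF_SOUND * round_trip_seconds / 2

# An echo returning after 29 milliseconds puts the obstacle near the
# edge of the ~5-meter bubble mentioned above: 343 * 0.029 / 2 ≈ 4.97 m.
print(f"{echo_distance(0.029):.2f} m")
```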

Then, it maneuvers the car down the road, watching out for any sudden changes nearby. If a truck in front of the car stops suddenly, for example, the computer will stop the car, as well. Computers do have advantages over humans when it comes to this stuff: they can constantly monitor every direction without getting distracted or tired, and their response times can be much faster than our reflexes.
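To make that concrete, here's a minimal sketch of the kind of check a system like this can run on every cycle, without ever tiring. The threshold and names are our own assumptions for illustration, not Tesla's logic:

```python
# A toy time-to-collision (TTC) check: brake if, at the current closing
# speed, the car would reach the obstacle ahead too soon.
# (Hypothetical threshold; real systems are far more sophisticated.)

BRAKE_IF_TTC_BELOW = 2.0  # seconds -- an assumed safety margin

def should_brake(range_m: float, closing_speed_mps: float) -> bool:
    """Return True if estimated time-to-collision is below the threshold."""
    if closing_speed_mps <= 0:  # obstacle holding distance or pulling away
        return False
    return range_m / closing_speed_mps < BRAKE_IF_TTC_BELOW

# A truck 40 m ahead, closing at 25 m/s (90 km/h): TTC = 1.6 s, so brake.
print(should_brake(40.0, 25.0))  # True
```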

But we humans have our advantages, too. We have really good sensory perception and decision-making skills that are far more advanced than any computer program. According to Tesla, this is the first fatality in over 200 million kilometers that users have driven with Autopilot mode enabled, while with human drivers, on average there’s a fatal car accident for every 150 million kilometers driven.

Now, the scientist in me has to say that this isn’t a valid comparison. Teslas are very safe cars with advanced safety systems, so they’re bound to have fewer deaths per kilometer driven, with Autopilot on or off. The average car on American roads is more than 10 years old, and so are its safety features. And, of course, a sample size of one is pretty statistically useless.
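To put a number on that last point: with only one event observed, the uncertainty around Autopilot's fatality rate is enormous. Here's a quick check using the episode's figures and a standard exact Poisson confidence interval (the calculation is ours, not from the episode):

```python
# 95% exact Poisson confidence interval for a rate estimated from a
# single observed fatality in ~200 million km of Autopilot driving.
from scipy.stats import chi2

observed = 1           # one fatality so far
exposure_km = 200e6    # kilometers driven on Autopilot, per Tesla

lower = chi2.ppf(0.025, 2 * observed) / 2        # ~0.025 expected events
upper = chi2.ppf(0.975, 2 * observed + 2) / 2    # ~5.57 expected events

print(f"95% CI: one fatality per {exposure_km / upper:,.0f} km "
      f"to one per {exposure_km / lower:,.0f} km")
# Roughly one per 36 million km up to one per 7.9 billion km -- a range
# that easily contains the human-driver figure of one per 150 million km.
```

In other words, one crash simply can't tell you whether Autopilot is safer or less safe than a human driver.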

Even so, it’s clear that, even at this very early stage, computer-driven cars are pretty good at their jobs. But self-driving car systems, of course, aren’t anywhere close to perfect, and Tesla is open about Autopilot’s limitations. They say that Autopilot is in a public beta test, meaning that the feature is still experimental and users are helping catch any bugs or other problems. The driver is supposed to have their hands on the steering wheel at all times, ready to take control of the car if need be. If the system detects that a driver doesn’t have their hands on the wheel, it’ll beep and show multiple warnings, and eventually stop the car. Having an alert driver is really important, because there are a lot of conditions where Autopilot can’t work safely.

If it’s snowing, for example, the car’s camera might not be able to see the lane markings on the road. Even driving up a hill can block the camera’s view. In those cases, where the computer realizes that it can’t accurately keep track of everything around it, it’ll turn Autopilot off and tell the driver to take control of the car.
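A minimal sketch of that fallback behavior might look like the following. The confidence score, threshold, and names are all hypothetical, just to show the shape of the logic:

```python
# Confidence-based disengagement: if the perception system can no longer
# trust its picture of the road (say, snow hiding the lane markings),
# it warns the driver and hands back control. (Assumed threshold.)

MIN_CONFIDENCE = 0.8  # hypothetical cutoff for trusting lane tracking

def autopilot_step(lane_confidence: float, engaged: bool) -> bool:
    """One control cycle: return whether Autopilot stays engaged."""
    if engaged and lane_confidence < MIN_CONFIDENCE:
        print("Take the wheel -- disengaging Autopilot")
        return False
    return engaged

engaged = True
for confidence in (0.95, 0.91, 0.42):  # snow starts obscuring the lanes
    engaged = autopilot_step(confidence, engaged)
print("Autopilot engaged:", engaged)  # False
```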

Other times, the computer just doesn’t react to sudden changes on the road the way that it should. That’s what happened during the crash that killed Brown. When the tractor-trailer -- which was white -- turned left in front of the car, the computer couldn’t see it against the bright sky, so it didn’t hit the brakes. And neither did Brown.

One of the chief concerns with Autopilot isn’t technological; it’s psychological. If you believe your Autopilot system is going to save you, you might not worry so much about being distracted. An aftermarket DVD player was found in Brown’s Tesla. Whether he was watching it at the time of the crash is unknown, but that’s the kind of behavior an Autopilot system might encourage, whether or not it’s expressly forbidden.

More research needs to be done, not just to make these systems better, but also on how they affect driver behavior, and how to ensure that drivers are using them properly. Either way, Brown’s death wasn’t Autopilot’s fault. The system didn’t cause an accident by driving dangerously; both it and the driver failed to detect a sudden change and prevent a crash.

In some ways, software has, for years, been a matter of life and death, but never so much as with self-driving cars. Yes, they have a good track record, and the technology will keep improving. But their human driving partners are a necessary part of the safety of these systems, and that’s going to be the case for quite a long time.

Thanks for watching this episode of SciShow News, and thanks especially to SR Foxley, our President of Space. If you want to help us keep making videos like this, and maybe be a President of Space yourself, go to patreon.com/scishow, and don’t forget to go to youtube.com/scishow and subscribe!