Regular readers by now may have gathered that I'm skeptical about the current self driving car hype. To make things clear: this is not because I would not like to use a driverless car, or because I think it is fundamentally impossible. My skepticism is merely caused by my concern that the technology we have right now is not mature enough for such an application. That includes both the fundamental technological primitives in the space of AI as well as economic feasibility. Also, the increasing hype and sensational press reports are not helping the realistic, fact-based discussion that should take place.
The argument that is often repeated in the popular press and used by proponents of autonomous cars is that they will be much safer than humans. This argument is very potent and emotional, as nearly every one of us has had a relative killed in a car accident, and the number of these accidents is still too high (even though in absolute terms motor vehicle related fatalities are very rare). I would certainly like to see improved safety by whatever means; the lowest-hanging fruits in this space, I think, are: better training and testing of drivers (the current situation with the DMV is a joke), improving infrastructure, more education about the effects of distracted driving, enhancing and promoting mass transportation, etc.
Among these measures, there certainly is a place for driverless cars one day, but as of today there is no evidence they would actually be safer. Let's get to the numbers.
According to the source here http://www.iihs.org/iihs/topics/t/general-statistics/fatalityfacts/overview-of-fatality-facts the average rate of motor vehicle related fatalities is approximately 1.12 per 100 million miles driven in the USA. Approximately one third of those killed were alcohol intoxicated (BAC > 0.08%), and approximately half were unbelted. If we exclude those, the figure drops to roughly 1 fatality per 200 million miles. In California the rate is 0.95 deaths per 100 million miles to begin with, slightly below the US average, likely due to good weather, infrastructure and a relatively modern fleet of cars. So generally we are in the ballpark of 1 death per 100-200 million miles driven. The link above contains many other breakdowns and is a fascinating read.
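To keep the arithmetic explicit, here is a minimal sketch of this back-of-envelope step in Python; the factor of two for excluding drunk and unbelted victims is the rough, overlapping fraction quoted above, not an exact adjustment:

```python
# Rough human-driver baseline, using the figures quoted above.
us_fatality_rate = 1.12 / 100e6            # fatalities per mile driven (USA average)
miles_per_fatality = 1 / us_fatality_rate  # ~89 million miles per fatality

# Very rough assumption: excluding drunk and unbelted victims removes about
# half of the fatalities (the two groups overlap), roughly doubling the figure.
sober_belted_miles_per_fatality = miles_per_fatality * 2

print(f"USA average: ~{miles_per_fatality/1e6:.0f} million miles per fatality")
print(f"Excluding drunk/unbelted: ~{sober_belted_miles_per_fatality/1e6:.0f} million miles per fatality")
```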
Now for the test vehicles commonly called "self driving cars". The detailed disengagement report for the state of California is available here https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/disengagement_report_2016 .
Roughly, these are the figures we are seeing:
| Company | Miles per disengagement |
|---|---|
| Waymo/Google | 5128* |
| BMW | 638 |
| Nissan | 246 |
| Ford | 196 |
| Delphi | 17.6 |
Note that Tesla, which is making a great fuss about their misleadingly named ADAS system, is not even in this table, as they have only driven 550 miles with a disengagement approximately every 3 miles. Now it is important to note that the definition of a "disengagement event" may vary between companies. Most companies report every case in which a human grabs the wheel for any reason. Waymo (*) only reports events in which, if not for the human intervention, the car would actually have caused a dangerous situation [read more here]. The way they do it is: for every physical disengagement they gather all the sensor data and then simulate multiple scenarios. If these scenarios lead to a dangerous situation, the event is reported. According to Waymo, in 2016 nine events would have led to the car hitting an obstacle or another road user, approximately 1/10 of all the disengagements they reported (124). Hence the large gap between Waymo and the rest of the pack.
Nevertheless, even if we take only the events in which Waymo admits there would have been a crash, we get a crash every 635868/9 = 70652 miles. If we conservatively assume that every 100th disengagement for the rest of the pack would have led to an accident, we get between 1760 and 63800 miles driven before a crash. If, moreover, we equally conservatively assume that 1/10 of these crashes would have led to a fatality, we get a death every 17600-706520 miles. Admittedly there are many assumptions here, but I am trying to be very favorable to the self driving cars. Hence we are off by two to four orders of magnitude in terms of safety compared to humans. In this context the often repeated statement that "humans are such bad drivers" seems like a joke.
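To make the comparison explicit, here is a minimal sketch of the same back-of-envelope estimate; the 1-crash-per-100-disengagements and 1-fatality-per-10-crashes factors are the assumptions stated above, not measured values:

```python
# Back-of-envelope comparison of the disengagement data against the human baseline.
# The crash-per-disengagement (1/100) and fatality-per-crash (1/10) factors are
# the assumptions from the text above, not measured values.
human_miles_per_fatality = (100e6, 200e6)           # rough human-driver range

waymo_miles_per_crash = 635868 / 9                  # Waymo's own "would have crashed" events
others_miles_per_crash = (17.6 * 100, 638 * 100)    # Delphi .. BMW, 1 crash per 100 disengagements

fatality_per_crash = 1 / 10
sdc_miles_per_fatality = (others_miles_per_crash[0] / fatality_per_crash,
                          waymo_miles_per_crash / fatality_per_crash)

low_gap = human_miles_per_fatality[0] / sdc_miles_per_fatality[1]    # most favorable case
high_gap = human_miles_per_fatality[1] / sdc_miles_per_fatality[0]   # least favorable case
print(f"Self driving estimate: {sdc_miles_per_fatality[0]:.0f} - {sdc_miles_per_fatality[1]:.0f} miles per fatality")
print(f"Humans are roughly {low_gap:.0f}x to {high_gap:.0f}x safer on this metric")
```

Running this reproduces the two-to-four orders of magnitude gap quoted in the text (roughly 140x to 11000x).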
So to reiterate: current data indicates that self driving vehicles are somewhere between 100x and 10000x less safe than human drivers, as measured by fatalities per mile driven. And this is data for California, where weather conditions are perfect or close to perfect; few if any of these tests were done in snow, heavy rain or a dust storm. To put this in perspective, getting into an autonomous car today (if one existed in deployment) would likely be far less safe than getting a ride with a drunk teenager who is texting every 5 minutes...
Does this mean self driving cars will never be safer than humans? I'm not saying that; one day they may be. But as of today (mid 2017), the data indicates they are far less safe. Now the tricky part is to close these missing 2-4 orders of magnitude in the safety numbers, and as any engineer will tell you, this is the really difficult part [also read the related blog posts here and here]. The low-hanging fruit is now gone, and to improve those numbers further the technology will have to deal with progressively more complex scenarios and conditions. Do we currently have a technology that could do it? Many will argue that deep learning is the silver bullet, but I personally don't agree, at least not in its current form. Although the general avenue of having a system that learns rather than preprogramming everything is largely correct, the current models try to memorize case by case and are not trained to reason about new, unseen situations (see my posts here, here and here for a more detailed discussion). I'm slowly working on applying the Predictive Vision Model (PVM) to scene understanding in the automotive domain, and I think in the long run it may address many of the problems I foresee. Nevertheless, even if self driving cars are coming, they will not get here anytime soon.