This is mostly a fun post, though I hope it may trigger some thinking. Since there is much hype about self-driving cars, I decided to express myself artistically and draw a bunch of situations that the current technology most likely could not deal with. Some of them might be funny, some might be dangerous. Ask yourself whether you would want to be driven by an entity that cannot understand these situations, and if you are developing a self-driving car, let me know in the comments whether such situations could indeed be misinterpreted. I'll start with just a few; it takes time to draw them and I'm not a very good artist. Email me if you have ideas for additional situations.
Artwork 1: Open manhole. Unlike a regular pothole, driving into an open manhole can lead to disaster. Would a driverless car figure that out? If you consider an open manhole without construction cones an unlikely possibility, take a look at this video.
Artwork 2: Stop sign prank. Somebody (bored teenagers?) put tens of stop signs in the middle of a desert road. Will a driverless car stop at each one of them?
Artwork 3: Fire in a tunnel. An autonomous car approaches a tunnel, but something is horribly wrong: black smoke is pouring out. However, no obstacle registers on the lidar and the road seems clear. The suffocating smoke enters the cabin... If you doubt that a situation like this can occur, check this video.
Artwork 4: A tornado is crossing the road. The car's sensors register high winds, similar to a desert sandstorm; there is, however, a subtle difference. The car slows down but proceeds. If you consider this scenario extremely unlikely, check this video as well as many others on YouTube.
Artwork 5: What are these guys doing in the middle of nowhere? Are they real soldiers or police? Or are they bad guys? Maybe it is an ambush? The answer may depend on the social and political context. Should the car turn back? Should the car stop?
Artwork 6: The stop sign detached from a school bus and is swinging in the wind in the open position. It is clearly a stop sign of the right size and shape, but the context (75 mph on a multi-lane freeway) is clearly wrong. What to do??
Exhibit 7: This one could not be drawn, so instead I've taken a short video. The bright lanes visible on the street in these illumination conditions are not actually the real lanes. They are remnants of long-forgotten construction, painted over with black paint. However, close to noon, in direct sunlight, these black marks reflect light very strongly and appear much more visible than the real, barely visible lanes. Although this could potentially confuse people as well, the vast majority of drivers follow the true, barely visible lane, since they likely remember which one is which. Would a self-driving car have the ability to memorize things like that on the fly? Or would it be confused every time it encounters this situation?
Artwork 8: A spider made its web in the wrong place. Although in this particular example the situation could be salvaged by using input from multiple cameras, it highlights an important point: the car cannot act to fix an obstructed sensor. Humans easily clean their windshields, use sun shades, and move their heads when they cannot identify an object. In the extreme case they can get out of the car to inspect the situation directly. A robotic car, though it may have superior sensors, cannot act on them like humans do and can only passively rely on what those sensors report.
Artwork 9: An avalanche is rolling down the hillside. The car is completely clueless that there is any danger, and so it proceeds firmly into the danger zone...
The nasty thing about reality is that such situations occur, but occur too rarely to be statistically significant (these are sometimes referred to as long-tail events). The exact context of these events also varies enormously. Once you patch one hole, many others open up. You patch the stop sign prank, and it turns out the system is susceptible to the yield sign prank. You patch that, and something else surfaces. And as cybersecurity teaches us, as soon as such a susceptibility becomes evident, there will be people taking advantage of it (more intelligent agents exploiting less intelligent agents). It will be a long time before the self-driving car is on par with humans when it comes to intelligence; until then it may be subject to exploitation by various more or less rogue adversaries (aside from just dealing with the complexity of the world).
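To get a feel for why long-tail events resist statistical validation, here is a small back-of-the-envelope sketch (my own illustration; the event rate is a made-up, hypothetical number, not a measured statistic) of how many test miles are needed before such an event is even likely to show up in the data at all:

```python
import math

# Hypothetical rate: one particular long-tail event per 100 million miles.
# (An illustrative assumption, not a real-world figure.)
p_per_mile = 1e-8

def prob_at_least_one(miles: float) -> float:
    """Probability of observing the event at least once in `miles` of
    testing, assuming independent per-mile events (Poisson approximation)."""
    return 1.0 - math.exp(-p_per_mile * miles)

for miles in (1e6, 1e8, 3e8):
    print(f"{miles:>13,.0f} miles -> P(seen at least once) = "
          f"{prob_at_least_one(miles):.3f}")
```

Under this toy assumption, a million test miles gives only about a 1% chance of ever encountering the event, and even a hundred million miles leaves a roughly one-in-three chance of never seeing it. Observing it a handful of times, enough to say anything statistically, takes far more; and each distinct long-tail scenario imposes this cost separately.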
These pictures are just meant to serve as a mental exercise for those who build self-driving cars and those who want to use them. These are clearly not the usual road conditions, but at the same time not so unusual as to be unthinkable. In fact, in my driving life I've encountered at least one such situation. It was not dangerous, but I could easily have been dead by now had I acted foolishly on it. Would you place that much trust in your self-driving car?
So would I want an autonomous car? For now, I'd love to have a car that protects me from the stupid things I could do. It monitors my surroundings and alerts me about obstacles. It may even engage the brakes (or better yet, dampen the accelerator) when for some reason I don't respond. Pulls me back when I veer out of a lane (if the lane is clearly visible). Monitors whether I'm paying attention and warns me when I get too tired. Helps me park in a tight spot (or even self-parks under my supervision). Checks whether someone is holding their fingers in a dangerous place next to the doors (before they close). Warns me of people or animals about to cross the road. Does things mostly in a passive way, increasing safety while leaving the high-level decisions to the driver and always allowing for an override. That is a great copilot, and many of these features are already available. But not an autopilot. Not yet.
Disclaimer: by autonomous I mean level 5 autonomy, where the passenger does not have any control, and possibly not even any awareness of the situation (e.g. no transparent windshield). The human inside is entirely at the mercy of the AI driving the car. Level 4 autonomy obviously allows the driver to take over in unusual conditions, but carries its own problems. The question is whether the driver would even pay enough attention and be able to act when such situations occur (some of the recent accidents with Tesla's Autopilot suggest likely not).