Uber self-driving car on its side, March 25, 2017, Arizona
In the last week I’ve encountered half a dozen situations that were difficult for me, a human, to figure out and drive through. Too bad they require my full attention and I cannot snap photos through the windshield.
In one, the signals were flashing on a 10-lane road (3 thru lanes and 2 turn lanes on each side; NASA Parkway, Houston, TX). Usually drivers treat this as a 4-way stop, which is complicated enough with so many lanes. But a policeman was dragging a fold-up stop sign into the road. He was several lanes away. Wait for him? Oh, there is another policeman farther away, also walking. He makes some vague gesture. I think he is telling the people to my right to go. He gestures again and looks straight at the group of cars I’m in. OK, I cautiously go.
What would Uber do? Probably treat it as a 4-way stop. But how would it figure out whose turn it was? That is extremely difficult in multi-lane intersections with turn lanes. Would it run over the policeman (either of them) because they “failed to yield,” as the other driver did in the wreck pictured above? Most of my attention as I drive down the road is not on normal, expected behavior; it is spent estimating whether approaching cars, either on side roads or in other lanes, are really going to yield as they should. This is a “dance” that people do. One develops a feel for it, and it varies with culture and even by city or neighborhood.
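To see how thin the “easy” rule is, here is a minimal sketch (my own hypothetical illustration, not any vendor’s actual logic) of textbook 4-way-stop right-of-way: earliest arrival goes first, and on a tie the car yields to the vehicle on its right. Even this toy version assumes one car per approach and returns no answer when all four arrive together; it says nothing about ten lanes, turn lanes, or a policeman waving.

```python
from dataclasses import dataclass

# If you approach the intersection from direction `a`,
# cars approaching from RIGHT_OF[a] are on your right.
# (Approaching from the south means heading north, so your
# right hand points east.)
RIGHT_OF = {"S": "E", "E": "N", "N": "W", "W": "S"}

@dataclass(frozen=True)
class Car:
    name: str
    approach: str        # direction the car approaches FROM: N, E, S, W
    arrival_time: float  # seconds since some reference point

def next_to_go(waiting):
    """Pick which waiting car proceeds at a 4-way stop:
    earliest arrival first; exact ties defer to the car on the right."""
    earliest = min(c.arrival_time for c in waiting)
    tied = [c for c in waiting if c.arrival_time == earliest]
    if len(tied) == 1:
        return tied[0]
    # Tie: a car may go only if no tied car sits on its right.
    tied_dirs = {c.approach for c in tied}
    for c in tied:
        if RIGHT_OF[c.approach] not in tied_dirs:
            return c
    # All four arrived simultaneously: the written rule gives no
    # answer -- exactly the gap drivers close with eye contact,
    # creeping forward, and gestures.
    return None

# Example: north and east approaches arrive at the same instant;
# east yields to the car on its right (north), so north goes.
print(next_to_go([Car("a", "N", 0.0), Car("b", "E", 0.0)]).name)
```

The hard part is everything the function does not model: the tie case already has no rule, and real intersections add lane-by-lane conflicts, drivers who waive their turn, and humans directing traffic.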
Last night, driving on Mississippi 17 between Vicksburg and Yazoo City, I saw blue blinking lights ahead in the dark. I slowed down, thinking I should be prepared to pass in a different lane or, with no other lane available, perhaps go very slow. My estimate of how slow kept decreasing. I can’t imagine it being set by an algorithm. When I got there, a wrecker was parked at an angle in the road, extending into my lane. Apparently someone had gone off into the field, their car’s wheels so mired in the mud that the frame was dragging. I didn’t laugh. I’ve been there myself. On the other side a man was waving a little light back and forth. Eventually I assumed he meant for me to go, as no one appeared to be coming from the other direction. He didn’t shout or change what he was doing, so that must have been the correct thing to do. I might have had to go off the pavement a little bit.
Frankel-ly, Gor-such a case got into the news last week. It seems that even Supreme Court nominees cannot decide what to do in common road situations without engaging in the “absurd.” The subject was Gorsuch’s dissenting opinion on a Minnesota truck driver whose company left him in 14-below weather to freeze. After many hours, rather than die, he unhitched the trailer, whose brakes had frozen, and drove onto the highway.
Suppose you are in an Uber or Google or Apple car or whatever, and it is 14 below, and the damn thing won’t go because some warning light is flashing, or maybe it simply cannot see the lane markers?
Even if it can see lane markers and cars, which might cover 90% of freeway driving, that’s only, what, 5% of residential driving? 0% of special situations? Of course the easy part can be automated. I heard a paper on automated driving at Oklahoma University in 1968. Since then, computers have gotten smaller, cheaper, and faster, and sensors have improved, but I’m not aware of any progress on understanding special situations.
Frankly, the full range of human judgment is required in such situations, and even a judicial automaton is not adequate. I don’t want Gorsuch as a justice on the Supreme Court (where ALL the cases are difficult), nor do I want self-driving cars, at least not until they can fully pass written and driving tests as humans do, under all conditions.