Can a robotaxi be prepared to deal with all kinds of passengers — adults, kids, the elderly and handicapped, men behaving badly, and obnoxious customers with unreasonable demands?
Unanticipated use cases and counter-intuitive man-machine interfaces often lead to the premature death of new products.
It’s not unusual for system designers to release their products into the wild, only to have users play with a product in ways that were never intended, sometimes resulting in accidents. Consumers have returned thousands of units of a new product to manufacturers simply because they couldn’t figure out how to use it.
Predicting the future is hard. It’s easy to see where things went wrong after they go wrong, but 20/20 hindsight is rarely a comfort.
Call it a “foresight gap.” The engineering community can be blindsided when it fails to anticipate how a completed system will be used and what could happen once it’s deployed in the real world.
The person who reminded me of this lesson — and summed it up succinctly — is Phil Koopman, CTO of Edge Case Research and professor at Carnegie Mellon University. He makes his case simply by asking one question: #DidYouThinkOfThat?
Recently, Koopman wrote a blog post, “Car Drivers Do More than Drive.” In it, he asked: “How will self-driving cars handle all the non-driving tasks that drivers also perform?” More specifically: “How will [self-driving cars] make unaccompanied kids stop sticking their heads out the window?”
Koopman is neither a Luddite nor a naysayer on autonomous vehicles (AV). If anything, he is an avid advocate for safe AVs.
Take a school bus for example. No one who has ever been on a school bus really believes that loading one with kids, unsupervised by an adult, is such a great idea. The movie version would be called “Kids Going Wild.”
Yet, at the Geneva Motor Show in 2018, the Volkswagen Group unveiled its own self-driving “school bus” concept, called SEDRIC (SElf-DRIving Car). Maybe German kids behave differently from all other kids? Seriously, what were they thinking?
This sort of engineering naiveté is the tip of the iceberg. There are many AV use cases that can simply go horribly wrong if they aren’t thought through.
A robotaxi, for instance, must be prepared to deal with all kinds of passengers – adults, kids, the elderly and handicapped, men behaving badly, obnoxious customers with unreasonable demands, or just people who get suddenly sick in the back seat.
Further, self-driving cars, with no human operator, must be able to communicate with the external world. How do pedestrians, unable to make eye contact with a driver, discern the robocar’s next move? Will the robocar be able to hear, let alone understand, what a human driver in the next lane is yelling about?
I asked Koopman to lay out specific self-driving use-case scenarios for which AV system designers should be asked the fundamental question: “Did you think of that?”
In our interview, Koopman limited his scope to AV/human interaction. “To be clear, not every car needs to handle all of these situations. But at least some self-driving cars will encounter at least some,” he added.
So, the question is whether designers have anticipated how to handle these. In other words: #DidYouThinkOfThat?
Koopman first laid out certain complications that could arise from basic passenger-robotaxi interaction.
Setting aside basic operations, the first #DidYouThinkOfThat moment hit me in 2016 when I was interviewing retired pilot Richard Hartman. I was asking him about differences between flying and driving.
He told me then that pilots usually spend a lot of time on flight plans before taking off. Once a plan is loaded into a computer, another pilot double-checks the information. One pilot reads it aloud, the other confirms. Then, the air traffic control system sets the course. If rough weather appears en route, controllers might alter the course away from the more direct path charted by GPS.
But when a person drives a vehicle, Hartman said, the trip is rarely so orderly. Assume an ultimate destination — home. En route, you might stop for coffee, meet a friend, pick up groceries. “Is it likely people will spend time pre-planning and doing data entry to chart the course?” Probably not. Besides, human drivers change their minds all the time along the way.
The most likely scenario in the real world is that people, more often than not, will “disengage the autopilot mode,” Hartman predicted.
This conversation took place well before “robotaxi” became a buzzword.
Even today, Koopman believes, “how do you select a destination” remains a basic system requirement for AV designers. That basic operation — destination choice — must be flexible enough to accommodate a passenger’s change of mind. People should be able to change their plans after they’ve ordered the cab.
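To make that requirement concrete, here is a minimal sketch of what treating destination changes as first-class, mid-trip events (rather than one-time data entry) might look like. All names here — `Trip`, `change_destination` — are hypothetical illustrations, not drawn from any real AV software stack:

```python
# Hypothetical sketch: a robotaxi trip that accepts destination
# changes mid-ride instead of refusing them. Illustrative only.

class Trip:
    def __init__(self, destination):
        self.destination = destination
        self.history = [destination]  # audit trail of rider requests

    def change_destination(self, new_destination):
        """Rider changed their mind en route; re-plan rather than refuse."""
        self.destination = new_destination
        self.history.append(new_destination)
        return f"Re-routing to {new_destination}"

trip = Trip("home")
trip.change_destination("coffee shop")    # stop for coffee first
trip.change_destination("grocery store")  # then pick up groceries
print(trip.destination)  # grocery store
print(trip.history)      # ['home', 'coffee shop', 'grocery store']
```

The point of the sketch is the audit trail and the re-plan path: the orderly, flight-plan-style workflow Hartman described simply never happens in a car.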
Koopman is particularly concerned by emergency situations in which passengers might issue unexpected orders. What does an AV do?
The emergency orders posed above assume that AVs can comprehend passengers’ verbal input. But do AVs really get me? “Just think of all the Siri and other voice command system malfunction stories,” said Koopman, “…except now it’s controlling a car.”
Koopman said, “What about mistaken verbal commands to the vehicle due to misunderstanding a human, misinterpreting music being played by a passenger as a command, advertising hack commercials that order the vehicle to take the occupants for drive-through food at a specific restaurant chain, or even the odd parrot that has learned a problematic vocabulary?”
OK. I’m not sure about the back-seat-driving parrot scenario, but the drift is clear. Nobody can be sure that voice control technology can ever operate at 100 percent accuracy. People don’t do nearly that well. AVs must be taught to ask, “Say what?”
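One defensive pattern — sketched below with made-up names and an illustrative threshold, nothing from a real voice stack — is to gate low-confidence recognitions behind an explicit confirmation instead of acting on them:

```python
# Hypothetical sketch: never act on a low-confidence voice command.
CONFIRM_THRESHOLD = 0.90  # illustrative value, not from any real system

def handle_command(text, confidence):
    """Return the action to take for a recognized utterance."""
    if confidence < CONFIRM_THRESHOLD:
        # The AV equivalent of asking "Say what?"
        return ("ask_to_repeat", f"Did you say: '{text}'?")
    return ("execute", text)

print(handle_command("pull over", 0.97))  # ('execute', 'pull over')
print(handle_command("pull over", 0.55))  # asks the rider to confirm
```

A real system would need far more than a single threshold — speaker identification to ignore the radio (or the parrot), and hard interlocks on safety-critical commands — but the principle is the same: when in doubt, confirm before the car moves.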
We’ve all heard this question: “How do pedestrians know that the car sees them in a crosswalk?” Acknowledging that this has been discussed before, Koopman cautioned: “Where things get a lot more complicated are at the next level down.” Consider, for example:
Koopman added, “We'll have to think about what expectations there are of passengers.” Even for a completely self-driving vehicle, he asked, “Will there need to be a ‘rider's license’ to ensure some form of adult supervision is present inside any vehicle?”
Consider, for example:
Designers should never underestimate consumers’ ability to misuse a new product. Koopman asked:
In collaboration with Edge Case Research, Underwriters Laboratories is currently developing a UL Standard, dubbed UL 4600, to cover autonomous product safety. Koopman told us that a lot of UL 4600 content will include the laundry list above. And a lot more.