« Previously: Knowing the limitations of Machine Learning  

Beyond machine learning, another hole that Carnegie Mellon University professor Philip Koopman has found in the Federal Automated Vehicles Policy is the DoT’s half-hearted stance on the “independence” of safety assessment.

The current language in the federal policy reads: “Testing may be performed by manufacturers and suppliers but could also be performed by an independent third party.”

In the eyes of safety experts, this sentence has no teeth. Koopman emphasized that in safety-critical system domains, “The requirement for independence is mandatory, not optional.”

He pointed out, “In practice, independence of validation is essential.”

In contrast to other industries, automakers have been coddled. They’re permitted to operate under the privileged framework of “self-certification.” Recent events like Toyota’s sudden unintended acceleration cases, General Motors’ ignition switch recalls and the massive Takata airbag recalls illustrate the abject failure on the part of carmakers to self-police safety.

Koopman said, “It is high time that the automotive industry joined the rest of the world in being required to have a transparent method of independent safety assurance.”

Independent certification isn’t limited to handing the task to the federal government or requiring automakers to hire an independent third-party certification company such as Exida or TÜV, he said.

Consider, for example, the aerospace industry, Koopman said. The Federal Aviation Administration allows “designated engineering representatives” (DERs) to perform safety inspections. These engineers are paid by an aircraft maker like Boeing or Airbus, but their ethical allegiance is pledged to the FAA, Koopman explained.

“Personally, I don’t think the government should stick its nose in every line of code,” said Koopman. “But there is an existing model like hiring consultants or DERs to do the independent safety assurance.” Koopman hopes to see the automotive industry build a much more solid safety culture.

More articulation needed in the policy

In responding to the DoT’s proposed automated vehicle policy, Koopman offered comments (beyond ML and independent testing) in seven additional areas where he believes the document can be improved.

Taking issue with some of the language in the policy document, Koopman wants the DoT to articulate the following:

  • A safety reassessment triggered by any software update, whether minor or major
  • A strategy for when the “fall-back” triggering system itself goes brain-dead
  • A methodology for determining “reasonable” exceptional scenarios
  • Assurance of the integrity of self-reported crash data (Can you trust your event data recorder?)
  • A vehicle diagnostic system (failure detection) that accounts for components’ end-of-life issues
  • NHTSA funding for the development of a taxonomy of traffic rule “exceptions” (every city has its own, like the notorious “Pittsburgh Left”)
  • Safety assurance in driver takeover strategies

Following are some examples Koopman gave us to explain the changes in language he hopes to see for each of these items.

Next: Time-tested safety approaches and the Pittsburgh left »