In a verbose blog piece published July 24, GM Cruise CEO Dan Ammann finally abandoned the goal of launching a commercial robotaxi service in 2019. The online reaction can best be summed up using a line which my wife says to me frequently: “There are as yet undiscovered tribes who knew you were going to say that!”

Ammann buried the lede, softening the news with the most innocuous phrasing possible: “In order to reach the level of performance and safety validation required to deploy a fully driverless service in San Francisco, we will be significantly increasing our testing and validation miles over the balance of this year, which has the effect of carrying the timing of fully driverless deployment beyond the end of the year.”

Translation: Cruise isn't giving up on robotaxis — it's just a delay.

That this plan barely differs from the oft-cited definition of insanity — doing the same thing over and over and expecting a different outcome — probably won’t occur to Ammann until GM, under pressure from investors, says 'stop' and the money runs out.

2019 has turned into an annus horribilis for the robotaxi industry, with the Cruise announcement coming less than a week after publication of the NTSB report into the 2017 Navya shuttle crash in Las Vegas — the details of which are summarized nicely by Edward Niedermeyer in an article titled "NAVYA Shuttle Incidents Show Risks Of Even A Low-Speed Rush To Autonomy."

Outside of Silicon Valley, everyone knows crazy stuff happens all the time on roads. Unpredictability isn’t an “edge case” that can be fixed with AI; it's the default setting for the real world. In the Las Vegas incident, the collision could have been avoided had the shuttle simply backed up — a maneuver any experienced human driver would have executed instinctively.

What these and other “self-driving” events prove can be summarized as follows:

• Machines are exceptional at repetitive, highly predictable, clearly defined, rules-based tasks. Humans are completely hopeless at staying focused in such an environment.
• Humans are excellent at using initiative, improvising and going “off-script” where necessary. Machines — yes, even those with Nvidia processors — are not.

Roads are complex systems, and everyone who uses them — pedestrians, cyclists and drivers — has first-hand experience of complexity theory: the pattern of order giving way to disorder and back to order again. Complex systems therefore pose substantial challenges to the design and validation of machines, which work best with predictability.

It is complexity theory that currently blocks the path to removing the safety driver from “self-driving” robotaxis altogether. If the tech industry had taken the time to read Simply Complexity by Neil Johnson, or understood that driving the same streets over and over (à la Cruise and Uber) proves nothing about safety, it could have saved itself a few dollars — and probably the life of Elaine Herzberg, the pedestrian killed by an Uber test vehicle in Tempe, Arizona in 2018.

While most humans drive erratically some of the time, the human body itself is as predictable as my jokes are to my wife. The study of the body’s psychological and physiological patterns of behavior is called Human Factors, and dedicated research has been underway for several years to understand distraction and fatigue as they relate to driving.

This work will begin to appear in series-production vehicles from next year, in the form of advanced driver monitoring systems (DMS) from Seeing Machines — whose technology is best known from GM’s Super Cruise system in the Cadillac CT6. Expect DMS to be installed in many mass-market vehicles over the period 2021 to 2025, with other suppliers including Eyesight, Jungo Connectivity and Smart Eye.

The safety benefits of DMS have already attracted the attention of lawmakers, with legislation moving through the European Parliament to mandate the installation of this technology in all new vehicle models sold in Europe from May 2022. Euro NCAP’s 2025 roadmap also specifies IR vision-based DMS as a primary safety system for achieving a five-star rating from 2022. Expect other regional NCAP bodies to follow Europe’s lead — and very probably IIHS in the US, too.

Since 2014, Tesla has terrified the conventional auto OEMs with its aggressive deployment of untested and unproven automated driving technology, such as Autopilot. Tesla’s strategy reveals a company with no clear idea whether it is an auto OEM or a tech company, and with that lack of clarity it has ended up doing both badly.

For the last three years, GM has had both Cruise and Super Cruise technology at its disposal. In an obvious attempt to “out-tech” Tesla and to keep up with Waymo, GM focused its resources on Cruise. One can only speculate how different the future of GM would look now had it instead invested those resources in deploying a version of Super Cruise on every model it produces worldwide.

The future of driving is a human/machine collaboration, and for the auto industry the goal is simply to use technology to make human drivers safer. If GM’s objective really is road safety, its words and its actions are incongruent. GM is not Tesla. So, over to you, Mary Barra: it is time for the General to stand at the front and lead.