A growing body of evidence indicates that electronics, aerospace and other technology manufacturers, under unrelenting competitive pressure, are placing time-to-market considerations ahead of reliability — even in mission-critical systems. In the stunning case of the Boeing 737 Max, evidence is mounting that the airplane maker, abetted by regulators, compromised safety in its quest to compete with rival Airbus.

Those competitive pressures extend across technologies, with consequences ranging from botched robotic surgeries to runaway autonomous vehicles. As more horror stories emerge, might the technology pendulum swing back toward a greater emphasis on reliability and safety?

Consumer recognition of the problem might be the first step. A recent survey by Barclays Investment Bank found that 44 percent of fliers would wait at least a year to fly the 737 Max. Just 20 percent said they would fly the troubled aircraft if and when it reenters service, while 52 percent said they would book a flight on another aircraft.

“Yes, we are absolutely trading reliability to get products to market in the driverless car space, but the situation with airplanes is a little more complicated,” noted Mary “Missy” Cummings, a Duke University professor of electrical and computer engineering. “It’s less about time to market as it is to reduce costs.”

On the basis of 'equivalence'

Cummings and others have highlighted the unsettling shift toward self-certification of new products on the basis of equivalence — that is, new aircraft or medical procedures are deemed equivalent to existing ones despite clear differences. In the case of the 737 Max, Boeing effectively hid the existence of a now-notorious piece of software known as the Maneuvering Characteristics Augmentation System, or MCAS.


“The 737 MAX is not the only airplane to be approved based on equivalence,” Cummings warns. “The real question is how many other aircraft from any manufacturer have received such approvals?”

The aftermath of the March 10, 2019, crash of Ethiopian Airlines Flight 302, which killed all 157 passengers and crew. (Image: NPR.org)

Software developer and early Boeing critic Gregory Travis has attributed at least some of the slipshod engineering to “cultural laziness” within the software development community. The prevailing attitude, he suggests: if a bug surfaces this week, just push out a patch next week.

“Because more and more of the hardware that we create is monitored by and controlled by software, that cultural laziness is now creeping into ‘hard engineering’… Like building jet airliners,” Travis observed shortly after the Ethiopian Airlines 737 Max crash in March 2019.

Others aren’t so sure, still willing to give Boeing the benefit of the doubt. In declining to comment on “newspaper rumors” about the 737 Max design, Nancy Leveson, a respected MIT professor of aeronautics and astronautics and an authority on systems and software safety, said: “I have not seen the actual [MCAS] design and I doubt that most of the people commenting on it have either.”

Bit flip?

Fair enough. But a new problem has since surfaced with the 737 Max flight control system. On the case at last, the U.S. Federal Aviation Administration uncovered a potential hardware fault in the 737 Max flight control computer. The fault reportedly involves the random flipping of bits in the microprocessor, likely caused by radiation striking chip circuitry.

The phenomenon is widely known. A single bit flip was alleged to have caused sudden acceleration in a Toyota Camry in a wrongful death case that ended up in an Oklahoma court in 2013.
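Safety-critical systems typically guard against such single-event upsets with error-correcting memory or redundant computation. As a rough illustration — a minimal sketch of the general technique, not Boeing’s or Toyota’s actual implementation — a bitwise majority vote across three redundant copies of a value can mask a bit flipped in any one copy:

```python
def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority across three redundant copies of a value.

    For each bit position, the output takes the value held by at
    least two of the three copies, so a single-event upset that
    corrupts only one copy is masked.
    """
    return (a & b) | (a & c) | (b & c)

# One copy suffers a radiation-induced flip of bit 3; the vote
# still recovers the original value.
stored = 0b1010_1100
corrupted = stored ^ 0b0000_1000
assert majority_vote(stored, stored, corrupted) == stored
```

This is the idea behind triple modular redundancy: the scheme tolerates one faulty copy per bit position, which is why avionics designs often pair it with independent, dissimilar hardware channels.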

What irks experienced pilots, approached by EE Times, is not only the concealment of MCAS but the seemingly cavalier manner in which Boeing engineers fundamentally changed the 737 flight-control architecture.

“Flight controls are life and death; they're the limbic system of an airplane,” said one retired Air Force pilot. “Many different aircraft have stability augmentation systems on one or more axes. You could even say an autopilot has much the same role.

“My experience with them is they're grudgingly, guardedly permitted to operate, and pilots practice overriding them until it's instinctive. But that requires knowing all about them,” the veteran pilot stressed.

The confluence of market pressures, slipshod engineering, a mission-critical application (air transportation), a lax regulatory regime and faulty risk analysis has underscored the potentially deadly consequences of rushing technologies to market. In a recent study conducted for NASA, Duke’s Cummings noted: “There is a complex relationship between lower-level engineers developing new or improved technologies, the agencies that oversee these projects and the stakeholders who use and are affected by this technology.”

Concluded Cummings: “While too much regulation can have the unintended side effect of stifling innovation, when an oversight agency leans too much on the side of technology developers, then stakeholders affected by the technology can be at a disadvantage.”

Or, in the case of 346 passengers and crew aboard two Boeing 737 Max aircraft, dead.

Boeing’s ordeal since the 737 Max crashes has undoubtedly generated a ripple effect that might engender a kind of enlightened self-interest among technology manufacturers. Even if corporations act in their own self-interest to avoid bad publicity, lawsuits and financial ruin — as of mid-August, Bloomberg estimated Boeing’s losses attributed to abandoned 737 sales and related costs at $8.3 billion and climbing — we consumers could still benefit in the long run from increased scrutiny that might just yield greater reliability and safety.

Perhaps the 737 Max mess and the stunning lack of oversight by federal regulators will have a salutary effect on manufacturers, prompting them at last to resist competitive pressures and produce airplanes, robots and driverless cars that are more reliable and therefore safer.

Reliability

Indeed, reliability is emerging as a market differentiator. A survey released in August by IHS Markit identified reliability as the No. 1 driver for datacenter investments, topping considerations such as application performance or security. Reliability was cited by 78 percent of those polled, reflecting what the survey said was “the importance enterprises are placing on ensuring their networks run seamlessly.”

Of course, a datacenter outage isn’t a life-or-death situation. But living up to service-level agreements might be a small, first step toward greater technology reliability.

As the 737 Max saga shows, the tension between bean counters and metal benders has reached unprecedented levels. But the avoidable Lion Air and Ethiopian Airlines tragedies paradoxically provide an opening for engineers to reassert the absolute necessity of reliability and safety in every product they design and manufacture.