Tesla’s Autonomy Investor Day -- a carefully designed and expertly promoted live streaming event -- took place Monday in Palo Alto, Calif., just two days prior to the announcement of Tesla’s latest quarterly financial results.

It remains to be seen whether the event boosted investor confidence enough that the market will turn a blind eye to Tesla’s current financial status. In its Q1 2019 earnings announcement Wednesday, Tesla is expected to report a loss. The company recently disclosed that its vehicle sales fell 31 percent from the previous quarter.

Nonetheless, the event helped generate plenty of ink on Tesla. The newly disclosed technical details impressed some observers while leaving others skeptical.

Tesla used the opportunity to discuss everything from a home-grown deep learning accelerator and neural networks to what Tesla calls “full self-driving (FSD)” features and a fleet of driverless “robo-taxis” Elon Musk promised to deliver by the end of 2020. The dizzying array of topics dished out by Musk illustrated Tesla’s kitchen-sink approach toward autonomous vehicles, reminding many in the industry that Tesla is no ordinary car OEM.

Here’s a summary of what exactly Tesla announced, what the company did not say, and which claims left some industry observers skeptical.

Brute-force approach

Tesla unveiled details of its very first chip designed for a “Full Self Driving” computer.


Tesla's Full Self-Driving Computer Module (Source: Tesla)

As Ian Riches, vice president of global automotive practice at Strategy Analytics, put it, Tesla announced “effectively the most powerful computer yet fitted to a production vehicle."

Peter Bannon, Tesla’s director, took the lead, describing Tesla’s two-chip system in detail. It delivers a total of 144 trillion operations per second (TOPS) while drawing 72W.

He gave a detailed tour of the new chip -- a 260-square-millimeter piece of silicon featuring 250 million gates and 6 billion transistors -- fabricated by Samsung in Austin, Texas, using a 14nm FinFET CMOS process technology.

Bannon joined Tesla three years ago from Apple, where he served a critical role as a chip architect, enabling Apple to become a vertically integrated company with its own application processors.

Riches observed that the chip designed by Bannon's team combines cores dedicated to neural-net acceleration with CPU and GPU cores. “In a sense, it is not conceptually different to what Nvidia offers,” he added.

Mike Demler, senior analyst at The Linley Group, however, described Tesla’s chip as taking “more a brute-force approach [than Nvidia] to running convolutional neural networks.” The two new chips run in parallel to offer full processor redundancy, he noted.

Drawing the inevitable comparison, Demler said the major difference between Tesla’s new chip and Nvidia’s Xavier is that Tesla uses a much bigger inference engine.

Demler noted, Tesla “designed their own deep-learning accelerator (DLA), with 9,216 MACs/DLA, and there are 2 of them on each chip.” In comparison, Xavier has 2 DLAs with 2,048 MACs, along with a Pascal GPU, he added. The bottom line is that Tesla’s chip offers 9x the dedicated DLA MACs in Xavier.
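One way to reconcile the quoted MAC counts with the “9x” figure is to count all four DLAs in Tesla’s two-chip FSD computer against a single Xavier. This is an interpretation, not something Tesla stated; the sketch below uses only the numbers quoted above and assumes Xavier’s 2,048 MACs are per DLA:

```python
# Back-of-the-envelope check of the MAC counts quoted by Demler.
# Assumption: totals are per FSD computer (2 chips, 2 DLAs each)
# vs. a single Xavier (2 DLAs, assumed 2,048 MACs each).

TESLA_MACS_PER_DLA = 9216
TESLA_DLAS_PER_CHIP = 2
TESLA_CHIPS_PER_COMPUTER = 2

XAVIER_MACS_PER_DLA = 2048
XAVIER_DLAS = 2

tesla_total = TESLA_MACS_PER_DLA * TESLA_DLAS_PER_CHIP * TESLA_CHIPS_PER_COMPUTER
xavier_total = XAVIER_MACS_PER_DLA * XAVIER_DLAS

print(tesla_total)                  # 36864 MACs across the two-chip computer
print(xavier_total)                 # 4096 MACs in one Xavier
print(tesla_total / xavier_total)   # 9.0 -- one way to arrive at "9x"
```

On a single-chip basis the ratio would be 4.5x, which is why the counting convention matters.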


Tesla Neural Network Processor (Source: Tesla)

What puzzled Demler, however, is Tesla’s claim of 21x the frame/second performance. “That doesn’t make sense, even if you divide by two to eliminate the redundancy,” he said. “Not clear what part of the Xavier chip Tesla is using [for comparison],” he added.

Details of an apples-to-apples comparison of the rival chips’ specifications might have to wait for later analysis.

Tesla’s apparent lead in power efficiency is, however, one thing many seem to agree on. Strategy Analytics’ Riches noted that Tesla’s stated 144 TOPS at 72W of power consumption effectively translates into 0.5W per TOPS. Nvidia’s DRIVE AGX Xavier, in contrast, “can offer 30 TOPS (not 21 as stated by Tesla), at a power consumption of 30W (or 1W per TOPS),” said Riches.

There is one more thing to consider, though. Riches highlighted that Tesla and Nvidia differ in the setups for their chips.

On one hand, Tesla’s setup “is designed for assisted, not fully automated driving.” On the other, “Nvidia is offering fully automated driving via the Drive AGX Pegasus,” Riches added. “Drive AGX Pegasus provides 320 TOPS (so approx. twice the performance of Tesla), but at significantly higher power consumption (often quoted at around 500W, or 1.6W per TOPS),” he summed up.
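The efficiency figures cited by Riches reduce to simple watts-per-TOPS arithmetic. A quick sketch using only the numbers quoted above:

```python
# Power-efficiency comparison using the figures quoted by Riches.
# W/TOPS = rated power draw divided by rated throughput (lower is better).
chips = {
    "Tesla FSD computer": {"tops": 144, "watts": 72},
    "Nvidia DRIVE AGX Xavier": {"tops": 30, "watts": 30},
    "Nvidia DRIVE AGX Pegasus": {"tops": 320, "watts": 500},
}

for name, spec in chips.items():
    w_per_tops = spec["watts"] / spec["tops"]
    print(f"{name}: {w_per_tops:.2f} W/TOPS")

# Tesla FSD computer: 0.50 W/TOPS
# Nvidia DRIVE AGX Xavier: 1.00 W/TOPS
# Nvidia DRIVE AGX Pegasus: 1.56 W/TOPS  (the ~1.6 quoted above)
```

Note that the comparison mixes a two-chip computer (Tesla) with single platforms, and the vendors’ TOPS ratings are not necessarily measured the same way.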

During the event, Tesla’s Peter Bannon overwhelmed the audience with technical details about their new AI processor. Phil Magney, founder & Principal Advisor at VSI Labs, observed, “I suspect the detail was by design.”

For sure, all the performance specs spouted by Tesla for its chip “sounded impressive from a technical standpoint” to most people, Magney said. “But the event might raise more questions than answers.”

Captive chip business model?

Tesla is following Apple’s playbook by designing its own chips. But will it work for Tesla, whose business hasn’t exactly taken off yet?

Demler said, “That was pretty interesting.” What Demler heard was, “Pete Bannon said he asked Elon if he was ready to take on the expense of building a custom SoC. As long as it was the best, which they claim it is, he said at 1 million cars a year… sure. Let’s do it.”

But here’s the rub. “Tesla only produced ~250K cars last year, so they have a long way to go on breaking even. Just more Elon braggadocio,” said Demler.

Tesla is one company never afraid of being boastful, though. Quoting Tesla’s claim that “there was no other chip on the market that would be as efficient as their own design,” Magney said, “Tesla is banking on this architecture to take them beyond the limitations of their current Nvidia-powered device. They are already saying that the next generation of this chip will be ready in about three years.”

Truth about Full Self Driving (FSD)

As though AutoPilot weren’t confusing enough to the public, Tesla is now introducing another Tesla-defined driver-assistance feature set called Full Self Driving (FSD). It’s important to note that FSD does not match the automotive industry’s definition of Level 4 or Level 5 autonomous vehicles.

Magney explained, “FSD is essentially a feature set more commonly considered Level 3 with its ability to handle automated driving within the context of a highway.” Magney stressed that “the driver stays in the loop.”

He added, “Some of the features that used to be in the base AutoPilot are now part of FSD (such as enhanced summons and Navigate).”

As Tesla explains, ‘Enhanced Summon’ is a parking assist feature designed to let a vehicle navigate a parking lot autonomously and find its driver. ‘Navigate’ is an active guidance system that guides a car from a highway on-ramp to off-ramp, including interchanges and lane changes. But most consumers -- unless they are avid Tesla fans -- might not necessarily see such features as “full self driving,” because, after all, people still have to drive their cars.

Tesla is doubling down, though. It is now saying that Autopilot 3.0 is scalable and does have the ability to eventually reach L4. In Musk’s mind, all incremental improvements are possible on Tesla’s architecture, because that’s exactly how the new chip is designed.

What about safety?

Tesla’s status as an unconventional car company might let it get away with this, but safety clearly wasn’t one of its talking points.

Magney said, “For safety, the new AutoPilot module includes dual-core lockstep processing, which is pretty important, although not unique.” He added, “Tesla says their chip is AEC-Q100 certified, which is mainly a thermal rating. No mention was made about functional safety or ASIL-rated componentry.”

That said, Magney observed, “from a best practices standpoint Tesla said all the right things, such as freedom from interference, embedded security software, and efficiency of the new hardware.”

Tesla’s Robo-taxi plan

Much of the media coverage after Tesla’s Autonomy Investor Day zoomed in on Tesla’s robo-taxi plan, largely because this was the first time that Musk discussed such a plan, promising a launch by the end of 2020.

Magney, however, made it very clear that calling Tesla’s planned service a robo-taxi is “a stretch.”

“This is not a robo-taxi service as we have come to know it,” he said, because no driverless Tesla fleets are running around out there. Rather, this is really more of a car-sharing service, said Magney.

Here’s how Tesla’s plan works. “At first you can lend your car to the fleet and earn some extra cash. Until regulation permits, the Tesla Robo-Taxi is like any other L2+ vehicle on the road that you can rent. You drive it and you insure it!”

He added, “Tesla will enable this feature through a new app that allows you to share your car with others for a fee while Tesla earns a commission of 25-30%.” Magney views the app as “shrewd.” At the least, Tesla has created another source of income, he noted.

Tesla’s data collection

“Tesla made many references to the fleet of Tesla vehicles and their ability to collect more real-world data than anyone,” Magney observed. He believes this is noteworthy because his company, VSI Labs, has always regarded that fleet as a huge asset and an advantage over others.

He explained, “Tesla can interrogate a scene if it wants to. In other words, they collect data under pre-defined scenarios when the car gets into vulnerable situations. And the use of ‘shadow mode’ operation is a big deal for Tesla. In other words, Autopilot (AP) is always running whether engaged or not, constantly comparing human input with AP outputs.” Further, “Elon downplayed simulation, basically saying you cannot simulate the unknowns,” he added.
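The “shadow mode” idea Magney describes -- Autopilot always computing outputs and flagging frames where it disagrees with the human driver -- can be sketched as a simple filtering loop. This is a simplified illustration of the concept, not Tesla’s actual implementation; all names and the threshold are invented:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    human_steering: float  # driver's actual steering angle (degrees)
    ap_steering: float     # what Autopilot would have commanded
    engaged: bool          # whether Autopilot was actually in control

def shadow_mode_disagreements(frames, threshold=5.0):
    """Collect frames where the always-running Autopilot output diverges
    from the human driver by more than `threshold` degrees while the
    human is driving. Such frames are candidates for upload/retraining."""
    return [
        f for f in frames
        if not f.engaged and abs(f.human_steering - f.ap_steering) > threshold
    ]

frames = [
    Frame(human_steering=0.0, ap_steering=1.0, engaged=False),    # agree
    Frame(human_steering=12.0, ap_steering=-3.0, engaged=False),  # disagree
]
print(len(shadow_mode_disagreements(frames)))  # 1
```

The point of the scheme is that every human-driven mile doubles as a free test of the network, with only the interesting (disagreeing) frames needing to leave the car.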

‘Lidar is a fool’s errand’

Tesla’s CEO did not hold back in defending his long-standing view on lidars. Musk reiterated Tesla’s position by dismissing lidar as “friggin’ stupid.”

“Tesla’s use of eight cameras plus radar gets the job done, more or less,” Magney said. He explained that radar attaches velocity information onto each labeled object that the camera picks up within the trajectory of the vehicle.

But there is another reason Tesla does not need lidar, Magney pointed out. Teslas “do not do localization against a geo-mapped area. All path planning is done holistically from the trained network.”

This might surprise some believers in HD maps. “This is not to say [Tesla] don’t use mapping assets, but they specifically stated they don’t believe in HD maps because they don’t have enough confidence in the data,” according to Magney. “Not sure how they will accomplish intersection traversal. Perhaps they train for proper trajectories and geo-code this, but they are not calling this a map.”