DeepScale argues that AI chips and sensor systems should come with DNNs optimized for the application, a shift that would fundamentally change how companies buy AI technology.
When a panel of experts at SAE China's AV conference was asked, "Which data or lessons are you willing to share with other automakers?" the question induced a long, uncomfortable pause.
At this year's Frankfurt Motor Show, a new trend emerged: a transition from C.A.S.E. (Connected, Autonomous, Shared, Electric) to C.A.P.E. (Connected, Assisted, Personalized, Electric).
Experts in AI for autonomous vehicles poured cold water on expectations that the self-driving future is just around the corner.
The automotive industry is still on the hunt for "robust perception" that works under any conditions, including night, fog, rain, snow, and black ice. The view from AutoSens 2019.
As the automotive industry rushes toward assisted driving and autonomous vehicles, it seems to be forgetting something kind of important.
Bosch, Continental, Denso, GM, Nvidia, NXP, and Toyota declared their support for the new consortium.
NXP launched a deep learning toolkit called eIQ Auto, seeking to set itself apart from competitors by making the tool "automotive-quality." The goal is to make it easier for AV designers to implement deep learning in vehicles.
Human drivers are expected to be mature enough to anticipate what might happen on the road next. What expectations should we have of drivers who are not human?
Arm has announced the formation of the Autonomous Vehicle Computing Consortium (AVCC). Cooperation among these companies is a good idea, but do they have the necessary expertise?