A recap of trends and technologies fueling the electric and autonomous automotive industry...
For several years now, the CASE acronym has been driving the automotive industry. It refers to “Connected” cars, “Autonomous/Automated” driving, “Shared,” and “Electric.” Among these four axes of development, two are attracting more Research & Development (R&D) investment than the others.
The first one is related to the electrification of OEMs’ fleets. Indeed, the EU has set CO2 emission reduction targets: 95 g/km in 2021 and 81 g/km in 2025. With these targets, OEMs have few alternatives but to electrify their vehicles.
The second one is related to autonomous/automated driving and how OEMs are integrating increasingly automated driving functions to reach the long-term goal of full autonomy. OEMs are expected to achieve this goal with the addition of many sensors and advancements in computing performance necessary to process all the data generated by the sensors.
Drivers for Car Electrification
The electric and hybrid electric vehicle (EV/HEV) market is growing rapidly and will reach more than 30 million new vehicles in 2025, according to Yole Développement’s estimates.
Beyond pure commercial interest, electrification addresses social and environmental challenges and responds to several strong and sustainable drivers (Figure 1).
Among the most important benefits are the reduction of CO2 emissions from passenger vehicles and the reduction of air pollution in densely populated areas. Electrification can also be pragmatic since strong governmental CO2 emission reduction targets are forcing car makers to significantly reduce their fleet’s average CO2 emission levels to avoid heavy penalties. Electrification of the vehicle’s powertrain enables significant reductions of CO2 emissions and thus becomes a mandatory part of the car makers’ strategies.
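To make the penalty mechanism concrete, the fleet-average arithmetic can be sketched in a few lines of Python. The EUR 95 fine per g/km of excess, per vehicle registered, matches the EU scheme in force since 2019, but the fleet mix and sales figures below are purely hypothetical illustrations.

```python
# Sketch of the EU fleet-average CO2 penalty arithmetic.
# The EUR 95 per g/km of excess, per vehicle registered, reflects the post-2019
# EU scheme; the fleet numbers below are hypothetical.

def fleet_penalty(registrations_and_emissions, target_g_per_km, fine_per_g=95.0):
    """Return (fleet-average g/km, total penalty in EUR) for one model year."""
    total_units = sum(units for units, _ in registrations_and_emissions)
    avg = sum(units * co2 for units, co2 in registrations_and_emissions) / total_units
    excess = max(0.0, avg - target_g_per_km)
    return avg, excess * fine_per_g * total_units

# Hypothetical fleet: (units sold, certified CO2 in g/km).
# Adding zero-emission EVs pulls the fleet average down toward the target.
fleet = [(400_000, 120), (100_000, 0)]
avg, penalty = fleet_penalty(fleet, target_g_per_km=95)
print(f"fleet average: {avg:.1f} g/km, penalty: EUR {penalty:,.0f}")
```

Even a 1 g/km overshoot across a large fleet translates into tens of millions of euros, which is why electrification has become a mandatory part of car makers’ strategies.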
After more than two decades of “paving the way,” dominated by a handful of EV/HEV manufacturers, we have entered the phase of massive deployment of EV/HEVs by both traditional mainstream car makers and emerging OEMs.
In a market with historically high barriers to entry, electrification is enabling a host of new start-ups, though many are unlikely to reach mainstream acceptance. Car manufacturers across the world are releasing a large portfolio of electrified vehicles, giving customers the freedom of choice amongst different vehicle electrification types and vehicle designs. Driving range is increasing, pricing is becoming more affordable, and, most importantly, EV charging infrastructure is being widely deployed. All these factors increase customer motivation or reduce prior reluctance to purchase an EV/HEV.
Different Types of Electric Vehicles
Among the different types of electrified vehicles, we distinguish hybrid electric vehicles, which contain both electric and internal combustion motors, and full electric vehicles with electric powertrain only. Full electric vehicles with zero CO2 emissions are considered the ultimate goal of electrification of the passenger fleet.
However, such a transition from Internal Combustion Engine (ICE) vehicles to full electric vehicles cannot be done overnight because of constraints in technology, cost, raw material, and manufacturing. The lack of charging infrastructure has also been a critical barrier.
Traditional automotive OEMs are progressively adopting electrified vehicles at the risk of cannibalizing their ICE car sales. They must bear the burden of existing ICE-related manufacturing facilities, employees, distribution, and sales networks. They must develop new intellectual property, engineering, and capacity to manufacture electric vehicles, often in isolation from traditional ICE manufacturing.
The whole automotive supply chain must be rebuilt, and this will take some time. Therefore, a transition period exists where vehicles with different levels of electrification are offered to customers (Figure 2). Hybrid electric vehicles offer various levels of CO2 emission reductions compared to an ICE vehicle, from a small percentage for mild-hybrid electric vehicles (MHEV) up to about 50% for plug-in hybrid electric vehicles (PHEVs).
The transition from HEVs to full EVs
In the past two decades, the EV/HEV industry was dominated by full-hybrid electric vehicles (HEVs), mainly driven by the commercial success of the Toyota Prius, introduced in 1997. As a result of strong governmental CO2 emission reduction targets, improvements in Li-ion batteries and their falling cost, as well as many other factors (Figure 3), the automotive industry is following an “accelerated electrification trend” – a faster transition toward vehicles with a longer electric-mode driving range (mileage), such as PHEVs and BEVs.
Indeed, PHEVs and full EVs offer the significant reduction of CO2 emission required by car makers to reach CO2 emission reduction targets and avoid heavy penalties.
Both PHEVs and BEVs are rechargeable vehicles; they can be charged from the electricity grid, preferably using electricity generated by clean energy sources such as photovoltaics, wind, and hydroelectricity. PHEVs have an electric-mode driving range of approximately 50 to 80 km. Because PHEVs can be used in full electric mode, they also reduce air pollution in cities, but that contribution can be limited, since the full electric mode is only suitable for short-distance city commuting.
To enable a longer electric-mode driving range, reduce CO2 emissions, and improve the environmental impact of e-mobility, innovations are needed in batteries, power electronic devices and systems, and car design.
The main trends related to power electronics technology include the adoption of SiC and GaN semiconductor technologies in the traction inverters, onboard chargers, and DC-DC converters, enhanced device packaging technologies (silver sintering die attach, copper wire bonding, wire-bond free interconnections, Si3N4 AMB (Active Metal Braze) ceramic substrates, high-temperature epoxy encapsulation) and mechatronic integration of different systems.
The key objectives for improved range are to increase power density while reducing volume and weight. Other developments serve to improve overall reliability and reduce manufacturing costs. While some of the increased popularity of electrification is driven by heightened consumer awareness, much of the credit belongs to the accelerated engineering pace of the automotive industry.
Sensor Trends for Electrification and Advanced Driver Assist Systems (ADAS)
The trend toward the increasing use of sensors in vehicles is another accelerated trend.
There are two main lines of focus for sensors under the CASE umbrella: electrification and ADAS functionalities including autonomy, positioning, and vehicle-to-everything (V2X) connectivity. The increase in sensors is part of the new electric/electronic (E/E) architecture and the evolution of ADAS systems. Like emission control, safety improvements are being pushed by government regulation — for example by the New Car Assessment Program (NCAP) and the National Highway Traffic Safety Administration (NHTSA) — or by European directives.
For the electrification of the vehicle, more magnetic (current) sensors will be used for battery monitoring (battery management systems). Classic powertrain pressure sensors — MAP, BAP, transmission oil, GPF, etc. — could suffer as the EV trend reduces reliance on an ICE, but this should play out over the longer term. For the moment, the trend toward HEVs, with their combined powertrain, offers a sweet spot for the integration of both new and classic sensor types — optical for heads-up display (HUD), magnetic for wheel torque, inertial for odometry and navigation, pressure for powertrain monitoring, current sensors for the electric powertrain, etc.
In addition to electrification, ADAS is another driver of sensor density in all classes of vehicles. To gain autonomy, the car must develop its own situational awareness. For that, it needs appropriate sensors that allow it to emulate how humans perceive their environment.
It should be clarified, however, that the sensor suite for ADAS in consumer automobile markets is very different from that for full autonomy in mobility-as-a-service (MaaS) business models such as Waymo, Cruise, etc. The latter use high-cost, high-performance, industrial-grade sensors and advanced processing platforms that exceed those of consumer vehicles by an order of magnitude in cost and energy consumption. For these vehicle operators, liability concerns justify the added cost as much as demonstrating technological leadership does. For ADAS, the key is to offer very good performance at a low cost. Below we list some of the most important ADAS sensors:
Image Sensors are becoming the most important sensor in ADAS. These cameras are the eyes of the car. Automotive cameras almost ubiquitously use complementary metal oxide semiconductor (CMOS) image sensors. There is a growing attachment rate for automotive cameras to help with various functionalities:
Forward view: The trend from one long range camera toward triple cameras expanding range and ADAS scope is ongoing.
Surround view: From rear cameras that only addressed reverse and parking assist, the trend is to more of a 360° view around the car for better situational awareness (reverse driving, parking in a confined space, turning, and other maneuvers).
In-cabin driver monitoring: According to the Euro NCAP 2025 roadmap, driver monitoring (start date 2020) is proposed to mitigate driver distraction and impairment through alcohol, fatigue, or other hazards, but also to monitor passengers and detect forgotten children or animals, avoiding heat stroke and similar incidents. However, other technologies compete for in-cabin monitoring dominance, such as thermal sensors, time-of-flight (ToF) cameras, or radar.
Ultrasonic sensors are mainly used for short-range detection of obstacles or pedestrians. Their main functionality is parking assist and collision avoidance. They could potentially be used for in-cabin gesture recognition for infotainment applications, though they compete with other technologies, including ToF cameras.
Radar remains the best available sensing technology for measuring relative distance and speed, and its adoption was largely driven by the mandatory implementation of automated emergency braking (AEB) in recent years. Angular resolution is continuously improving, and various players are working toward 3D radar (Bosch and Continental have already released products), 4D imaging radar, and monolithic microwave integrated circuit (MMIC) solutions.
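As a brief aside on how radar measures relative speed: the return from a closing target is shifted in frequency by the Doppler effect, f_d = 2·v·f0/c. A minimal sketch for a 77 GHz automotive radar follows; the 77 GHz carrier is the standard automotive band, while the closing speed is a hypothetical example.

```python
# Doppler-shift sketch for an automotive radar: f_d = 2 * v * f0 / c.
# 77 GHz is the standard automotive radar band; the closing speed is hypothetical.

C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(relative_speed_mps, carrier_hz=77e9):
    """Doppler shift of the radar return for a target closing at relative_speed_mps."""
    return 2.0 * relative_speed_mps * carrier_hz / C

shift = doppler_shift_hz(30.0)  # 30 m/s, i.e. a 108 km/h closing speed
print(f"Doppler shift: {shift / 1e3:.1f} kHz")
```

A closing speed of 30 m/s produces a shift of roughly 15 kHz, a frequency offset the radar's MMIC front end can resolve directly, which is why radar gives speed "for free" compared to cameras.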
LiDAR (light detection and ranging) is the most recent sensor to enter the automotive ecosystem. Its ability to work in dark conditions and monitor the surroundings precisely could make it a necessary sensor for the development of more advanced perception. The efficient fusion of data from ADAS cameras and LiDAR is important to enable detection of both the depth of objects and objects at short and long distances. However, LiDAR is still a very complex sensor for Tier 1s and OEMs to integrate compared to radar and cameras. Much more work on integration, data processing, and cost reduction is needed before it is widely adopted by OEMs.
Thermal cameras/sensors that “look” in the far-infrared spectrum (8-14 µm) are significantly more proficient in low-light conditions. Dusk is particularly difficult for human drivers to handle, and visible-light ADAS cameras sometimes encounter difficulties with direct sun exposure, deep shadows, and low ambient light. These sensors could complement CMOS image sensors by detecting the heat emanating from living beings and other heat-generating objects, improving the car’s safety functions. FLIR, the biggest thermal camera manufacturer, has advocated for the benefits of thermal imaging in ADAS as a redundant sensor and has demonstrated more efficient AEB and better pedestrian detection.
Global Navigation Satellite System (GNSS) receivers use connectivity to enhance ADAS by locating the vehicle on a digital map. This is crucial for ADAS situational awareness. However, in some cases (urban canyons, tunnels, below trees, etc.), the GNSS signal can be lost. Therefore, it is increasingly combined with other positioning systems.
Odometry sensors located at the wheel record distance from the last known position.
Inertial measurement units (IMU) detect inertial changes to calculate location from the last known position.
The last two sensors listed are short-distance backups that help estimate the position of the vehicle when GNSS is unavailable, improving positioning accuracy at all times.
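The principle behind this backup — dead reckoning from the last known fix — can be sketched simply: odometry supplies distance travelled and the IMU supplies heading, and each step is propagated forward. This is a minimal illustration with hypothetical numbers; a production system would fuse these measurements with GNSS in a Kalman filter rather than integrate them raw.

```python
import math

# Minimal dead-reckoning sketch: when GNSS drops out (tunnel, urban canyon),
# odometry (distance travelled) and IMU-derived heading propagate the position
# from the last known fix. Values are illustrative; a real system would use a
# Kalman filter to fuse these sensors and bound the accumulated drift.

def dead_reckon(x, y, steps):
    """Propagate (x, y) in metres using (distance_m, heading_rad) steps."""
    for distance, heading in steps:
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
    return x, y

# Last GNSS fix at the tunnel entrance, then three odometry/IMU samples:
# 10 m east, 10 m east, then 10 m north after a left turn.
x, y = dead_reckon(0.0, 0.0, [(10.0, 0.0), (10.0, 0.0), (10.0, math.pi / 2)])
print(f"estimated position: {x:.1f} m east, {y:.1f} m north of last fix")
```

Because each step's error compounds, dead reckoning is only a short-distance backup, which is exactly the role the text assigns to odometry and IMU sensors.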
Besides ADAS, things are also changing for car connectivity and E/E (electric/electronic) architecture. Regarding connectivity, regulations are pushing for basic vehicle-to-infrastructure (V2I) and vehicle-to-vehicle (V2V) communication based on the 5.9 GHz frequency band to be implemented in most countries by 2023. 5G connectivity is gaining momentum for advanced V2X.
E/E architecture historically relied on dozens of distributed but distinct automotive subsystems known as electronic control units (ECU).
Traditionally, each ECU was focused on one function and there was little or no connection between ECUs. This practice is still common today with less advanced E/E architecture. With the increase in the number of sensors and functions, the E/E architecture will increasingly make use of the function domain concept — specific domain controllers for infotainment, comfort, powertrain, sensing, safety.
Audi was the first OEM to implement a domain controller, in its A8 model (the zFAS module), to which the front sensors — the long-range radar, the forward ADAS camera, and the LiDAR — are connected. With functional domain controllers, functions such as ADAS can integrate data from very disparate sensory systems and incorporate advanced neural network computations to make low-latency, rational decisions that avoid collisions and other catastrophic failures.
With high penetration rates of radar and cameras in cars, the associated revenues will recover rapidly from the coronavirus crisis. Radar market revenue is expected to surpass 2019 revenue in 2021 and will reach US $9.1 billion in 2025, at a Compound Annual Growth Rate (CAGR) of 19%. Camera market revenue will also surpass 2019 revenue in 2021 and will reach US $8.1 billion in 2025 at a CAGR of 18%. LiDAR revenue is quite limited today as only one OEM is implementing this sensor as an option in some of its cars. Other OEMs, such as BMW and Volvo, are expected to follow in the coming years, but the implementation will remain limited to high-end vehicles, and therefore limited volumes are expected. In this context, LiDAR market revenue is expected to reach US $1.7 billion in 2025 at a CAGR of 113%.
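The Compound Annual Growth Rate figures quoted above can be sanity-checked with the standard formula CAGR = (end/start)^(1/years) − 1. The sketch below backs out the base-year revenue implied by the quoted 2025 radar figure; treating 2019 as the base year of the six-year period is an assumption on our part.

```python
# Sanity-check of the quoted CAGR figures using
# CAGR = (end / start) ** (1 / years) - 1.
# The 2019 base year (6-year period to 2025) is an assumption.

def cagr(start_value, end_value, years):
    """Compound annual growth rate between two values over `years` years."""
    return (end_value / start_value) ** (1 / years) - 1

def implied_start(end_value, rate, years):
    """Base-year value implied by an end value and a CAGR."""
    return end_value / (1 + rate) ** years

radar_2019 = implied_start(9.1, 0.19, 6)  # US$B implied for radar in 2019
print(f"implied 2019 radar revenue: US ${radar_2019:.1f} billion")
print(f"round-trip CAGR: {cagr(radar_2019, 9.1, 6):.0%}")
```

The same arithmetic applied to LiDAR explains the eye-catching 113% CAGR: growth rates this high simply reflect a very small starting base, consistent with the single-OEM adoption the text describes.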
The Marriage of AI and Computing is at the Heart of the Race for Autonomy
The increasing number of functionalities and their evolving complexity require dedicated hardware solutions for the type of software processing that the situational awareness introduced above demands.
Firstly, in terms of software, it is becoming increasingly apparent that the standard scalar processors — the microprocessors, microcontrollers, and other application processors we have become accustomed to in computers and embedded applications — are not capable of handling the massive increase in real-time data necessary for situational awareness and autonomous control at automotive speeds and conditions. Accomplishing this requires a paradigm shift from scalar to vector and ultimately matrix processing, with massively parallel architectures and low-latency, big-data computations. Field programmable gate arrays (FPGAs) are increasingly being used to aggregate the fusion of data and to implement configurable parallel processing.
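The scalar-to-matrix shift can be illustrated with a toy example: the same multiply-accumulate workload written as an explicit scalar loop and as a single matrix product, the bulk form that parallel hardware (GPUs, FPGAs, NN accelerators) executes in one pass. The matrix sizes and values here are arbitrary illustrative choices.

```python
# Toy illustration of scalar vs. matrix processing: identical arithmetic,
# expressed element-by-element (scalar style) and as one matrix product
# (the form parallel accelerators are built to execute in bulk).

def matmul(a, b):
    """Matrix product of a (m x k) and b (k x n) as one bulk operation."""
    return [[sum(a[i][p] * b[p][j] for p in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

frames = [[1.0, 2.0], [3.0, 4.0]]     # e.g. two flattened sensor frames
weights = [[0.5, -1.0], [0.25, 2.0]]  # one layer of a toy perception model

# Scalar style: explicit triple loop, one multiply-accumulate at a time.
out_scalar = [[0.0, 0.0], [0.0, 0.0]]
for i in range(2):
    for j in range(2):
        for k in range(2):
            out_scalar[i][j] += frames[i][k] * weights[k][j]

print(out_scalar == matmul(frames, weights))  # same math, different hardware mapping
```

On a scalar processor both forms cost the same; on parallel hardware the matrix form maps every multiply-accumulate onto concurrent execution units, which is the essence of the paradigm shift described above.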
Another rapidly emerging processor solution for ADAS (and many other big data applications) is the neural network (NN) accelerator. For many being introduced to neural networks, these accelerators can seem like a form of “black box.” The unfamiliarity with the logic used by these processors and the extreme complexities involved in creating predictable and reproducible safety testing can intensify this perception. This can remain an obstacle to their implementation, particularly since the entire scope of this evolutionary trend is “safety first.”
Despite these initial obstacles, we clearly see the adoption of accelerators — also marketed as neural engines or neural processing units, different names for the same type of architecture — within ADAS solutions. NN algorithms have been around since the 1950s, addressing such big-data problems as weather forecasting, air traffic control, and complex physics equations. As the internet and social media created a new landscape for big data, we started to see cloud-based NN algorithms solving big-data problems. Hyperscalers such as Google, Amazon, and Microsoft began investing huge resources in NN, including developing their own hardware accelerators.
Nvidia was instrumental in developing general purpose graphics processing unit (GPU) development environments supporting accelerating NN and vector-based equations in GPUs over a decade ago. Since then, the race to develop deep neural networking (DNN) — a more advanced layered characteristic approach to neural networking — for inferencing in edge applications has emerged rapidly.
These NN engines or accelerators are fast becoming a standard subsystem in mobile phones for recognizing objects, faces, and other context to improve photos, security, and complex image analysis applications. It did not take long to realize that this is exactly what was needed for the situational awareness required for the automotive industry to tackle the biggest problem in automotive safety: driver error. Tesla, Mercedes, Audi, Volvo, and many others have been working with Tier 1 and processor suppliers to develop neural networking. Many have had vehicles enhanced with autonomous systems to train DNN solutions for many years.
Most of the early development was with Nvidia, which is still a major player in ADAS, but there have been many accelerator alternatives from Mobileye, Xilinx, TI, Toshiba, Ambarella, and Renesas — and many more developing platforms relying on DNN accelerators.
Tesla integrated these accelerators and AI into its FSD chip last year. For most OEMs, such solutions will appear by 2021-2022, as these suppliers are now designed into the most current and future ADAS platforms. This trend toward integrating more and more AI — and therefore more accelerators — follows the rise in autonomy in a linear fashion. Other trends, such as centralization, will gradually redraw the future of computing, as shown in Figure 4.
A Market Divided in Two: Centralized Platforms and Vision Processors
We realize that two paths are followed:
First, a centralized ADAS domain controller — responsible for almost all the perception computations and the subsequent reasoning for safety control and autonomy — such as those developed by NVIDIA or by Tesla with its FSD chip. This trend is coming down from the high-end segment of robotic vehicles. It emulates a central brain for the car.
Second, a layered approach in which some of the post-sensory vision intelligence is distributed to functionally dedicated vision processors or platforms integrating DNN acceleration, multiplying the number of vision processors that incorporate accelerators. Several OEMs are currently taking this approach. This will create competition between the two types of platforms and determine the revenues shown in Figure 5. The market for AI, including ADAS and robotic vehicles, is estimated at more than US $2.8 billion in 2025, of which US $2.5 billion will be ADAS only.
Winners and Losers in the Race for Autonomy
The hype and complexity around these artificial intelligence strategies have led some players to consider AI a secondary objective — difficult to achieve — rather than a central tool for achieving the real objective: autonomy.
After an initial survey, it is clear that the players that have understood this are already leading the race. The future impact of Covid-19 is still uncertain, but we can already affirm that it negatively impacted early 2020.
It is also likely to have a secondary effect on research and development into autonomy, causing delays this year and likely next year because of a lack of cash. Even so, a few OEMs will continue to develop their own autonomous software stack, with the hardware provided to them by the Tier 1 and semiconductor suppliers of this industry.
Many of those suppliers have significant development tools to accelerate mainstream development on platforms rather than just selling silicon. Lack of cash can slow down some programs, but they have been launched for several years now and will probably not be stopped even if they are late.
As for those OEMs that had only a light program or that have not placed the pursuit of autonomy at the heart of their project, it is very likely that their autonomy research programs — if they exist — will at best be postponed until after the crisis and at worst stopped completely. Many of these may later turn to a Tier 1 to buy a more developed off-the-shelf solution. They will be the big losers in this race and will have to rely even more on computing players to provide them with full autonomous solutions and functionalities, having lost the brand identity that comes with keeping pace with the leaders.
This is only the beginning, and the challenges around AI and its impact on the automotive industry are already being felt. Some players are already ahead, and it will be complicated to catch up with them, especially without integrating AI and the computing that goes with it.
Auto Industry Driven by CO2 Emission Reduction and Autonomous Driving
Among the four trends driving the automotive industry, two of them are particularly important for OEMs.
The first one, related to the electrification of OEMs’ fleets, is mostly driven by the regulation to limit CO2 emissions. This will take some time since the whole automotive supply chain has to be rebuilt. Therefore, a transition period exists where vehicles with different levels of electrification are offered to customers.
The second one, related to autonomous/automated driving, is driven by the need to increase safety of vehicles and develop automated driving modes. This requires the addition of multiple sensors like cameras, radar, and LiDAR — and the need to enable data fusion so that the vehicle can clearly perceive its surroundings. To enable this fusion, computing and AI computing will have to be implemented by OEMs to reach the ultimate goal: full autonomy. To do that, OEMs have launched their own autonomous software stack or partnered with leading Tier 1 and platform developers. Others will resort to off-the-shelf, third-party software that may be less optimized than a custom one.
The combination of full autonomous and electric vehicles is still a question to be solved and OEMs will have to manage the power consumption of the embedded sensors and computing unit that could impact the range of electric vehicles. In the end, one thing is clear regarding the trends driving the automotive industry: the competitive landscape is rich in opportunity for semiconductor suppliers, Tier 1 partners, and OEMs to embrace the CASE trends driving automotive.
About the Authors
Pierrick Boulay is a market and technology analyst in the Photonics, Sensing & Display division at Yole Développement (Yole).
Dimitrios Damianos, Ph.D., is a custom project business developer and technology & market analyst at Yole.
Milan Rosina, Ph.D., is principal analyst, Power Electronics and Batteries, at Yole.
Tom Hackenberg is a principal analyst for Computing and Software in the Semiconductor, Memory and Computing division at Yole.