BMS Innovations Will Require New EV Test Solutions

Article by: Stefano Lovati

Advances in EV technology are pushing innovation in battery management systems.

The progressive electrification of vehicles now underway on a global scale will lead to significant growth in the number of electric vehicles (EVs) in the coming years. EVs integrate sophisticated hardware and software, in which even the smallest unexpected event can lead to abnormal, if not downright dangerous, operation. To meet stringent safety and performance requirements, designers run different types of tests to replicate each operating profile and use simulations that cover a wide range of scenarios, including those that are difficult to reproduce on the road.

Batteries and battery management systems (BMSes) are evolving continuously, and these developments will, in turn, require new test procedures. In the short term, there will be several challenges, ranging from 1,000-V-plus systems to new battery chemistries. To keep up with BMS requirements and deliver affordable solutions across large numbers of test sites, providers of automatic test systems will need to build new tester capacity with quick production ramps.

The role of simulation

The high complexity of the electronic systems integrated in EVs, combined with the difficulty of reproducing all the operating conditions of the vehicle and its components, means that the testing and validation of an EV cannot entirely take place at the hardware level.

Engineers therefore need operating models that are as faithful as possible to the real devices: virtual simulations of circuit behavior and, above all, simulations of the defects, malfunctions or random events capable of compromising safety and operation.

Engineers must be able to recreate both typical scenarios and those more difficult to replicate on the road, using simulations of the main components of an EV and hardware-in-the-loop test methodologies.
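
To make the idea concrete, the following is a minimal sketch, assuming a first-order Thevenin equivalent circuit, of the kind of battery plant model that a hardware-in-the-loop setup might step in software. Every parameter value and function name here is an illustrative assumption, not data from a real vehicle.

```python
# Minimal sketch: first-order Thevenin equivalent-circuit cell model,
# stepped in discrete time as a HIL plant model might be.
# All parameters below are illustrative, not vendor data.

CAPACITY_AH = 60.0      # assumed cell capacity
R0 = 0.010              # series resistance (ohm), assumed
R1, C1 = 0.015, 2000.0  # RC pair for transient response, assumed

def ocv(soc: float) -> float:
    """Very rough open-circuit-voltage vs. SOC map (illustrative)."""
    return 3.0 + 1.2 * soc  # 3.0 V empty -> 4.2 V full

def step(state, current_a, dt_s):
    """Advance the model by one time step.
    state = (soc, v_rc); positive current = discharge."""
    soc, v_rc = state
    soc -= current_a * dt_s / (CAPACITY_AH * 3600.0)
    # RC branch voltage (forward-Euler integration)
    v_rc += dt_s * (current_a / C1 - v_rc / (R1 * C1))
    v_term = ocv(soc) - v_rc - current_a * R0
    return (soc, v_rc), v_term

if __name__ == "__main__":
    state = (0.8, 0.0)            # start at 80% SOC
    for t in range(10):           # 10 s of a 50-A discharge pulse
        state, v = step(state, current_a=50.0, dt_s=1.0)
        print(f"t={t+1:2d}s  SOC={state[0]*100:.2f}%  Vterm={v:.3f} V")
```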

EV test objectives

By 2035, the sale of new vehicles equipped with internal-combustion engines is set to be banned in many industrialized countries, with EVs replacing them, offering performance comparable or superior to that of conventional vehicles and priced affordably for the mass market.

Improving EV efficiency is a fundamental requirement for increasing range while reducing battery size and weight. This need has driven the introduction of new technologies in EVs, such as wide-bandgap (WBG) semiconductors, higher-voltage batteries (800 V and above) and wireless charging systems.

To validate an EV, it is essential to conduct functional and parametric tests not only on the individual components of the vehicle but also at the system level. It is also necessary to carry out the electromagnetic-compatibility tests required by the applicable standards, both on single modules and on the complete system.

Typical EV components to be tested are the following:

  • High-voltage battery pack: The shift from 400-V to >800-V high-voltage batteries will allow for shorter charging times (down to 15 minutes) and safer operation due to lower charging current (see the quick calculation after this list).
  • Battery management system (BMS): This component, available with both wired (CAN/LIN) and wireless (Wi-Fi/NFC) interfaces, is paramount for ensuring correct cell-to-battery integration and battery operation.
  • DC/AC traction (main) inverter and on-board charger: These are the components in which WBG semiconductors (mainly silicon carbide, but also gallium nitride for some power levels) make the difference, enabling higher efficiency, combined with size and weight savings.
  • Electric traction motor: In some applications, currently used three-phase motors will be replaced by six-phase motors, which are smaller with reduced torque ripple and higher power density.
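
As a quick illustration of the first point in the list, the calculation below, which assumes a 350-kW charging power purely for the sake of the example, shows how doubling the pack voltage halves the charging current for the same power:

```python
# Illustrative only: the same DC fast-charging power at two pack voltages.
# The 350-kW figure is an assumption made for this example.
charging_power_w = 350_000

for pack_voltage in (400, 800):
    current_a = charging_power_w / pack_voltage  # P = V * I
    print(f"{pack_voltage} V pack -> {current_a:.0f} A")

# 400 V pack -> 875 A
# 800 V pack -> 438 A  (half the current: thinner cables, less I^2*R heating)
```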

Battery test system

Figure 1 shows the general structure of a common battery test system (BTS). The battery cycler (1) is a device that measures the battery's response over time while running charge and discharge cycles to analyze the battery's functionality. During operation, the cycler measures several parameters, including the battery's efficiency, capacity and self-discharge. The environmental chamber (3) is where the devices under test are placed for in-chamber measurements, while the measurement rack (2) runs the software that synchronizes and coordinates the battery cyclers.

The chiller, which manages the temperature and humidity inside the chamber, is also connected to it. The next component, the battery test program (4), is the system orchestrator that gathers and evaluates all pertinent data. Finally, the data- and system-management capability (5) gives users instantaneous insight into the data being generated.

Figure 1: Components of a typical BTS
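
Purely as a conceptual sketch of how the battery test program (4) ties these pieces together, the code below coordinates a stand-in cycler and environmental chamber; the class and method names are placeholders, not a real instrument API.

```python
# Conceptual sketch of a battery-test-program loop. The classes below are
# placeholders standing in for the cycler and chamber, not a vendor API.

class StubCycler:
    def set_current(self, amps):
        print(f"cycler set-point: {amps:+.1f} A")
    def read_voltage(self):
        return 3.70                      # stub measurement, volts

class StubChamber:
    def set_temperature(self, deg_c):
        print(f"chamber set-point: {deg_c:.1f} degC")

def run_profile(cycler, chamber, currents, temp_c=25.0):
    """Apply a list of current set-points at a fixed chamber temperature
    and return the logged (current, voltage) pairs."""
    chamber.set_temperature(temp_c)
    log = []
    for amps in currents:
        cycler.set_current(amps)
        log.append((amps, cycler.read_voltage()))
        # a real test program would pace the loop and also log chamber
        # temperature, humidity and safety interlocks here
    return log

if __name__ == "__main__":
    data = run_profile(StubCycler(), StubChamber(), [10.0, 20.0, -5.0, 0.0])
    print(data)
```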

Performance and safety are the two main determining factors in battery tests. Safety use cases include crash tests, in which it is necessary to determine whether the battery is still safe under severe strain and intentionally inflicted damage. On the performance side, suppose you wish to characterize the battery's range. To achieve this, the cycler can be programmed to slightly charge the battery to simulate regenerative braking or slightly discharge it to replicate a driver's acceleration. In this way, various driving profiles and operating conditions, such as city driving, can be mimicked. In other words, the BTS enables the use of models to mimic and test equipment without the need for physical hardware. By separating development from the hardware, test engineers can quickly transition between model-based simulation and actual equipment at the software level.
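
The sketch below illustrates that idea, assuming an arbitrary gain between vehicle acceleration and pack current; it is not the article's tool chain, just a way to see how a speed profile can become charge/discharge set-points for the cycler, with negative current standing for regenerative braking.

```python
# Illustrative mapping of a speed profile to cycler current set-points.
# Positive current = discharge (acceleration), negative = charge (regen braking).
# The gain below is arbitrary and only serves the example.

GAIN_A_PER_KPH_PER_S = 5.0   # assumed amps of pack current per (km/h)/s of accel

def profile_to_currents(speeds_kph, dt_s=1.0):
    currents = []
    for v_prev, v_next in zip(speeds_kph, speeds_kph[1:]):
        accel = (v_next - v_prev) / dt_s          # (km/h)/s
        currents.append(GAIN_A_PER_KPH_PER_S * accel)
    return currents

if __name__ == "__main__":
    # A toy "city driving" snippet: pull away, cruise, brake to a stop.
    speeds = [0, 15, 30, 30, 30, 20, 10, 0]
    for i, amps in enumerate(profile_to_currents(speeds)):
        action = "discharge" if amps > 0 else ("charge (regen)" if amps < 0 else "idle")
        print(f"step {i}: {amps:+.1f} A  ({action})")
```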

BMS test

The BMS monitors the battery and possible fault conditions, preventing situations in which the battery could degrade, lose its capacity or even potentially harm the user or the surrounding environment.

A BMS normally consists of three parts:

  • Analog front end (AFE): It provides the MCU and the fuel-gauge IC with the battery’s voltage, current and temperature measurements. Moreover, it controls the circuit breakers, which disconnect the battery in case of failure. The AFE is usually implemented with a multi-channel high-resolution ADC.
  • Microcontroller (MCU)
  • Fuel gauge IC: It can be a standalone IC, or it can be integrated into the MCU. It oversees the estimation of key factors, such as the battery's state of charge (SOC) and state of health (SOH), using voltage, current and temperature measurements (a simplified sketch of this estimation follows this list).
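
As a simplified illustration of the SOC portion of that estimation, the sketch below uses plain coulomb counting with an assumed cell capacity; production fuel gauges combine this with voltage-based correction, temperature compensation and SOH tracking.

```python
# Simplified coulomb-counting SOC estimator (illustrative only).
# Real fuel gauges also correct the estimate with OCV, temperature and aging data.

class CoulombCounter:
    def __init__(self, capacity_ah, soc_init=1.0):
        self.capacity_c = capacity_ah * 3600.0   # capacity in coulombs
        self.soc = soc_init

    def update(self, current_a, dt_s):
        """current_a > 0 means discharge; returns the new SOC estimate."""
        self.soc -= (current_a * dt_s) / self.capacity_c
        self.soc = min(max(self.soc, 0.0), 1.0)  # clamp to [0, 1]
        return self.soc

if __name__ == "__main__":
    gauge = CoulombCounter(capacity_ah=5.0, soc_init=0.9)   # assumed 5-Ah cell
    for _ in range(60):                                      # 1 minute at 10 A
        soc = gauge.update(current_a=10.0, dt_s=1.0)
    print(f"SOC after 1 min at 10 A: {soc*100:.1f}%")
```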

Because a BMS is normally required for each stack of cells in a battery, an EV may include six to 12 or more BMS devices, not including redundant devices. Each BMS device floats on the BMS or cell module below it because these devices are normally powered from the lower and upper rails of their respective cell modules. This means that each of these devices needs to communicate with the master controller, which is commonly an MCU, in a digitally isolated, daisy-chained fashion.
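
Conceptually, the master's job resembles the sketch below, which polls a chain of stand-in monitor devices and aggregates their cell voltages; the device class and cell counts are assumptions made for illustration, not a real driver.

```python
# Conceptual sketch of a master MCU polling a daisy chain of BMS monitor ICs.
# The BmsDevice class is a stand-in, not a real device driver.

class BmsDevice:
    """Stand-in for one cell-module monitor in the chain."""
    def __init__(self, n_cells=12):
        self.n_cells = n_cells
    def read_cell_voltages(self):
        return [3.70] * self.n_cells   # stub measurements, volts

def poll_chain(devices):
    """Walk the chain in order, as a daisy-chained bus transaction would."""
    pack = []
    for position, dev in enumerate(devices):
        volts = dev.read_cell_voltages()
        pack.extend(volts)
        print(f"device {position}: {sum(volts):.1f} V module voltage")
    return pack

if __name__ == "__main__":
    chain = [BmsDevice() for _ in range(8)]     # e.g. 8 modules of 12 cells
    cells = poll_chain(chain)
    print(f"pack: {len(cells)} cells, {sum(cells):.0f} V total")
```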

High measurement accuracy is required for EV batteries because it correlates directly with longer range between charges and more gradual cell aging. Increased driver confidence and safety are further benefits of more accurate fuel-gauge ICs. Furthermore, as battery stack voltages rise, more cells must be added to the stack, necessitating extra front-end ADC channels and cell-balancing pins per BMS device. These voltages are soon expected to reach 1,000 V and higher, enabling faster charging and bringing EV recharging times closer to the time required to refuel an internal-combustion-engine vehicle.
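
To put that channel-count pressure into numbers, the rough calculation below assumes 4.2 V per fully charged cell and 12 channels per monitor IC, both illustrative figures, to estimate how many series cells and monitor devices different pack voltages imply:

```python
# Rough illustration: series cell count vs. target pack voltage.
# The per-cell voltage and channels-per-device figures are assumptions.
import math

V_CELL_MAX = 4.2            # assumed fully charged Li-ion cell voltage
CHANNELS_PER_DEVICE = 12    # assumed channels per monitor IC

for pack_v in (400, 800, 1000):
    cells = math.ceil(pack_v / V_CELL_MAX)
    devices = math.ceil(cells / CHANNELS_PER_DEVICE)
    print(f"{pack_v:>4} V pack: ~{cells} series cells, "
          f"~{devices} monitor devices of {CHANNELS_PER_DEVICE} channels")
```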

These BMS trends pose additional difficulties for automated test equipment. When measuring a battery's discharge curve, most of the usable capacity falls along a very flat portion of that curve. The entire Li-ion SOC range spans from 4.3 V (fully charged) down to 2.2 V (discharged). Looking at the full range (approximately 2.1 V, or about 21 mV per 1% SOC change), the measurement appears simple.

A typical Li-ion discharge, however, uses the battery between 80% and 20%, or between 90% and 10%, of its capacity. The SOC voltage is remarkably flat in the 80% to 20% range, moving only from 3.75 to 3.65 V (about 100 mV in total, or roughly 1.7 mV per 1% SOC change). BMS suppliers are therefore pursuing measurement accuracies of 100 µV, or even 50 µV, over a 5-V range. A typical voltage curve for Li-ion discharge is shown in Figure 2.

Figure 2: Typical voltage curve for Li-ion discharge
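
The resolution figures quoted above follow from simple arithmetic; the sketch below merely restates them (2.1-V full range, roughly 100 mV across the 80%-to-20% plateau) to show where the microvolt-level accuracy target comes from.

```python
# Reproducing the article's figures on SOC voltage resolution.
full_range_v = 4.3 - 2.2            # entire Li-ion SOC range
flat_range_v = 3.75 - 3.65          # 80%-to-20% SOC plateau

mv_per_pct_full = full_range_v * 1000 / 100   # mV per 1% SOC, full range
mv_per_pct_flat = flat_range_v * 1000 / 60    # mV per 1% SOC across 80%..20%

print(f"Full range:  {mv_per_pct_full:.1f} mV per 1% SOC")   # ~21 mV
print(f"Flat region: {mv_per_pct_flat:.1f} mV per 1% SOC")   # ~1.7 mV
# Resolving, say, 0.1% SOC on the plateau means distinguishing ~170-uV steps,
# which is why suppliers target 100-uV or 50-uV accuracy over a 5-V range.
```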

As mentioned before, simulation is paramount in EV testing. The simulation of a BMS covers not only the battery's normal operation under static and dynamic conditions but also the characterizations that determine its longevity and reliability. A good simulation must consider all of the relevant variables, including the hardware, logic-signal levels and power levels. These battery tests allow engineers to monitor system temperatures as well as the charge/discharge cycles of the cells. Operating conditions can then be adjusted and varied to check how the battery responds to environmental changes, particularly from a safety standpoint.

A battery pack that catches fire is an extremely hazardous event, one that can render every other safety feature useless. Thermal runaway, to which Li-ion batteries are especially prone, can occur at any time and is particularly difficult to extinguish. The crucial trade-off in terms of energy and safety is how quickly a battery can charge and discharge in order to supply all of the required energy to the inverter efficiently. Cost-cutting efforts should focus on reducing manufacturing waste and the number of defective batteries, because preliminary tests enable engineers to simulate the entire system from the start with a clear picture of its behavior.


This article was originally published on EE Times.
