If you design automotive systems, you may feel tremendous pressure to redefine vehicle powertrain systems to meet the reduced greenhouse gas (GHG) emissions goals established by initiatives in the European Union, United States, China and Japan, just to name a few. To meet these targets, some automobile manufacturers are implementing 48-V systems for mild hybrid electric vehicles, but most are transitioning traditional internal combustion engine (ICE) models to an all-electric powertrain system, as shown in Figure 1.
Due to environmental goals for emission reduction, electric vehicles (EVs) are projected to quickly increase their market share over the next decade. To help speed the adoption of EVs, powertrain suppliers are designing integrated electric powertrains that combine the traction inverter, DC/DC converter and onboard charger in one box. While this approach reduces vehicle weight, lowers overall costs and provides higher power density, it requires greater focus on thermal management. This is because higher power densities and smaller form factors increase operating temperature, which can degrade powertrain system performance – or even worse, damage it. Temperature sensors have always been a necessary component within vehicle powertrains, but because they’re often an afterthought or reused across designs for years, their positive impact on electric powertrain systems has been limited. New temperature sensor technology, properly implemented, can greatly improve the efficiency and reliability of integrated electric powertrain systems.
Figure 1: Electric powertrain systems
Better temperature accuracy can help increase the efficiency of integrated electric powertrains
Consumers want EVs to charge more quickly and to have more driving range on a single charge. In order to achieve this, designers must increase the efficiency of the integrated electric powertrain. One approach is to increase the voltage levels and switching frequencies within the power stages. However, doing so in such a small footprint intensifies the power density and causes temperatures to rise, adding to the risk of thermal runaway. Achieving this increase in efficiency while limiting the risk of thermal damage requires the proper use of accurate temperature sensors and active cooling feedback loops.
Safely pushing components to their peak performance typically means operating at temperatures closer to device limits. As shown in Figure 2, maximizing sensor accuracy enables the processor to minimize the temperature safety margin so that it can more precisely control functions such as vehicle charging and power flow – even when close to the thermal operating limits of your devices.
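To make the relationship between accuracy and safety margin concrete, here is a minimal sketch (with hypothetical limit and accuracy values, not figures from any datasheet): firmware must trip its thermal protection early enough that the true temperature never exceeds the device limit even when the sensor reads low by its full error band, so a tighter sensor accuracy raises the usable trip threshold.

```c
/*
 * Illustrative sketch: guard-band a fixed device limit by the worst-case
 * sensor error. The values passed in below are hypothetical examples.
 */
float thermal_trip_threshold_c(float device_limit_c, float sensor_accuracy_c)
{
    /* Trip early by the sensor's worst-case error so the true
     * temperature cannot exceed the device limit undetected. */
    return device_limit_c - sensor_accuracy_c;
}
```

For example, against a hypothetical 150°C device limit, a ±3°C sensing solution forces a 147°C trip point, while a ±1°C sensor allows the system to run up to 149°C before shutting down – recovering 2°C of usable operating range.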
Figure 2: How accuracy affects safety margins
Low-drift temperature sensors help maintain the reliability of integrated powertrain systems over time
Consumers want their vehicles to last. The lifetime of the electronics found in an integrated electric powertrain correlates directly with the temperatures to which they are exposed. To keep components such as the power-stage field-effect transistors operating correctly for many years, temperature sensors must be reliable and have minimal drift.
Many electronics drift over time, and temperature sensors are no exception. Drift has a lot to do with a sensor’s material composition. For example, silicon-based temperature sensors have negligible drift over time, while resistance temperature detectors drift anywhere from ±0.1 to ±0.5°C per year, and traditional negative temperature coefficient (NTC) thermistors typically drift >5% over time (not including the drift of external components). As your system ages, sensor drift increases the error of your temperature-sensing solution, which can force premature shutdowns that limit efficiency or allow thermal damage to components.
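A quick worked sketch shows why per-year drift matters over a vehicle lifetime (the initial-accuracy and drift figures below are hypothetical examples consistent with the ranges quoted above): worst-case error grows roughly as the initial accuracy plus the annual drift accumulated over the years in service.

```c
/*
 * Worst-case accumulated sensing error after a number of years, assuming
 * (for illustration) that annual drift accumulates linearly.
 */
float worstcase_error_c(float initial_accuracy_c, float drift_per_year_c, int years)
{
    return initial_accuracy_c + drift_per_year_c * (float)years;
}
```

Under this simple model, a sensor that starts at ±0.5°C but drifts ±0.5°C per year carries a ±5.5°C worst-case error after 10 years, while a low-drift silicon sensor stays close to its initial accuracy – which is why the protection margins of the aged system must be sized for the drifted error, not the datasheet accuracy at time zero.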
When designing for long-term performance, it’s best to use a temperature sensor integrated circuit (IC) that provides guaranteed accuracy over its full operating temperature range, as opposed to the questionable accuracy of uncalibrated discrete implementations such as NTC thermistors. A digital temperature sensor such as the TMP126-Q1 enables precise monitoring at extremes as high as 175°C with ±1°C accuracy, and achieves ±0.3°C accuracy across a wide temperature range. In the case of voltage transients and surges within the system, temperatures can rise very rapidly.
As Figure 3 illustrates, the TMP126-Q1 helps your system take preemptive action by detecting temperature spikes, which may be a leading indicator of thermal issues before the temperature reaches dangerous levels.
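The core idea behind a slew-rate alert can be sketched in a few lines: compare consecutive temperature samples and flag any rise that exceeds a configured rate limit. This is a generic rate-of-change check for illustration only, not the TMP126-Q1’s register interface; the thresholds and sample period below are hypothetical.

```c
#include <stdbool.h>

/*
 * Generic rate-of-change (slew-rate) alert sketch: returns true when the
 * temperature is rising faster than the configured limit, which may indicate
 * a developing thermal event before an absolute over-temperature trip fires.
 */
bool slew_alert(float prev_c, float curr_c, float period_s, float limit_c_per_s)
{
    float rate_c_per_s = (curr_c - prev_c) / period_s;
    return rate_c_per_s > limit_c_per_s; /* alert on rising temperature only */
}
```

For instance, with a hypothetical 2°C/s limit and a 1 s sample period, a jump from 100°C to 103°C raises the alert even though 103°C may still be well below the absolute shutdown threshold – giving the system time to derate power or increase cooling preemptively.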
Figure 3: The new slew-rate alert feature in the TMP126
As I mentioned previously, you might want to increase switching frequencies to boost efficiency, but that may lead to unwanted electromagnetic interference (EMI). To operate in high EMI conditions, the TMP126-Q1 has a built-in cyclic redundancy check to ensure the use of data only when it is error-free. Additionally, the TMP126-Q1 comes with functional safety failure-in-time (FIT) rate and failure mode distribution (FMD) documentation to help achieve system-level certification. The device’s high accuracy and negligible drift enable integrated powertrains to reduce their protection margins, shutting down less often while still protecting the system from overheating, leading to greater efficiency.
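To illustrate what a cyclic redundancy check buys in a noisy environment, here is a generic CRC-8 sketch (polynomial 0x07, zero initial value – chosen only as a common textbook example; the TMP126-Q1’s actual CRC parameters and frame layout are defined in its datasheet). A frame whose recomputed CRC does not match the received CRC is rejected rather than acted upon.

```c
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

/* Generic bitwise CRC-8 (poly 0x07, init 0x00) over a byte buffer.
 * Shown for illustration; consult the device datasheet for its
 * actual polynomial, initial value and frame format. */
uint8_t crc8(const uint8_t *data, size_t len)
{
    uint8_t crc = 0x00;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int b = 0; b < 8; b++)
            crc = (crc & 0x80u) ? (uint8_t)((crc << 1) ^ 0x07)
                                : (uint8_t)(crc << 1);
    }
    return crc;
}

/* Accept a sensor frame only when its CRC verifies - a corrupted
 * temperature reading is discarded instead of driving the control loop. */
bool frame_ok(const uint8_t *payload, size_t len, uint8_t received_crc)
{
    return crc8(payload, len) == received_crc;
}
```

In an EMI-heavy power stage, a bit flip on the serial bus then shows up as a CRC mismatch, and the controller can re-read the sensor instead of reacting to a bogus temperature value.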
It’s not surprising that you can find over 20 temperature sensors spread across integrated electric powertrain systems. Accurate sensing can help maximize your system’s efficiency, prevent false triggering of control systems, and protect ICs and other components more effectively. Maintaining that efficiency over time – keeping automobiles on the road longer – is possible with low-drift, silicon-based sensors. Whichever temperature sensor you choose, make sure it is accurate and reliable, follow sensor placement best practices, and leverage new sensor features.