In the context of global carbon neutrality initiatives, the adoption of renewable energy sources such as wind and solar power has accelerated. However, their inherent intermittency necessitates efficient energy storage systems. Lithium-ion batteries, with their high energy density and long cycle life, have become pivotal in both grid-scale energy storage and electric vehicle (EV) applications. The safety and reliability of these batteries, particularly in large-scale deployments like EV battery packs, are paramount. Thermal runaway poses a significant threat, potentially leading to fires or explosions. This phenomenon typically progresses through three stages: heat accumulation, thermal balance, and thermal runaway. Delaying the onset of thermal runaway, especially during the heat accumulation phase, is critical for enhancing safety. In this study, we focus on optimizing liquid cooling systems for an EV battery pack to extend the heat accumulation period, thereby improving overall system stability and longevity.
Thermal runaway in lithium-ion batteries is a complex process driven by exothermic reactions. Based on experimental data, the temperature evolution can be divided into three distinct phases. The first phase, heat accumulation, involves a gradual temperature rise as internal heat generation outpaces dissipation. The second phase, thermal balance, is characterized by a temporary slowdown in temperature increase, often due to safety vent activation, followed by a resurgence in heating rate as side reactions intensify. The final phase, thermal runaway, sees an exponential temperature spike, leading to catastrophic failure. For instance, in a 90% state-of-charge (SOC) condition, the critical temperature transition from heat accumulation to thermal balance occurs at approximately 128°C (t1), with thermal runaway initiating around 156°C (t2). Understanding these stages is essential for designing effective thermal management systems, particularly for densely packed EV battery packs where heat dissipation is challenging.
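As a minimal illustration, the three phases and the 90% SOC thresholds above can be encoded as a simple temperature classifier. The function and constant names are ours, not part of the study, and a real detector would of course use rate-of-rise and other signals, not a bare threshold:

```python
# Hypothetical helper: map a cell temperature (deg C) to one of the three
# phases, using the t1/t2 thresholds reported for a 90% SOC cell.
# A sketch for exposition, not a validated detection scheme.

T1_CELSIUS = 128.0  # heat accumulation -> thermal balance (90% SOC)
T2_CELSIUS = 156.0  # thermal balance -> thermal runaway (90% SOC)

def thermal_phase(temp_c: float) -> str:
    """Classify the thermal phase of a cell from its temperature."""
    if temp_c < T1_CELSIUS:
        return "heat accumulation"
    if temp_c < T2_CELSIUS:
        return "thermal balance"
    return "thermal runaway"
```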

To simulate thermal behavior, we employ numerical models that account for heat generation and transfer. The heat generation rate in a battery cell can be described using the Bernardi model, which is widely applied in EV battery pack analyses. The equation for volumetric heat generation is given by:
$$ q = \frac{I}{V} \left[ (U_0 - U) - T \frac{dU_0}{dT} \right] $$
where \( q \) is the heat generation per unit volume, \( I \) is the operating current, \( V \) is the battery volume, \( U_0 \) is the open-circuit voltage, \( U \) is the terminal voltage, \( T \) is temperature, and \( \frac{dU_0}{dT} \) is the entropy coefficient (the temperature dependence of the open-circuit voltage), which is often negligible. For normally operating cells in an EV battery pack, this model helps estimate steady-state heat output. However, during thermal runaway initiation, heat generation becomes non-steady. To model the heat accumulation phase, we use a time-dependent power source derived from experimental data. For a cell undergoing thermal runaway, the heat generation rate follows a linear fit:
$$ q = 832 \cdot t + 259000 $$
where \( t \) is time in seconds and \( q \) is in W/m³. This formulation accurately replicates the gradual temperature rise observed in tests, with an average error of 2.13% relative to experimental values.
Heat conduction within the EV battery pack is governed by the three-dimensional unsteady heat conduction equation:
$$ \rho C_p \frac{\partial T}{\partial \tau} = \lambda \left( \frac{\partial^2 T}{\partial x^2} + \frac{\partial^2 T}{\partial y^2} + \frac{\partial^2 T}{\partial z^2} \right) $$
Here, \( \rho \) is density, \( C_p \) is specific heat capacity, \( \lambda \) is thermal conductivity, and \( x, y, z \) are spatial coordinates. For convective cooling via liquid cold plates, the local heat transfer coefficient for laminar flow over a flat plate is used:
$$ h = 0.332 \frac{\lambda}{x} Re^{1/2} Pr^{1/3} $$
where \( Re \) is Reynolds number and \( Pr \) is Prandtl number. These equations form the basis for computational fluid dynamics (CFD) simulations of the EV battery pack thermal management system.
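To illustrate the numerics behind the conduction equation, the sketch below advances a one-dimensional explicit (FTCS) discretization using the battery core's through-plane property values from the table that follows. The study itself solves the full three-dimensional problem in CFD, so this is a simplified stand-in for exposition only:

```python
import numpy as np

# 1D explicit finite-difference (FTCS) step for the unsteady conduction
# equation rho*Cp*dT/dt = lam*d2T/dx2, using the battery core's through-plane
# values (rho = 2000 kg/m^3, Cp = 1100 J/kg.K, lam_x = 1 W/m.K).

rho, cp, lam = 2000.0, 1100.0, 1.0
alpha = lam / (rho * cp)        # thermal diffusivity, m^2/s

nx, dx = 65, 1e-3               # nodes across the 65 mm cell thickness
dt = 0.4 * dx**2 / alpha        # satisfies FTCS stability: alpha*dt/dx^2 <= 0.5

def step(T: np.ndarray) -> np.ndarray:
    """Advance one explicit time step with fixed-temperature boundaries."""
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    return Tn

# Example: a hot interior node relaxing toward 30 C boundary walls.
T = np.full(nx, 30.0)
T[nx // 2] = 128.0
for _ in range(100):
    T = step(T)
```

The stability constraint on \( \Delta t \) is the main practical point: an explicit scheme with the core's low through-plane conductivity permits relatively large time steps, which is why the heat accumulation phase unfolds over hundreds of seconds.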
The physical properties of battery components, cold plates, and coolant are crucial for accurate simulation. Below is a summary table of material properties used in our model, which are assumed constant for simplicity:
| Component | Density (kg/m³) | Specific Heat (J/kg·K) | Thermal Conductivity (W/m·K) | Dynamic Viscosity (Pa·s) |
|---|---|---|---|---|
| Battery Core | 2000 | 1100 | X=1, Y=21, Z=21 | – |
| Positive Electrode | 2719 | 871 | 202.4 | – |
| Negative Electrode | 8978 | 381 | 387.6 | – |
| Cold Plate | 2707 | 903 | 237.0 | – |
| Coolant (50% Glycol) | 1073 | 3281 | 0.42 | 3.9×10⁻⁵ |
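As a rough consistency check (ours, not a result from the study), a lumped adiabatic cell driven by the thermal-runaway source \( q(t) = 832t + 259000 \) W/m³ should reach t1 = 128°C somewhat sooner than any cooled simulation. Since both the integrated source and the heat capacity are taken per unit volume, the cell volume cancels and the crossing time solves a quadratic:

```python
import math

# Adiabatic (no cooling) lower-bound estimate of the time to reach t1,
# per unit volume: integral of q dt = 416*t^2 + 259000*t  [J/m^3]
# must equal rho*cp*(T1 - T_ambient)  [J/m^3].

rho, cp = 2000.0, 1100.0            # battery core, from the table above
dT = 128.0 - 30.0                   # rise from 30 C ambient to t1, K
a, b, c = 416.0, 259000.0, -rho * cp * dT

t_adiabatic = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
```

The quadratic gives roughly 470 s. That it falls below the 556 s obtained later with active cooling at 2 m/s is the expected ordering: cooling can only lengthen the heat accumulation phase.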
We first analyze a conventional liquid cooling configuration commonly used in EV battery packs. This design features a single, elongated cooling channel embedded in a bottom-mounted cold plate. The battery pack consists of 104 cells arranged in 26 columns and 4 rows, with each cell measuring 174 mm in length, 65 mm in thickness, and 207 mm in height. The coolant, a 50% ethylene glycol-water mixture, enters at one end and exits at the other, flowing through the entire channel. To simulate thermal runaway, we designate cell 63 (located at row 3, column 16) as the abnormal heat source, applying the time-dependent heat generation formula. Other cells operate under normal discharge conditions with a heat generation rate of 10,475 W/m³. Boundary conditions include a velocity inlet, pressure outlet, and natural convection with ambient air at 30°C and a heat transfer coefficient of 5 W/(m²·K). The SST k-ω turbulence model is employed for fluid flow resolution.
Simulations with an inlet coolant velocity of 2 m/s reveal that cell 63 reaches the critical temperature t1 (128°C) in 556 seconds. The temperature distribution shows a hotspot around cell 63, with heat spreading to adjacent columns. To delay this, we increase the overall coolant flow rate. However, as summarized in the table below, even at 8 m/s the time to reach t1 only extends to 561.35 seconds, a mere 5.35-second improvement over the 2 m/s case. This marginal gain comes at a high cost: excessive coolant consumption and potential overcooling of normal cells, which could degrade pack lifespan. Moreover, the pressure drop in the single channel escalates sharply with flow rate, reaching 135,899.62 Pa at 8 m/s, indicating significant pumping energy expenditure.
| Inlet Velocity (m/s) | Time to Reach t1 (s) | Coolant Flow Rate (m³/min) | Pressure Drop (Pa) |
|---|---|---|---|
| 2 | 556.00 | 0.15 | 8,649.31 |
| 3 | 558.25 | 0.23 | 19,450.50 |
| 5 | 560.00 | 0.38 | 54,120.75 |
| 8 | 561.35 | 0.60 | 135,899.62 |
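The pressure-drop column follows the quadratic velocity scaling expected for this flow regime (\( \Delta p \propto v^2 \)); a quick check of the tabulated values against the 2 m/s baseline (our arithmetic, not an additional simulation):

```python
# Verify that the tabulated pressure drops scale approximately with the
# square of inlet velocity, relative to the 2 m/s baseline.

data = {2: 8649.31, 3: 19450.50, 5: 54120.75, 8: 135899.62}  # m/s -> Pa
base_v, base_dp = 2, data[2]

# Each ratio should be close to 1 if dp ~ v^2 holds.
ratios = {v: dp / base_dp / (v / base_v) ** 2 for v, dp in data.items()}
```

All ratios land within a few percent of unity, confirming that quadrupling the velocity costs roughly sixteen times the pressure drop while buying only seconds of delay.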
The limitations of the conventional approach necessitate an optimized cooling strategy for EV battery packs. We propose a dual cold plate system with independent, parallel cooling channels. This design includes a bottom cold plate with 26 rectangular channels (12 mm × 34 mm) aligned with each column of cells, and vertical cold plates positioned between adjacent columns, comprising 25 circular channels (10 mm diameter) in total, one per inter-column gap. Each channel is equipped with a proportional valve for individual flow control, allowing targeted cooling of abnormal cells without affecting others. In normal operation, all channels receive a baseline flow. When a cell, such as cell 63, shows signs of thermal runaway, its corresponding channels in both the bottom and vertical plates receive increased flow via valve adjustment. This modular approach enhances cooling efficiency by reducing channel length (minimizing coolant pre-heating) and containing heat spread through the vertical plates.
For the optimized EV battery pack, we set baseline inlet velocities at 0.50 m/s for channels cooling normal cells and 0.70 m/s for channels cooling cell 63. The vertical plates specifically mitigate lateral heat diffusion, protecting adjacent cells. Simulation results demonstrate a substantial improvement: cell 63 now reaches t1 after 604.25 seconds, extending the heat accumulation phase by 42.90 seconds compared to the best conventional case. Additionally, the total coolant consumption is reduced to 0.53 m³/min, an 11.67% saving. Pressure drops in individual channels remain low, as shown below, indicating lower pumping power requirements and enhanced energy efficiency for the EV battery pack thermal management system.
| Channel Type | Inlet Velocity (m/s) | Pressure Drop (Pa) |
|---|---|---|
| Bottom Plate (Normal Cell) | 0.50 | 101.28 |
| Vertical Plate (Normal Cell) | 0.50 | 210.92 |
| Bottom Plate (Abnormal Cell 63) | 0.70 | 356.69 |
| Vertical Plate (Abnormal Cell 63) | 0.70 | 788.36 |
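The pumping-power implication can be estimated from the ideal hydraulic power \( P = \Delta p \cdot Q \) (pump efficiency ignored). For the optimized case we conservatively charge the entire 0.53 m³/min flow with the worst single-channel drop, which still yields an upper bound far below the conventional figure. This comparison is our estimate, not a reported result:

```python
# Ideal hydraulic pumping power P = dp * Q, with Q converted to m^3/s.

def pump_power_w(dp_pa: float, flow_m3_per_min: float) -> float:
    """Ideal pumping power in watts, ignoring pump efficiency."""
    return dp_pa * flow_m3_per_min / 60.0

# Conventional design: single channel at 8 m/s, reported system drop.
p_conventional = pump_power_w(135899.62, 0.60)

# Optimized design (upper bound): total flow charged at the worst
# per-channel drop, 788.36 Pa in the abnormal-cell vertical channel.
p_optimized = pump_power_w(788.36, 0.53)
```

Even this pessimistic bound for the optimized design is on the order of watts, versus over a kilowatt for the conventional design at 8 m/s.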
The temperature distribution in the optimized EV battery pack shows a confined hotspot around cell 63, with minimal thermal propagation to neighboring columns. This is attributed to the vertical cold plates acting as thermal barriers. The extended heat accumulation time provides a larger safety window for fault detection and intervention, crucial for EV battery pack operations in dynamic environments. Furthermore, by avoiding overcooling of normal cells, the optimized system promotes uniform aging and prolongs the overall pack lifetime. The use of proportional valves enables adaptive control, making this strategy scalable for various EV battery pack configurations and thermal scenarios.
From an engineering perspective, the heat transfer enhancement can be quantified using the Nusselt number correlation for forced convection. For a channel flow, the average Nusselt number is given by:
$$ Nu = \frac{h D_h}{\lambda_c} $$
where \( D_h \) is the hydraulic diameter and \( \lambda_c \) is the coolant thermal conductivity. In our optimized design, the shorter channels limit coolant pre-heating and keep the flow thermally developing over a larger fraction of each channel, sustaining higher local heat transfer coefficients. Additionally, the overall thermal resistance of the EV battery pack can be modeled as a series-parallel network. The total resistance \( R_{total} \) includes conduction through cell materials and convection to coolant:
$$ R_{total} = R_{cond} + R_{conv} $$
where \( R_{cond} = \frac{L}{\lambda A} \) and \( R_{conv} = \frac{1}{h A} \), with \( L \) being thickness and \( A \) being area. The optimized system lowers \( R_{conv} \) via targeted flow increase, while maintaining moderate \( R_{cond} \).
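To make the resistance network concrete, the following evaluates \( R_{cond} \) and \( R_{conv} \) for a single cell's bottom face. The study does not report the coolant-side coefficient \( h \) or the exact heat-path geometry, so the numbers below are illustrative assumptions:

```python
# Series thermal resistance R_total = R_cond + R_conv for one cell's bottom
# face. Geometry from the pack description; h = 500 W/m^2.K is an assumed,
# illustrative coolant-side coefficient (not reported in the study).

L = 0.065 / 2          # m, half the cell thickness (heat path to cooled face)
lam = 1.0              # W/m.K, battery core through-plane conductivity
A = 0.174 * 0.207      # m^2, cell footprint on the bottom cold plate
h = 500.0              # W/m^2.K, assumed convection coefficient

R_cond = L / (lam * A)   # conduction resistance, K/W
R_conv = 1.0 / (h * A)   # convection resistance, K/W
R_total = R_cond + R_conv
```

With these assumptions conduction dominates (roughly 0.9 K/W versus 0.06 K/W), which is consistent with the earlier finding that simply raising the coolant flow, i.e. lowering \( R_{conv} \) further, yields diminishing returns.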
Comparative analysis highlights the efficiency gains. The performance metrics are summarized below, emphasizing the benefits for EV battery pack safety and energy consumption:
| Metric | Conventional Design (8 m/s) | Optimized Design | Improvement |
|---|---|---|---|
| Time to t1 (s) | 561.35 | 604.25 | +42.90 s (7.64%) |
| Coolant Flow Rate (m³/min) | 0.60 | 0.53 | -0.07 m³/min (11.67%) |
| Max Pressure Drop (Pa) | 135,899.62 | 788.36 | ~99.4% reduction |
| Thermal Spread Containment | Moderate | High | Enhanced |
In practical applications, such as in electric vehicles, the optimized cooling system can integrate with battery management systems (BMS) for real-time monitoring and control. By detecting temperature anomalies early, the BMS can adjust proportional valves to direct coolant flow, potentially preventing thermal runaway altogether. This proactive approach is vital for high-performance EV battery packs subjected to rapid charging or extreme ambient conditions. Moreover, the modular design facilitates maintenance and scalability, allowing customization for different EV models or energy storage installations.
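A hypothetical sketch of the valve logic described above: a BMS loop that raises the proportional-valve setpoint for any cell whose temperature exceeds an anomaly threshold. The function name and the threshold are illustrative assumptions; the study does not specify a detection criterion:

```python
# Hypothetical BMS valve-control sketch. Velocities follow the baseline
# values used in the optimized simulation (0.50 m/s normal, 0.70 m/s for
# the abnormal cell's channels); the 60 C alarm threshold is assumed.

BASELINE_V = 0.50   # m/s, normal-cell channel inlet velocity
BOOSTED_V = 0.70    # m/s, channel velocity for a suspect cell
ALARM_C = 60.0      # assumed anomaly threshold, well below t1 = 128 C

def channel_setpoints(cell_temps_c: dict[int, float]) -> dict[int, float]:
    """Return an inlet-velocity setpoint for each cell's cooling channel."""
    return {
        cell: (BOOSTED_V if temp >= ALARM_C else BASELINE_V)
        for cell, temp in cell_temps_c.items()
    }

# Example: cell 63 running hot triggers only its own channel.
setpoints = channel_setpoints({62: 35.0, 63: 72.0, 64: 36.0})
```

Because each channel has its own valve, the boost is local: neighbors keep their baseline flow, avoiding the overcooling penalty of the conventional single-channel design.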
Future work could explore multi-physics simulations coupling electrochemical-thermal models to better predict heat generation under various loads. Additionally, experimental validation on a prototype EV battery pack would strengthen the findings. The principles developed here—targeted cooling, independent channel control, and dual-plate geometry—are transferable to other battery chemistries and form factors, underscoring their versatility in advancing EV battery pack technology.
In conclusion, thermal runaway remains a critical challenge for lithium-ion batteries, especially in dense configurations like EV battery packs. Through numerical simulation, we have demonstrated that conventional single-channel liquid cooling is inefficient for delaying heat accumulation. Our optimized design, featuring independent channels with proportional valves and dual cold plates, significantly extends the time to critical temperature by 42.90 seconds while reducing coolant consumption by 11.67% and minimizing pressure drops. This strategy not only enhances safety by containing thermal spread but also improves energy efficiency and battery longevity. As the EV industry evolves, such intelligent thermal management systems will be indispensable for reliable and sustainable energy storage solutions.