Against the backdrop of growing fossil energy shortages and environmental pollution, the large-scale adoption of sustainable green transportation in the automotive sector has become a crucial measure for energy conservation and emission reduction, and new energy vehicles have consequently developed rapidly in recent years. Lithium-ion batteries are widely used in the new energy automotive industry owing to their high energy density, long cycle life, and mature technology. However, lithium-ion batteries generate heat during operation; their optimal operating temperature range is 25°C to 40°C, and the temperature difference within the battery pack should be kept below 5°C. When the battery temperature deviates from this range, problems such as lithium dendrite growth and an unstable solid electrolyte interphase (SEI) can arise, compromising battery safety and lifespan. Temperature non-uniformity within the battery pack degrades overall performance and longevity, while high temperatures accelerate battery aging and can even trigger thermal runaway, posing serious risks to passenger safety. The thermal management of lithium-ion batteries is therefore directly tied to their safety, performance, and lifespan, and an appropriate Battery Thermal Management System (BTMS) is needed to monitor and regulate the battery pack temperature and ensure safe, stable, and efficient operation.
Depending on the cooling medium, battery cooling methods primarily include air cooling, liquid cooling, heat pipe cooling, and Phase Change Material (PCM) cooling. Air cooling dissipates heat by passing airflow over the battery surface; it is simple and inexpensive, but its cooling capacity is limited, making it suitable mainly for early-generation new energy vehicles. Heat pipe cooling relies on the phase change of a working medium inside the heat pipe to conduct heat; it offers good cooling efficiency but suffers from structural complexity and high cost. PCM cooling uses the energy stored and released during phase change to hold the battery pack within the optimal temperature range and effectively improves temperature uniformity, but it is hampered by poor thermal conductivity, unstable performance, and limited commercial maturity. Liquid cooling uses a liquid coolant that exchanges heat with the lithium-ion battery through direct or indirect contact, and is accordingly categorized into direct contact and indirect contact liquid cooling. Although structurally more complex, liquid cooling offers high efficiency and broad applicability, making it the mainstream cooling method in current applications.

In indirect contact liquid cooling, heat exchange is achieved by placing liquid cooling plates on the battery surface. This approach is the primary cooling method for major automotive manufacturers and a key focus of current research. Because the effectiveness of liquid cooling hinges on increasing the effective cooling area, recent domestic and international studies have concentrated on cooling channel design, liquid cooling plate structural optimization, and coolant channel parameter optimization. For instance, research has examined the effects of coolant mass flow rate, the number of liquid cooling plates, channel distribution, and coolant flow direction on cooling performance, demonstrating that rational coolant flow direction and liquid cooling plate placement can significantly enhance the heat dissipation of the BTMS. Other studies have designed square spiral ring-shaped liquid cooling plates and investigated the impact of the number of channel turns, channel width, and bending radius on BTMS cooling performance and pressure drop. Improved liquid cooling plates based on traditional serpentine channels have been compared, with further analysis of the effects of channel count, width, angle, and spacing on heat transfer and pressure drop characteristics. Designs with coolant inlets on the same side versus opposite sides have been explored to study the influence of coolant flow direction on cooling performance, revealing that although the differences in maximum battery temperature are small, opposite-side inlets yield better temperature uniformity and more pronounced cooling performance at high discharge rates. Honeycomb-structured liquid cooling plates with dense channels that greatly increase the effective cooling area have been designed, with studies of the honeycomb structural parameters' effects on cooling performance and optimal designs identified through comparative experiments. Topology optimization methods have also been employed to design novel cooling channels that outperform traditional serpentine channels.

However, the liquid cooling performance of battery packs under high-temperature, high-discharge-rate conditions remains a challenge. This paper therefore proposes an innovative corrugated liquid cooling BTMS. The accuracy and reliability of the research methodology are first validated through experiments. The effects of channel depth and liquid cooling plate wall thickness on system cooling performance and pressure drop are then investigated, and the liquid cooling plate structure is optimized. Finally, for the optimized structure, the specific impacts of coolant flow rate and temperature on system cooling performance and power consumption are analyzed, and the critical coolant flow rate and temperature that meet the cooling requirements are determined.
The geometric model in this study is based on a lithium iron phosphate battery used in a domestic pure electric vehicle; its key parameters are summarized in Table 1. The battery pack consists of 10 series-connected cells, with a thermally conductive silicone pad placed between each adjacent pair. The liquid cooling plate is made of aluminum with dimensions of 158 mm × 7 mm × 308 mm, and its internal cooling channels have a cross section of 5 mm × 4 mm. The coolant is a 50% ethylene glycol-water solution by volume. The overall layout of the battery thermal management system is shown in the accompanying figure.
Table 1. Key parameters of the battery cell

| Parameter | Value |
|---|---|
| Rated Capacity (Ah) | 40 |
| Rated Voltage (V) | 3.2 |
| Charge Cut-off Voltage (V) | 3.65 |
| Discharge Cut-off Voltage (V) | 2.3 |
| Charge Operating Temperature (°C) | 0 to 45 |
| Discharge Operating Temperature (°C) | -20 to 60 |
The heat generation rate of the battery is a critical parameter for studying the cooling performance of the battery thermal management system, but it is difficult to measure accurately in practice due to influences from internal resistance, temperature, and other factors. This paper employs the widely used Bernardi model for calculation, with the mathematical expression as follows:
$$ q = \frac{I}{V_b} \left[ (U_{ocv} - U) + T \frac{dU_{ocv}}{dT} \right] $$
where \( I \) is the discharge current in amperes (A), \( V_b \) is the volume of a single cell in cubic meters (m³), \( U_{ocv} \) is the open-circuit voltage in volts (V), \( U \) is the operating voltage in volts (V), \( T \) is the battery temperature in kelvin (K), and \( \frac{dU_{ocv}}{dT} \) is the entropy coefficient in volts per kelvin (V/K).
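As a quick illustration, the sketch below evaluates this expression in Python for a 2C discharge of the 40 Ah cell (I = 80 A). The cell volume, voltages, and entropy coefficient used here are placeholder assumptions, not measurements from this study.

```python
# Sketch: volumetric heat generation rate of a cell via the Bernardi model.
# All numerical values below are illustrative placeholders, not data from
# this study.

def bernardi_heat_rate(I, V_b, U_ocv, U, T, dUocv_dT):
    """Return q in W/m^3.

    I        : discharge current (A)
    V_b      : cell volume (m^3)
    U_ocv    : open-circuit voltage (V)
    U        : operating voltage (V)
    T        : cell temperature (K)
    dUocv_dT : entropy coefficient (V/K)
    """
    return (I / V_b) * ((U_ocv - U) + T * dUocv_dT)

# Example: 2C discharge of the 40 Ah cell, i.e. I = 80 A.
q = bernardi_heat_rate(I=80.0, V_b=1.0e-3, U_ocv=3.35, U=3.2,
                       T=303.15, dUocv_dT=-1e-4)
print(f"q = {q:.0f} W/m^3")
```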
The environmental temperature is set to 30°C, with initial temperatures of the battery, liquid cooling plate, and silicone pads consistent with the environment. Heat transfer between the entire battery thermal management system and air occurs via natural convection, with a convective heat transfer coefficient of 5 W/(m²·K). The coolant inlet is a mass flow inlet with a flow rate of 1 g/s, and the outlet is a pressure outlet with an outlet pressure of 0 Pa. The initial coolant temperature is 20°C. Thermophysical parameters of the battery and materials involved are listed in Table 2.
Table 2. Thermophysical parameters of the battery and related materials

| Parameter | Battery | Liquid Cooling Plate | Coolant (20°C) | Thermally Conductive Silicone |
|---|---|---|---|---|
| Density (kg/m³) | 2108 | 2719 | 1073.35 | 2000 |
| Specific Heat Capacity (J/(kg·K)) | 1036 | 871 | 3281 | 900 |
| Thermal Conductivity (W/(m·K)) | \( K_x = K_y = 29.1 \), \( K_z = 1.05 \) | 202.4 | 0.38 | 2 |
| Dynamic Viscosity (Pa·s) | – | – | 0.00394 | – |
Because the grid count affects calculation accuracy, grid independence verification is essential. Under the aforementioned boundary conditions and a 2C discharge rate, the maximum temperature, average temperature, and maximum temperature difference of the battery pack are used as evaluation indicators, and six grid counts are tested: 501,758; 697,020; 1,064,301; 1,343,112; 1,559,021; and 1,714,179. The results show that once the grid count reaches 1,343,112 these indicators stabilize with minimal fluctuation, so this scheme is selected for the subsequent calculations as a balance between precision and computational cost.
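A minimal sketch of this selection logic follows: choose the coarsest mesh whose key indicator changes negligibly under further refinement. The grid counts are those of the study; the Tmax values are illustrative placeholders.

```python
# Sketch of the grid-independence selection: take the coarsest mesh whose
# key indicator changes negligibly under further refinement. Grid counts
# are those of the study; the Tmax values are illustrative placeholders.

cells = [501_758, 697_020, 1_064_301, 1_343_112, 1_559_021, 1_714_179]
t_max = [40.10, 39.80, 39.62, 39.55, 39.54, 39.55]  # placeholder Tmax (°C)

tol = 0.05  # acceptable Tmax change between successive refinements (°C)
for n, (coarse, fine) in zip(cells, zip(t_max, t_max[1:])):
    if abs(fine - coarse) < tol:
        print(f"grid-independent at {n:,} cells "
              f"(dTmax = {abs(fine - coarse):.2f} °C)")
        break
```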
To validate the accuracy and reliability of the research methodology, experimental studies on the thermal characteristics of a single lithium-ion battery are conducted using a constant temperature and humidity chamber, a battery charge-discharge device, a data recorder, and temperature sensors. The experimental results agree with the numerical calculations to within a maximum error of 2%, confirming that the method is reliable and accurate for studying BTMS thermal performance.
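For reference, a minimal sketch of this kind of comparison, computing the maximum relative error between measured and simulated temperatures; the arrays below are illustrative placeholders, not the experimental data.

```python
# Minimal sketch of the validation comparison: maximum relative error
# between measured and simulated cell temperatures. The arrays below are
# illustrative placeholders, not the experimental data.

import numpy as np

T_exp = np.array([30.0, 32.1, 34.0, 35.8, 37.2])  # placeholder (°C)
T_sim = np.array([30.0, 32.4, 34.5, 36.2, 37.6])  # placeholder (°C)

rel_err = np.abs(T_sim - T_exp) / T_exp
print(f"max relative error: {rel_err.max():.1%}")  # study reports <= 2%
```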
To investigate the effect of channel depth on BTMS performance, the channel width is fixed at 5 mm, the liquid cooling plate wall thickness at 1.5 mm, the coolant flow rate at 1 g/s, the coolant temperature at 20°C, and the environmental temperature at 30°C, while six channel depths are tested: 3 mm, 4 mm, 5 mm, 6 mm, 7 mm, and 8 mm. Under a 2C discharge rate, the battery pack temperatures at the end of discharge and the BTMS pressure drop are analyzed as functions of channel depth. As channel depth increases, the maximum and average temperatures of the battery pack decrease by similar amounts, while the maximum temperature difference and system pressure drop also decrease, though with diminishing returns. At the initial channel depth of 4 mm, the battery pack's maximum temperature, average temperature, maximum temperature difference, and system pressure drop are 39.55°C, 38.8°C, 3.03°C, and 214 Pa, respectively. When the channel depth increases to 8 mm, these values fall by 0.47°C, 0.42°C, 0.44°C, and 156 Pa, respectively. The results indicate that increasing channel depth improves cooling performance and reduces power consumption, with the reduction in power consumption being the most pronounced effect.
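The pressure drop trend can be rationalized with a simple laminar duct estimate: at fixed mass flow, a deeper channel has a larger cross section and hydraulic diameter, so the mean velocity and friction losses fall. The sketch below uses the coolant properties from Table 2 and the Darcy-Weisbach relation with the circular-duct friction factor f = 64/Re as a rough approximation (the exact laminar constant for a rectangular duct depends on aspect ratio); the channel length is an assumed placeholder and bends and manifolds are ignored.

```python
# Laminar estimate of channel pressure drop vs. depth at fixed mass flow.
# Coolant properties from Table 2; channel length L is a placeholder and
# f = 64/Re is a rough circular-duct approximation.

rho = 1073.35   # coolant density (kg/m^3), Table 2
mu = 0.00394    # coolant dynamic viscosity (Pa·s), Table 2
m_dot = 1e-3    # coolant mass flow rate, 1 g/s
w = 5e-3        # channel width (m)
L = 1.0         # straight-channel length (m), placeholder

for depth_mm in (4, 8):
    d = depth_mm * 1e-3
    A = w * d                          # flow cross-sectional area
    D_h = 4 * A / (2 * (w + d))        # hydraulic diameter
    v = m_dot / (rho * A)              # mean velocity
    Re = rho * v * D_h / mu            # Reynolds number (laminar here)
    dp = (64 / Re) * (L / D_h) * 0.5 * rho * v**2  # Darcy-Weisbach
    print(f"depth {depth_mm} mm: Re ~ {Re:.0f}, dp ~ {dp:.0f} Pa")
```

With these inputs the estimate drops from roughly 300 Pa at 4 mm depth to under 100 Pa at 8 mm, qualitatively consistent with the reported fall from 214 Pa to 58 Pa.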
The cooling performance of the battery thermal management system is also influenced by the wall thickness of the liquid cooling plate. With a channel width of 5 mm, channel depth of 4 mm, coolant flow rate of 1 g/s, coolant temperature of 20°C, and environmental temperature of 30°C, six wall thicknesses are tested: 1 mm, 1.5 mm, 2 mm, 3 mm, 4 mm, and 5 mm. As the wall thickness increases, the maximum and average temperatures of the battery pack decrease, and the maximum temperature difference also decreases, though with progressively smaller reductions. Compared with the initial wall thickness of 1.5 mm, increasing the thickness to 5 mm reduces the maximum temperature, average temperature, and maximum temperature difference of the battery pack by 0.87°C, 0.71°C, and 0.96°C, respectively. The results show that a thicker liquid cooling plate wall noticeably enhances the cooling performance of the BTMS. Because the cooling channels themselves are unchanged, the system pressure drop remains constant at 214 Pa.
From the above analyses, increasing channel depth or liquid cooling plate wall thickness leads to better cooling performance for the BTMS, with power consumption also decreasing as channel depth increases. Therefore, the optimized liquid cooling plate has a channel depth of 8 mm and a wall thickness of 5 mm. Performance comparison between the optimized and initial structures is summarized in Table 3.
Table 3. Performance comparison between the initial and optimized structures

| Structure | Maximum Temperature (°C) | Average Temperature (°C) | Maximum Temperature Difference (°C) | System Pressure Drop (Pa) |
|---|---|---|---|---|
| Initial Structure | 39.55 | 38.8 | 3.03 | 214 |
| Optimized Structure | 38.37 | 37.8 | 1.95 | 58 |
The optimized structure exhibits superior cooling performance and lower power consumption than the initial structure: the maximum temperature, average temperature, maximum temperature difference, and system pressure drop of the battery pack are reduced by 1.18°C, 1.0°C, 1.08°C, and 156 Pa, respectively. This highlights the importance of structural adjustments in battery thermal management system design.
The coolant flow rate directly affects the cooling performance and system pressure drop of the BTMS. To study this effect and determine the critical coolant flow rate that meets the cooling requirements, the optimized structure is used with an environmental temperature of 30°C and a coolant temperature of 20°C, and coolant flow rates of 0.4 g/s, 0.6 g/s, 0.8 g/s, 1 g/s, and 1.2 g/s are tested. Under a 2C discharge rate, the battery pack temperatures and the BTMS pressure drop are analyzed for each flow rate. As the coolant flow rate increases, the maximum and average temperatures of the battery pack decrease, while the maximum temperature difference and system pressure drop gradually increase, although the growth in the maximum temperature difference slows. The results indicate that increasing the coolant flow rate effectively lowers battery temperature but raises the maximum temperature difference and system pressure drop, so these factors must be balanced when selecting a flow rate. At a coolant temperature of 20°C, the critical coolant flow rate meeting the cooling requirements is 0.6 g/s, where the battery pack's maximum temperature, average temperature, maximum temperature difference, and system pressure drop are 39.96°C, 39.41°C, 1.8°C, and 35 Pa, respectively.
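The selection logic amounts to picking the smallest flow rate that keeps the pack below the 40°C upper bound of the optimal range and under the 5°C temperature-difference limit. A minimal sketch follows; the 0.6 g/s and 1 g/s rows reproduce values reported above, while the remaining rows are illustrative placeholders.

```python
# Sketch: choose the critical (minimum) coolant flow rate satisfying the
# cooling targets Tmax < 40 °C and dT < 5 °C. The 0.6 g/s and 1 g/s rows
# reproduce values reported in the text; the other rows are placeholders.

TMAX_LIMIT, DT_LIMIT = 40.0, 5.0  # °C

# (flow rate g/s, Tmax °C, dT °C, pressure drop Pa)
cases = [
    (0.4, 40.90, 1.70, 22),   # placeholder
    (0.6, 39.96, 1.80, 35),   # reported in text
    (0.8, 39.00, 1.90, 46),   # placeholder
    (1.0, 38.37, 1.95, 58),   # reported in text (Table 3)
    (1.2, 37.90, 2.00, 72),   # placeholder
]

critical = min((c for c in cases if c[1] < TMAX_LIMIT and c[2] < DT_LIMIT),
               key=lambda c: c[0])
print(f"critical flow rate: {critical[0]} g/s "
      f"(Tmax = {critical[1]} °C, dp = {critical[3]} Pa)")
```

The same selection logic applies to the coolant temperature study that follows, with temperature as the swept variable.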
The coolant temperature also significantly affects the cooling performance and system pressure drop of the BTMS. To investigate this further and determine the critical coolant temperature that meets the cooling requirements, the optimized structure is used with an environmental temperature of 30°C and a coolant flow rate of 1 g/s, and five coolant temperatures are tested: 10°C, 15°C, 20°C, 25°C, and 30°C. Under a 2C discharge rate, the battery pack temperatures and the BTMS pressure drop are calculated for each coolant temperature. As the coolant temperature increases, the maximum and average temperatures of the battery pack rise and the maximum temperature difference decreases; the coolant's dynamic viscosity also falls with temperature, reducing the BTMS pressure drop. The results show that raising the coolant temperature effectively reduces the maximum temperature difference and the BTMS pressure drop but elevates the battery temperature. When the coolant temperature rises from 10°C to 30°C, the maximum and average temperatures of the battery pack increase by 5.17°C and 5.42°C, respectively. At a coolant flow rate of 1 g/s, the critical coolant temperature meeting the cooling requirements is approximately 25°C, where the battery pack's maximum temperature, average temperature, maximum temperature difference, and system pressure drop are 39.67°C, 39.16°C, 1.69°C, and 50 Pa, respectively.
In conclusion, this paper proposes a corrugated liquid cooling battery thermal management system, investigates the effects of channel depth and liquid cooling plate wall thickness on BTMS performance, and optimizes the liquid cooling plate structure. For the optimized structure, the impacts of coolant flow rate and temperature on BTMS cooling performance and power consumption are analyzed. The key findings are as follows:

1. Increasing channel depth or liquid cooling plate wall thickness reduces the maximum temperature, average temperature, and maximum temperature difference of the battery pack; the system pressure drop additionally decreases as channel depth increases.
2. After structural optimization, the maximum temperature, average temperature, maximum temperature difference, and system pressure drop of the battery pack are reduced by 1.18°C, 1.0°C, 1.08°C, and 156 Pa, respectively, indicating that appropriate structural adjustments can significantly improve both the cooling performance and the power consumption of the BTMS.
3. Increasing coolant flow rate or decreasing coolant temperature lowers the maximum and average temperatures of the battery pack but increases the maximum temperature difference and system pressure drop, so practical applications must balance cooling performance against power consumption when selecting the coolant flow rate and temperature.
4. At a coolant temperature of 20°C, the critical coolant flow rate meeting the cooling requirements is 0.6 g/s; at a coolant flow rate of 1 g/s, the critical coolant temperature is approximately 25°C.

These insights contribute to the design and optimization of battery thermal management systems, ensuring efficient and safe operation of lithium-ion batteries in demanding conditions.
The battery management system (BMS) plays a pivotal role in monitoring and controlling the thermal state of lithium-ion batteries. By integrating advanced cooling strategies like the corrugated liquid cooling channel, the BMS can effectively regulate temperature distribution, prevent thermal runaway, and extend battery lifespan. Future work could explore adaptive control algorithms within the BMS to dynamically adjust coolant flow and temperature based on real-time thermal loads, further optimizing energy efficiency and performance. Additionally, coupling the BMS with predictive maintenance models could enhance reliability and safety in electric vehicle applications. Continued research in battery thermal management systems is essential for advancing sustainable transportation and meeting the growing demands of the new energy automotive industry.
