In the pursuit of sustainable transportation, the shift toward new energy vehicles, particularly those powered by lithium-ion batteries, has accelerated rapidly. As a researcher focused on thermal management, I recognize that the performance, safety, and longevity of these batteries depend critically on maintaining optimal operating conditions. The battery management system (BMS) plays a pivotal role in this regard, with thermal management as a core component. Among the various cooling strategies, liquid cooling has emerged as a highly effective solution owing to its superior heat dissipation capability and compact design. However, traditional liquid-cooled plates, especially those with serpentine flow channels, often suffer from non-uniform cooling and high pressure drops, which compromise the efficiency of the battery management system. In this study, we address these challenges by proposing a design methodology that combines structural optimization of serpentine channels with topology optimization, aiming to enhance the thermal performance and fluid dynamics of cooling plates for battery management systems.
The importance of an efficient battery management system cannot be overstated. Lithium-ion batteries operate best within a temperature range of 20–40°C, with temperature variations across a battery pack ideally kept below 5°C to prevent thermal runaway and degradation. In my work, I focus on improving the liquid cooling aspect of the BMS, as it is widely adopted in the industry for its ability to handle high heat loads. Traditional serpentine flow channels, while offering increased surface area for heat transfer, are often designed empirically, leading to inefficiencies such as redundant bends and high fluid resistance. This not only reduces cooling uniformity but also increases the energy consumption of the battery management system. Therefore, my objective is to develop an optimized serpentine cooling plate that leverages advanced computational methods to achieve better thermal equilibrium and lower pressure drops, ultimately contributing to a more reliable battery management system.
To begin, I established a theoretical foundation based on thermal simulation. The heart of this analysis lies in modeling the heat generation of lithium-ion batteries. We employ the Bernardi heat generation model, which is widely used in BMS research to estimate the heat production rate during battery operation. The model is expressed as:
$$Q = I \left( (U_{ocv} - U) + T \frac{\partial U_{ocv}}{\partial T} \right)$$
where \(Q\) is the heat generation rate in watts, \(I\) is the operating current in amperes, \(U_{ocv}\) is the open-circuit voltage, \(U\) is the terminal voltage, and \(T\) is the temperature. For a lithium iron phosphate battery with a capacity of 30 Ah, we derived the heat flux under different discharge rates, which is crucial for simulating thermal loads in the battery management system. Under adiabatic conditions, all generated heat is absorbed by the cell, so the temperature rise rate is directly proportional to the heat generation rate. Fitting experimental temperature-rise data yields:
$$\frac{1}{I} \frac{dT}{dt} = 4.4354 \times 10^{-6} I + 1.426 \times 10^{-4}$$
From this, we calculated the equivalent specific heat capacity and derived the heat generation power. The volumetric heat generation rate \(q\) and surface heat flux \(q_1\) then follow from the operating current \(I\) (for example, \(I = 90\) A at a 3C discharge rate):
$$q = \frac{Q}{V} = 6.768I^2 + 223I$$
$$q_1 = \frac{Q}{L \times K} = 6.768I^2 H + 223IH$$
where \(V\) is the battery volume, and \(L\), \(K\), and \(H\) are the length, width, and height, respectively. These values serve as inputs for our simulations, ensuring that the battery management system is tested under realistic conditions. Table 1 summarizes the heat generation parameters for different discharge rates, which are essential for designing the cooling plate in the BMS.
| Discharge Rate (C) | Volumetric Heat Generation Rate (W/m³) | Surface Heat Flux (W/m²) |
|---|---|---|
| 1 | 12,781 | 141 |
| 2 | 37,745 | 415 |
| 3 | 74,891 | 824 |
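As a quick sanity check of Table 1, the volumetric column can be reproduced directly from the fitted expression for \(q\). The short sketch below does this for the 30 Ah cell; the cell thickness \(H \approx 11\) mm used for the surface flux is inferred from the ratio of the two columns and is an assumption, not a value reported above.

```python
# Sanity check of Table 1: volumetric heat generation q = 6.768*I^2 + 223*I (W/m^3)
# for a 30 Ah lithium iron phosphate cell. The cell thickness H is not stated in
# this section; H ~= 0.011 m is inferred from the ratio q1/q in Table 1.
CAPACITY_AH = 30.0
H = 0.011  # m, assumed cell thickness (inferred, not given in the text)

for c_rate in (1, 2, 3):
    current = c_rate * CAPACITY_AH                 # operating current I in A
    q_vol = 6.768 * current**2 + 223.0 * current   # volumetric heat generation, W/m^3
    q_surf = q_vol * H                             # surface heat flux q1 = q * H, W/m^2
    print(f"{c_rate}C: q = {q_vol:,.0f} W/m^3, q1 = {q_surf:,.0f} W/m^2")
# Expected: ~12,781 / 37,745 / 74,891 W/m^3, matching Table 1.
```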
Next, I designed the serpentine cooling plate geometry. The plate measures 202 mm × 133 mm × 3 mm, corresponding to a typical battery size in electric vehicles. The flow channels are initially set to a width of 9 mm and a height of 2 mm, with 9 channels spaced 20 mm apart. Aluminum is chosen as the plate material for its high thermal conductivity, and water is used as the coolant, as is common in battery management systems. A three-dimensional model of the serpentine plate is constructed, and simplifying assumptions are made: the plate is homogeneous and isotropic, fluid properties are constant, gravity and viscous losses are negligible, and all surfaces other than those in contact with the battery are adiabatic. These assumptions align with standard practice in BMS design and keep the focus on the key thermal and fluid dynamics effects.
To ensure accurate simulations, we performed a grid independence test. Different mesh sizes were evaluated, ranging from 8 mm to 3 mm, corresponding to grid counts from 165,981 to 1,198,504. The results showed that variations in maximum temperature, temperature difference, and pressure drop were below 5% for mesh sizes of 7 mm or finer. Therefore, a mesh size of 7 mm (271,104 grids) was selected for all subsequent simulations, balancing computational efficiency and accuracy. This step is vital for reliable predictions in the battery management system context, as inaccurate meshing can lead to misleading thermal profiles.
The flow regime is determined by calculating the Reynolds number (\(Re\)), which distinguishes between laminar and turbulent flow. For the serpentine channels, the Reynolds number is given by:
$$Re = \frac{\rho u D_h}{\mu}$$
where \(\rho\) is the fluid density, \(u\) is the velocity, \(D_h\) is the hydraulic diameter, and \(\mu\) is the dynamic viscosity. In our design, \(Re\) is found to be less than 2,300, indicating laminar flow. This justifies the use of laminar flow models in our simulations, which is typical for such cooling plates in battery management systems. The physical properties of aluminum and water are listed in Table 2, as they are key inputs for the thermal and fluid simulations.
| Material | Specific Heat Capacity (J/(kg·K)) | Density (kg/m³) | Thermal Conductivity (W/(m·K)) | Dynamic Viscosity (Pa·s) |
|---|---|---|---|---|
| Water | 4,182.72 | 997.561 | 0.620271 | 8.8871 × 10⁻⁴ |
| Aluminum | 903 | 2,702 | 237 | – |
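To make the flow-regime check concrete, the sketch below evaluates the Reynolds number for the initial 9 mm × 2 mm channel using the water properties from Table 2. The inlet velocity of 0.1 m/s is an illustrative value only; the section does not state the operating velocity.

```python
# Reynolds number check for the initial 9 mm x 2 mm rectangular channel,
# using the water properties from Table 2. The inlet velocity is an assumed
# illustrative value; any velocity below about 0.6 m/s keeps Re under 2,300 here.
rho = 997.561        # kg/m^3, water density (Table 2)
mu = 8.8871e-4       # Pa*s, water dynamic viscosity (Table 2)
w, h = 9e-3, 2e-3    # m, channel width and height (initial design)
u = 0.1              # m/s, assumed inlet velocity (not given in the text)

d_h = 2 * w * h / (w + h)       # hydraulic diameter of a rectangular duct
re = rho * u * d_h / mu
print(f"D_h = {d_h*1e3:.2f} mm, Re = {re:.0f}  (laminar if Re < 2,300)")
```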
With the model established, I proceeded to optimize the serpentine flow channel structure. Traditional designs often rely on trial-and-error, but here, we employ a systematic approach. First, a single-factor analysis revealed that channel width, height, and number significantly impact cooling performance. To explore these factors efficiently, an orthogonal experimental design was adopted. We selected three factors: channel width (A), channel height (B), and channel number (C), each at four levels, as shown in Table 3. This design reduces the number of experiments while capturing the effects of each factor, which is essential for optimizing the battery management system components.
| Level | Channel Width (A) / mm | Channel Height (B) / mm | Channel Number (C) |
|---|---|---|---|
| 1 | 9 | 1.0 | 9 |
| 2 | 10 | 1.5 | 10 |
| 3 | 11 | 2.0 | 11 |
| 4 | 12 | 2.5 | 12 |
A total of 16 simulation runs were conducted based on the \(L_{16}(4^3)\) orthogonal array. The performance metrics evaluated were the maximum temperature (\(T_{max}\)), the surface temperature difference (\(\Delta T\)), and the pressure drop (\(\Delta P\)). The results are summarized in Table 4. From these data, we performed range analysis to quantify the influence of each factor. The range values indicate that channel height has the dominant effect on both \(T_{max}\) and \(\Delta P\), with channel width and channel number playing secondary roles. This insight guides the optimization process for the battery management system, as minimizing temperature and pressure drop are key goals.
| Experiment No. | A (mm) | B (mm) | C | \(T_{max}\) (°C) | \(\Delta T\) (°C) | \(\Delta P\) (Pa) |
|---|---|---|---|---|---|---|
| 1 | 9 | 1.0 | 9 | 30.637 | 1.432 | 2834.961 |
| 2 | 9 | 1.5 | 10 | 30.906 | 1.411 | 1052.112 |
| 3 | 9 | 2.0 | 11 | 31.124 | 1.418 | 545.543 |
| 4 | 9 | 2.5 | 12 | 31.356 | 1.437 | 326.283 |
| 5 | 10 | 1.0 | 10 | 30.611 | 1.433 | 2570.056 |
| 6 | 10 | 1.5 | 9 | 30.918 | 1.404 | 840.383 |
| 7 | 10 | 2.0 | 12 | 31.073 | 1.430 | 509.141 |
| 8 | 10 | 2.5 | 11 | 31.386 | 1.436 | 256.895 |
| 9 | 11 | 1.0 | 11 | 30.609 | 1.441 | 2451.876 |
| 10 | 11 | 1.5 | 12 | 30.843 | 1.434 | 939.328 |
| 11 | 11 | 2.0 | 9 | 31.115 | 1.409 | 349.669 |
| 12 | 11 | 2.5 | 10 | 31.372 | 1.433 | 207.499 |
| 13 | 12 | 1.0 | 12 | 30.696 | 1.476 | 242.978 |
| 14 | 12 | 1.5 | 11 | 30.841 | 1.433 | 779.566 |
| 15 | 12 | 2.0 | 10 | 31.045 | 1.417 | 339.183 |
| 16 | 12 | 2.5 | 9 | 31.340 | 1.427 | 168.952 |
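The range analysis can be reproduced directly from Table 4: for each factor, average the response over the four runs at each level, then take the spread (maximum minus minimum) of those level means. The sketch below does exactly this; running it shows that channel height yields by far the largest range for both \(T_{max}\) and \(\Delta P\).

```python
# Range analysis of the L16 orthogonal results in Table 4.
# Each row: (A /mm, B /mm, C, T_max /degC, dT /degC, dP /Pa)
import statistics

rows = [
    (9, 1.0, 9, 30.637, 1.432, 2834.961), (9, 1.5, 10, 30.906, 1.411, 1052.112),
    (9, 2.0, 11, 31.124, 1.418, 545.543), (9, 2.5, 12, 31.356, 1.437, 326.283),
    (10, 1.0, 10, 30.611, 1.433, 2570.056), (10, 1.5, 9, 30.918, 1.404, 840.383),
    (10, 2.0, 12, 31.073, 1.430, 509.141), (10, 2.5, 11, 31.386, 1.436, 256.895),
    (11, 1.0, 11, 30.609, 1.441, 2451.876), (11, 1.5, 12, 30.843, 1.434, 939.328),
    (11, 2.0, 9, 31.115, 1.409, 349.669), (11, 2.5, 10, 31.372, 1.433, 207.499),
    (12, 1.0, 12, 30.696, 1.476, 242.978), (12, 1.5, 11, 30.841, 1.433, 779.566),
    (12, 2.0, 10, 31.045, 1.417, 339.183), (12, 2.5, 9, 31.340, 1.427, 168.952),
]

factors = {"A (width)": 0, "B (height)": 1, "C (number)": 2}
metrics = {"T_max": 3, "dT": 4, "dP": 5}

for m_name, m_idx in metrics.items():
    for f_name, f_idx in factors.items():
        levels = sorted({r[f_idx] for r in rows})
        # Mean response at each factor level, then the range of those means.
        means = [statistics.mean(r[m_idx] for r in rows if r[f_idx] == lv)
                 for lv in levels]
        print(f"{m_name} | {f_name}: range = {max(means) - min(means):.3f}")
```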
To further refine the design, we employed a multi-objective optimization algorithm. The goal is to minimize \(T_{max}\), \(\Delta T\), and \(\Delta P\) simultaneously; these are often conflicting objectives in battery management system design. We use the Non-dominated Sorting Genetic Algorithm II (NSGA-II), a popular method for such problems. First, surrogate models are constructed from the orthogonal experimental data to approximate the relationships between the design variables and the objectives. Fitting a quadratic polynomial basis by least squares (linear in the coefficients) gives:
$$T_{max}(A,B,C) = 32.44195 + 0.00115A^2 + 0.036425B^2 + 0.00529C^2 - 0.22204A + 1.3127B - 0.31271C - 0.04611AB + 0.02389AC - 0.04145BC$$
$$\Delta T(A,B,C) = 2.17009 + 0.00179A^2 + 0.03993B^2 + 0.00157C^2 - 0.07591A - 0.10179B - 0.06296C - 0.000925AB + 0.00411AC - 0.00272BC$$
$$\Delta P(A,B,C) = -28175.31579 - 114.70430A^2 + 926.14379B^2 - 124.43973C^2 + 3906.18841A - 15656.19318B + 4441.81793C + 571.80331AB - 241.87517AC + 458.18578BC$$
The coefficient of determination (\(r^2\)) for these models exceeds 0.95, indicating good accuracy. The optimization problem is formulated as:
$$\text{minimize } F(A,B,C) = [F_1(A,B,C), F_2(A,B,C), F_3(A,B,C)]$$
subject to the constraints \(9 \leq A \leq 12\), \(1 \leq B \leq 2.5\), and \(9 \leq C \leq 12\). The NSGA-II parameters are: population size 100, 100 generations, crossover probability 0.8, and mutation probability 0.04. The Pareto optimal front reveals the trade-offs between the objectives. Analysis of the optimal solutions shows that a channel width of 12 mm, a channel height of 2 mm, and 9 channels appear frequently, so we select this combination as the optimized serpentine design. Validation against numerical simulation confirms that the surrogate errors are below 3%, as shown in Table 5. This optimized design forms the initial solution for the subsequent topology optimization, a key innovation in our approach to enhancing the battery management system.
| Performance Metric | Numerical Simulation | Surrogate Model Prediction | Error (%) |
|---|---|---|---|
| \(T_{max}\) (°C) | 31.079 | 31.058 | 0.068 |
| \(\Delta T\) (°C) | 1.413 | 1.407 | 0.461 |
| \(\Delta P\) (Pa) | 310.787 | 318.564 | 2.471 |
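For readers who want to retrace this optimization step, the sketch below encodes the three surrogate models and sets up NSGA-II with the stated parameters. The pymoo library is assumed here purely for illustration (the text does not name an implementation), the channel count is treated as a continuous variable, and the final print reproduces the surrogate-prediction column of Table 5 at the selected optimum (A = 12 mm, B = 2 mm, C = 9).

```python
# Sketch only: the fitted surrogate response surfaces driven by NSGA-II.
# pymoo (>= 0.6) is assumed for illustration; the study does not state
# which NSGA-II implementation was used.
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.core.problem import ElementwiseProblem
from pymoo.operators.crossover.sbx import SBX
from pymoo.operators.mutation.pm import PM
from pymoo.optimize import minimize

def t_max(A, B, C):
    return (32.44195 + 0.00115*A**2 + 0.036425*B**2 + 0.00529*C**2
            - 0.22204*A + 1.3127*B - 0.31271*C
            - 0.04611*A*B + 0.02389*A*C - 0.04145*B*C)

def delta_t(A, B, C):
    return (2.17009 + 0.00179*A**2 + 0.03993*B**2 + 0.00157*C**2
            - 0.07591*A - 0.10179*B - 0.06296*C
            - 0.000925*A*B + 0.00411*A*C - 0.00272*B*C)

def delta_p(A, B, C):
    return (-28175.31579 - 114.70430*A**2 + 926.14379*B**2 - 124.43973*C**2
            + 3906.18841*A - 15656.19318*B + 4441.81793*C
            + 571.80331*A*B - 241.87517*A*C + 458.18578*B*C)

class CoolingPlateProblem(ElementwiseProblem):
    """Three design variables (width A, height B, count C) and three objectives.
    C is treated as continuous here and rounded afterwards."""
    def __init__(self):
        super().__init__(n_var=3, n_obj=3,
                         xl=np.array([9.0, 1.0, 9.0]),
                         xu=np.array([12.0, 2.5, 12.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        A, B, C = x
        out["F"] = [t_max(A, B, C), delta_t(A, B, C), delta_p(A, B, C)]

algorithm = NSGA2(pop_size=100,
                  crossover=SBX(prob=0.8, eta=15),
                  mutation=PM(prob=0.04, eta=20))
res = minimize(CoolingPlateProblem(), algorithm, ("n_gen", 100),
               seed=1, verbose=False)  # res.F holds the Pareto front

# Spot-check the surrogates at the selected optimum (A = 12 mm, B = 2 mm, C = 9):
print(t_max(12, 2, 9), delta_t(12, 2, 9), delta_p(12, 2, 9))
# -> roughly 31.056 degC, 1.406 degC, 318.6 Pa (Table 5 surrogate column)
```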
With the optimized serpentine channel as a starting point, we now delve into topology optimization. This method allows for material distribution within a design domain to achieve optimal performance, going beyond parametric tweaks. For the battery management system, topology optimization can generate novel flow channel patterns that improve heat exchange and reduce pressure drop. The theoretical basis involves the variable density method, where each element’s density (\(\gamma\)) ranges from 0 (solid) to 1 (fluid). The governing equations include the dimensionless Navier-Stokes equations for fluid flow and heat transfer. The dimensionless momentum equation is:
$$(\mathbf{u}^* \cdot \nabla^*) \mathbf{u}^* = \nabla^* \cdot \left[ -p^* \mathbf{I} + \frac{1}{Re} \left( \nabla^* \mathbf{u}^* + (\nabla^* \mathbf{u}^*)^T \right) \right] - \alpha^* \mathbf{u}^*$$
where \(\mathbf{u}^*\) is the dimensionless velocity, \(p^*\) is the dimensionless pressure, \(Re\) is the Reynolds number, and \(\alpha^*\) is the dimensionless inverse permeability, interpolated via:
$$\alpha^*(\mathbf{x}) = \alpha^*_{\text{max}} \frac{q(1 - \gamma)}{q + \gamma}$$
with \(q = 0.01\) and \(\alpha^*_{\text{max}} = (1 + \frac{1}{Re}) \frac{1}{Da}\), where \(Da\) is the Darcy number, set to 0.0001. Here \(q\) is the convexity parameter of the interpolation (distinct from the heat generation rate defined earlier). The heat transfer equation is:
$$\gamma \, Re \, Pr \, (\mathbf{u}^* \cdot \nabla^*) T^* = \nabla^{*2} T^* + (1 - \gamma) h^* (1 - T^*)$$
where \(Pr\) is the Prandtl number, and \(h^*\) is the dimensionless heat transfer coefficient. To mitigate mesh dependency, we apply a Helmholtz filter and a hyperbolic tangent projection:
$$\tilde{\gamma} = \frac{\tanh(\beta (\gamma - \gamma_\beta)) + \tanh(\beta \gamma_\beta)}{\tanh(\beta (1 - \gamma_\beta)) + \tanh(\beta \gamma_\beta)}$$
The objective function combines maximizing heat exchange and minimizing fluid dissipation power:
$$J = -w_1 J_1 + w_2 J_2$$
where \(w_1\) and \(w_2\) are weight coefficients summing to 1, \(J_1 = \int_\Omega (1 - \gamma) h^* (1 - T^*) d\Omega\) represents heat exchange, and \(J_2 = \int_\Omega [\nabla^* \mathbf{u}^* : (\nabla^* \mathbf{u}^* + (\nabla^* \mathbf{u}^*)^T) + \alpha^* \mathbf{u}^* \cdot \mathbf{u}^*] d\Omega\) represents dissipation power. The optimization problem is solved with a volume fraction constraint of 0.5 for the fluid domain.
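As a concrete illustration of the two interpolation ingredients above, the sketch below implements the inverse-permeability interpolation \(\alpha^*(\gamma)\) and the hyperbolic tangent projection. The projection parameters \(\beta\) and \(\gamma_\beta\) are illustrative placeholders, since their values are not reported in this section.

```python
import numpy as np

Re = 150.0     # design-point Reynolds number used later in the study
Da = 1.0e-4    # Darcy number from the formulation
q = 0.01       # convexity parameter of the interpolation
alpha_max = (1.0 + 1.0 / Re) / Da   # dimensionless maximum inverse permeability

def alpha_star(gamma):
    """Inverse-permeability interpolation: gamma = 1 (fluid) -> 0,
    gamma = 0 (solid) -> alpha_max."""
    return alpha_max * q * (1.0 - gamma) / (q + gamma)

def tanh_projection(gamma_filtered, beta=8.0, gamma_beta=0.5):
    """Hyperbolic tangent projection applied after the Helmholtz filter.
    beta and gamma_beta are illustrative values, not reported in the text."""
    num = np.tanh(beta * (gamma_filtered - gamma_beta)) + np.tanh(beta * gamma_beta)
    den = np.tanh(beta * (1.0 - gamma_beta)) + np.tanh(beta * gamma_beta)
    return num / den
```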
We consider two initial designs: a uniform density field (uniform initial solution) and the optimized serpentine channel (serpentine initial solution). The design domain is sized 1.33L × 2.02L, with an inlet size of 0.12L. For the uniform case, convergence is achieved around 60 iterations, resulting in a branching channel structure. For the serpentine case, convergence takes about 260 iterations due to the complex geometry, but it retains aspects of the initial serpentine shape while evolving finer branches. Figure 1 illustrates the concept of topology optimization in battery management systems, showing how fluid paths can be optimized for better performance.

We explored different weight combinations (\(w_1, w_2\)) and Reynolds numbers. At \(Re = 150\) and equal weights (0.5, 0.5), the topology results show balanced channel distributions. As weights shift toward minimizing dissipation (\(w_2\) increases), channels become sparser and wider, reducing pressure drop but potentially affecting heat uniformity. Similarly, higher Reynolds numbers lead to more complex, branched structures. Based on these analyses, we select the topology-optimized design from the serpentine initial solution at \(Re = 150\) and equal weights, as it offers a good compromise for the battery management system.
The performance of the topology-optimized cooling plate is then compared with the original serpentine design. The results, summarized in Table 6, demonstrate significant improvements. The topology-optimized plate reduces the maximum temperature by 1.335%, the surface temperature difference by 1.982%, and the pressure drop by 93.476% compared to the original serpentine design. This highlights the efficacy of topology optimization in enhancing both thermal management and fluid efficiency, which are critical for the battery management system. The drastic reduction in pressure drop is particularly beneficial, as it lowers the pumping power required, thereby improving the overall energy efficiency of the BMS.
| Design | \(T_{max}\) (°C) | \(\Delta T\) (°C) | \(\Delta P\) (Pa) | Improvement in \(T_{max}\) (%) | Improvement in \(\Delta T\) (%) | Improvement in \(\Delta P\) (%) |
|---|---|---|---|---|---|---|
| Original Serpentine | 31.079 | 1.413 | 310.787 | – | – | – |
| Topology-Optimized (Serpentine Initial) | 30.664 | 1.385 | 20.28 | 1.335 | 1.982 | 93.476 |
| Topology-Optimized (Uniform Initial) | 30.671 | 1.387 | 20.31 | 1.312 | 1.840 | 93.467 |
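The improvement percentages in Table 6 follow directly from the absolute values; the snippet below reproduces them for the serpentine-initial case.

```python
# Reproduce the improvement percentages in Table 6 (serpentine-initial case).
baseline = {"T_max": 31.079, "dT": 1.413, "dP": 310.787}   # original serpentine
optimized = {"T_max": 30.664, "dT": 1.385, "dP": 20.28}    # topology-optimized

for key in baseline:
    reduction = 100.0 * (baseline[key] - optimized[key]) / baseline[key]
    print(f"{key}: {reduction:.3f}% reduction")
# Expected: about 1.335%, 1.982%, and 93.48%, matching Table 6 to rounding.
```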
To validate our simulations, we conducted experimental tests. A cooling plate was fabricated using aluminum 5052 with dimensions matching the optimized serpentine design. The experimental setup included a constant temperature bath, a precision peristaltic pump, flow meters, pressure gauges, and temperature sensors. Tests were performed at mass flow rates of 1, 2, and 3 g/s under a 3C discharge simulation. The results, compared in Table 7, show that the average temperature and pressure drop errors between simulation and experiment are within 3.42% and 6.36%, respectively. This confirms the reliability of our numerical models and the effectiveness of the optimized design for practical battery management system applications.
| Mass Flow Rate (g/s) | Simulated Avg. Temperature (°C) | Measured Avg. Temperature (°C) | Temperature Error (%) | Simulated Pressure Drop (Pa) | Measured Pressure Drop (Pa) | Pressure Drop Error (%) |
|---|---|---|---|---|---|---|
| 1 | 32.13 | 31.03 | 3.42 | 10.53 | 9.86 | 6.36 |
| 2 | 28.84 | 28.08 | 2.60 | 20.28 | 19.45 | 4.09 |
| 3 | 27.75 | 27.26 | 1.77 | 35.52 | 34.17 | 3.80 |
In discussion, the integration of structural optimization and topology optimization presents a powerful approach for advancing battery management system technology. The optimized serpentine channel, derived from orthogonal experiments and genetic algorithms, serves as a robust initial solution that captures traditional design wisdom. By applying topology optimization, we transcend conventional geometries, allowing the flow paths to evolve naturally based on physical laws. This results in a cooling plate that not only meets thermal uniformity requirements but also minimizes energy losses—a dual benefit that is highly desirable in battery management systems. The significant pressure drop reduction, exceeding 93%, is a testament to the efficiency gains achievable through this methodology. Furthermore, the use of surrogate models and multi-objective algorithms ensures that the design is Pareto-optimal, balancing competing objectives effectively. These advancements contribute to more compact, efficient, and reliable battery management systems, which are essential for the widespread adoption of electric vehicles.
Looking ahead, there are several avenues for further research. For instance, the topology optimization framework could be extended to include transient thermal analyses or multi-physics couplings, such as electro-thermal effects, to better simulate real-world battery management system operations. Additionally, exploring different coolant types or incorporating phase-change materials could yield further performance improvements. The methodology presented here is not limited to serpentine channels; it can be adapted to other cooling plate configurations, making it a versatile tool for BMS designers. As battery technologies evolve toward higher energy densities, the demand for efficient thermal management will only grow, underscoring the importance of continued innovation in this field.
In conclusion, this study demonstrates a comprehensive design and optimization strategy for serpentine cooling plates in battery management systems. We started by modeling battery heat generation and designing a baseline serpentine channel. Through orthogonal experiments and NSGA-II optimization, we identified an optimal channel geometry with a width of 12 mm, height of 2 mm, and 9 channels. This design was then used as an initial solution for topology optimization, which generated novel flow paths that significantly enhance thermal performance and reduce pressure drop. The final topology-optimized plate achieves a 1.335% reduction in maximum temperature, a 1.982% reduction in surface temperature difference, and a remarkable 93.476% reduction in pressure drop compared to the original design. Experimental validation confirms the accuracy of our simulations. This work provides a new reference for designing high-performance cooling plates, contributing to the development of more efficient and reliable battery management systems. By leveraging advanced computational techniques, we can overcome the limitations of traditional designs and pave the way for next-generation thermal management solutions in electric vehicles.
