Comprehensive Analysis of Discharge Capacity and DC Internal Resistance in High-Energy EV Battery Packs

As the global automotive industry rapidly transitions toward electrification, the performance and reliability of electric vehicle (EV) battery systems have become paramount. In recent years, EV adoption has surged, with new energy vehicles capturing a steadily growing share of new-car sales. The heart of an electric vehicle is its EV battery pack, which directly determines driving range, power output, safety, and overall longevity. Among the critical parameters defining battery health and performance are discharge capacity and internal resistance. Discharge capacity dictates the energy available for vehicle propulsion, while internal resistance, particularly DC internal resistance (DCR), serves as a key indicator of battery aging, consistency, and thermal behavior. Understanding how environmental factors, especially temperature, influence these characteristics is essential for optimizing EV battery pack design, management systems, and operational strategies. This study focuses on a high-energy ternary lithium-ion (NMC) EV battery pack system, investigating its discharge capacity, temperature rise, and DCR under varied ambient conditions. The insights gained are crucial for enhancing battery performance estimation, fault diagnosis, and lifespan prediction in real-world EV applications.

We selected a high-energy ternary lithium-ion EV battery pack system as the primary research object. This system comprises the battery pack itself, a battery management system (BMS), and an integrated thermal exchange system. The fundamental performance parameters of the EV battery pack are summarized in Table 1. These specifications highlight the pack’s design for energy density and operational range, typical of modern EV applications.

Table 1: Performance Parameters of the High-Energy Ternary Lithium-Ion EV Battery Pack
Name Unit Parameter
Product Dimensions mm 1563 × 1258 × 272
Nominal Capacity Ah 195
Nominal Energy kWh 68.8
Rated Voltage V 353.3
Discharge Rate C 1/3
Operating Temperature °C –30 to 55
Total Weight kg 376.4

The experimental setup was designed to precisely control and measure the EV battery pack behavior. The core testing platform is visually represented below, illustrating the integration of key components for comprehensive evaluation.

This platform consisted of the following major equipment: the ternary lithium EV battery pack, a power battery charge/discharge cabinet (model EVBT-1200-1000-2IOS), a walk-in environmental chamber (model TOFH-B12000FWL-5K) capable of generating precise temperature conditions, and a central computer for system control and data acquisition. The performance ranges of these critical devices are detailed in Table 2. Accurate measurement is fundamental; therefore, the error margins associated with all sensing and control equipment are documented in Table 3. This ensures the reliability and repeatability of all data presented in this analysis.

Table 2: Performance Parameters of Test Equipment
Equipment Name Performance Unit Parameter
Charge/Discharge Cabinet Voltage Range V 30 to 1200
Current Range A –1000 to 1000
Walk-in Environmental Chamber Temperature Range °C –70 to 150

Table 3: Measurement Error of Test Equipment
Parameter Unit Measurement Tool Error
Ambient Temperature °C Environmental Chamber ±2
Current A EVBT-1200-1000-2IOS 0.05% FS
Voltage V EVBT-1200-1000-2IOS 0.05% FS
Timer s EVBT-1200-1000-2IOS 0

Prior to any specific testing, a rigorous preconditioning procedure was essential to activate and stabilize the EV battery pack. This process involved multiple complete charge-discharge cycles at a standard temperature of 25°C. The charge phase used a constant current-constant voltage (CCCV) protocol: a constant current (CC) of 1/3C (approximately 65A based on nominal capacity) until any cell voltage reached 4.25V, followed by a constant voltage (CV) phase until the current dropped below 9.75A. The discharge phase was a CC discharge at 1/3C until the minimum cell voltage fell below 2.8V. This cycle was repeated. The preconditioning was considered complete when the discharge capacity change between two consecutive cycles was less than 3% of the nominal capacity. This ensured that the EV battery pack was in a reproducible and stable state for all subsequent comparative tests.
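The termination criterion for preconditioning can be sketched as a simple check. The cycle capacities below are illustrative placeholders, not measured data; only the 195 Ah nominal capacity and the 3% threshold come from the procedure above.

```python
# Sketch of the preconditioning termination check described above.
NOMINAL_CAPACITY_AH = 195.0
STABILITY_THRESHOLD = 0.03  # 3% of nominal capacity

def is_stabilized(prev_capacity_ah, curr_capacity_ah,
                  nominal_ah=NOMINAL_CAPACITY_AH,
                  threshold=STABILITY_THRESHOLD):
    """Preconditioning is complete when the discharge-capacity change
    between two consecutive cycles is below the threshold fraction
    of nominal capacity."""
    return abs(curr_capacity_ah - prev_capacity_ah) <= threshold * nominal_ah

cycles = [189.2, 192.8, 193.3, 193.4]  # hypothetical cycle capacities (Ah)
for prev, curr in zip(cycles, cycles[1:]):
    if is_stabilized(prev, curr):
        print(f"stabilized: {prev:.1f} -> {curr:.1f} Ah")
        break
```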

The discharge capacity and energy tests were conducted after preconditioning. The EV battery pack was fully charged using the CCCV method at 25°C. It was then placed in environmental chambers set to different target temperatures: 40°C, 25°C, 0°C, and –30°C. The pack was soaked until the core temperature of the cells was within 2°C of the ambient temperature. Subsequently, a constant current discharge at 1/3C rate was performed until the cutoff condition was met (cell voltage < 2.8V for temperatures ≥0°C, and < 2.1V for –30°C). The total discharged ampere-hours (Ah) and watt-hours (Wh) were recorded to determine capacity and energy at each temperature. This method directly reflects the usable energy of the EV battery pack under different climatic conditions, a critical factor for EV range estimation.
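The recorded ampere-hours and watt-hours follow from integrating the sampled current and power over time. A minimal sketch assuming trapezoidal integration over logged samples; the constant-current, constant-voltage example at the end is purely illustrative (for a constant signal the trapezoid rule is exact).

```python
# Integrating sampled current and voltage to obtain discharged
# capacity (Ah) and energy (Wh), as recorded in the test above.

def discharge_totals(time_s, current_a, voltage_v):
    """Trapezoidal integration of current (-> Ah) and power (-> Wh)."""
    ah = wh = 0.0
    for i in range(1, len(time_s)):
        dt_h = (time_s[i] - time_s[i - 1]) / 3600.0
        i_avg = (current_a[i] + current_a[i - 1]) / 2.0
        p_avg = (current_a[i] * voltage_v[i]
                 + current_a[i - 1] * voltage_v[i - 1]) / 2.0
        ah += i_avg * dt_h
        wh += p_avg * dt_h
    return ah, wh

# Illustrative: a constant 65 A (1/3C) discharge at a flat 350 V for 3 h.
t = [0, 3600, 7200, 10800]
i = [65.0] * 4
v = [350.0] * 4
ah, wh = discharge_totals(t, i, v)
print(f"{ah:.1f} Ah, {wh:.0f} Wh")  # 195.0 Ah, 68250 Wh
```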

For DC internal resistance (DCR) testing, a pulse discharge method was employed. This is a standard technique for evaluating the ohmic and rapid polarization resistance of an EV battery pack. The test procedure began by adjusting the pack’s state of charge (SOC) to 50% at 25°C. The pack was then stabilized at different test temperatures: 40°C, 25°C, 0°C, and –20°C. A high-current pulse was applied: a constant current discharge at the maximum allowable pulse current \(I_{\max}\) = 400 A for a duration of 12 seconds. Voltage and current were sampled at a high frequency (every 0.1 s). The DC internal resistance at specific time points is calculated from Ohm’s law, comparing the voltage at each time point against the voltage immediately before the pulse. The formulas for resistance at 0.1 seconds and 2 seconds are fundamental:

$$R_{0.1} = \frac{U_0 - U_{0.1}}{I_{0.1}}$$

$$R_2 = \frac{U_0 - U_2}{I_2}$$

Here, \(U_0\) is the voltage immediately before the pulse (at t = 0 s), \(U_{0.1}\) and \(I_{0.1}\) are the voltage and current at 0.1 seconds, and \(U_2\) and \(I_2\) are the values at 2 seconds. The resistance \(R_{0.1}\) captures the immediate ohmic response, while \(R_2\) includes a more stable polarization component. Monitoring DCR evolution over the pulse duration provides insights into the dynamic behavior of the EV battery pack.
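These formulas can be evaluated directly from the sampled pulse data. A minimal sketch in Python; the pre-pulse voltage, the sagged voltages, and the 400 A current below are illustrative numbers, not measurements from this test.

```python
# Pulse-DCR calculation following the R = (U0 - U_t) / I_t formulas above.

def dcr_at(t_s, time_s, voltage_v, current_a, u0):
    """DC resistance at time t_s into the pulse, per Ohm's law."""
    idx = time_s.index(t_s)
    return (u0 - voltage_v[idx]) / current_a[idx]

# Hypothetical 400 A pulse samples; u0 is the voltage just before the pulse.
u0 = 370.0
time_s = [0.1, 2.0]
voltage_v = [348.8, 345.0]   # illustrative voltage sag under load
current_a = [400.0, 400.0]

r_01 = dcr_at(0.1, time_s, voltage_v, current_a, u0)  # ohmic response
r_2 = dcr_at(2.0, time_s, voltage_v, current_a, u0)   # + polarization
print(f"R0.1 = {r_01 * 1000:.1f} mOhm, R2 = {r_2 * 1000:.1f} mOhm")
```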

The relationship between discharge voltage and discharged capacity at a constant 1/3C rate under four ambient temperatures is a key finding. The data reveals a strong dependence of performance on temperature. The delivered discharge capacities were quantitatively as follows: 193.595 Ah at 40°C, 193.436 Ah at 25°C, 185.121 Ah at 0°C, and 173.983 Ah at –30°C, corresponding to roughly 99.3%, 99.2%, 94.9%, and 89.2% of the nominal 195 Ah capacity, respectively. The capacity difference between 25°C and 40°C is negligible (only about 0.08% of nominal capacity), indicating stable high-temperature performance within this range. However, the low-temperature impact is severe: the capacity at 0°C is about 4.3% lower than at 25°C, and at –30°C it falls roughly 10.1% below the 25°C baseline (about 10.8% below nominal). This underscores the significant “range anxiety” issue for EVs in winter climates, directly linked to the reduced usable capacity of the EV battery pack.
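The retention figures follow directly from the measured capacities; a short sketch makes the two baselines (nominal capacity versus the 25°C measurement) explicit.

```python
# Capacity retention from the measured discharge capacities above,
# relative both to the 195 Ah nominal and to the 25 C measurement.
NOMINAL_AH = 195.0
measured = {40: 193.595, 25: 193.436, 0: 185.121, -30: 173.983}
baseline_ah = measured[25]

for temp_c, cap_ah in measured.items():
    vs_nominal = 100.0 * cap_ah / NOMINAL_AH
    vs_baseline = 100.0 * cap_ah / baseline_ah
    print(f"{temp_c:>4} C: {vs_nominal:.1f}% of nominal, "
          f"{vs_baseline:.1f}% of the 25 C capacity")
```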

The voltage profile during discharge further elucidates this behavior. At 25°C and 40°C, the voltage decreases smoothly and relatively linearly for most of the discharge process, with a steeper drop near the end-of-discharge cutoff. In contrast, at 0°C and especially at –30°C, the initial voltage drop is extremely sharp due to high internal resistance. Interestingly, at –30°C, a slight voltage recovery is observed after the initial plunge. This phenomenon is attributed to internal heating: the high current through the large internal resistance generates heat, which warms the battery cells, reduces the internal resistance, and consequently causes a temporary voltage rise. This self-heating effect is a crucial, albeit inefficient, mechanism in low-temperature operation of an EV battery pack. The voltage plateau thereafter is shorter and lower, leading to an earlier reach of the cutoff voltage and thus a lower total capacity.

Concurrent with capacity measurement, the temperature rise (\(\Delta T\)) of the EV battery pack during discharge was monitored. The temperature rise versus discharged capacity curves are highly informative. The results, summarized conceptually in Table 4, show a dramatic effect of ambient temperature on thermal behavior.

Table 4: Summary of Battery Pack Temperature Rise During Discharge at 1/3C Rate
Ambient Temperature (°C) Approx. Total Temperature Rise (K) Characteristics of Temperature Rise Curve
40 4.5 – 7.5 Gentle increase, low slope throughout.
25 4.5 – 7.5 Gentle increase, similar to 40°C.
0 ~15.5 Steep initial rise, then plateau, followed by a final increase.
–30 ~33.5 Very steep initial rise, significant heating throughout.

The fundamental reason is the relationship between heat generation and internal resistance. The heat generated (\(Q\)) during discharge can be approximated by \(Q = I^2 R t\), where \(I\) is the current, \(R\) is the internal resistance, and \(t\) is time. Since the internal resistance of the EV battery pack is much higher at low temperatures (as confirmed in the DCR tests), the heat generation rate is substantially higher during low-temperature discharge. At 25°C and 40°C, the internal resistance is low, so the ohmic heating is minimal, and the pack’s thermal management system can effectively dissipate the heat, resulting in a modest temperature rise. At sub-zero temperatures, the high resistance leads to intense internal heating, which actually aids in raising the cell temperature and improving performance mid-discharge, but at the cost of significant energy loss as heat rather than useful electrical work. This trade-off is central to thermal management strategies for EV battery pack systems.
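As a rough illustration of why low-temperature discharge heats the pack so much more, the \(Q = I^2 R t\) approximation can be evaluated with the 50% SOC DCR values reported in Table 5. This is an ohmic-only, order-of-magnitude estimate: reversible reaction heat, resistance variation with SOC and temperature during the discharge, and active cooling are all neglected.

```python
# Joule-heating estimate Q = I^2 * R * t for a 1/3C (65 A) discharge,
# using the DCR values from Table 5 as a fixed representative resistance.

def joule_heat_kj(current_a, resistance_mohm, duration_s):
    """Ohmic heat in kJ from Q = I^2 * R * t."""
    return current_a**2 * (resistance_mohm / 1000.0) * duration_s / 1000.0

DURATION_S = 3 * 3600  # ~3 h discharge at 1/3C
for temp_c, r_mohm in [(25, 53.1), (-20, 213.0)]:
    q = joule_heat_kj(65.0, r_mohm, DURATION_S)
    print(f"{temp_c:>4} C: R = {r_mohm} mOhm -> Q ~= {q:.0f} kJ")
```

The roughly fourfold resistance ratio translates directly into a fourfold heat-generation rate, consistent with the much larger temperature rise observed at low ambient temperatures.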

The DC internal resistance tests provide direct, quantitative evidence of the temperature dependence of the EV battery pack impedance. The calculated DCR values over the 12-second pulse discharge at 50% SOC are plotted against time for the four test temperatures. The resistance values at a stable period (e.g., at t=10s) starkly illustrate the impact:

Table 5: DC Internal Resistance (at t=10s, SOC=50%) vs. Ambient Temperature
Ambient Temperature (°C) DC Internal Resistance (mΩ) Ratio Relative to 40°C Resistance
40 51.0 1.00
25 53.1 1.04
0 ~120 (estimated from curve) ~2.35
–20 213.0 4.18

The DCR at –20°C is more than four times higher than at 40°C. This exponential-like increase with decreasing temperature is attributed to several physicochemical factors within the lithium-ion cells: increased viscosity of the electrolyte, slowed diffusion of lithium ions in the electrode materials and electrolyte, and decreased charge transfer kinetics at the electrode-electrolyte interfaces. All these factors contribute to a rise in both ohmic resistance and activation polarization resistance. The shape of the DCR vs. time curve also differs. At low temperatures (0°C and –20°C), there is a very sharp increase in the first 1-2 seconds, after which the resistance stabilizes at a high value. This initial spike represents the immediate voltage drop under load, dominated by ohmic resistance. At higher temperatures (25°C and 40°C), the increase is more gradual and linear over time, reflecting a different balance between ohmic and slow polarization processes. This dynamic behavior is critical for designing battery management systems (BMS) that perform accurate state-of-health (SOH) and state-of-power (SOP) estimations for the EV battery pack.

To understand the long-term aging effects, a comparative analysis was conducted between a new, unused EV battery pack (the test sample) and a pack extracted from a durability test vehicle that had accumulated 40,000 kilometers of simulated driving. Both packs were identical in specification. The discharge capacity test at 25°C yielded 188.155 Ah for the aged pack, compared to 193.436 Ah for the new sample. This represents a capacity fade of approximately 2.7% relative to the new sample and about 3.5% relative to the nominal capacity. Projecting this degradation trend linearly (though aging is often non-linear) to the point where capacity reaches 70% of nominal suggests a potential service life exceeding 340,000 kilometers or 17 years, far surpassing typical automotive lifecycle targets. This demonstrates the inherent durability of a well-designed high-energy EV battery pack.
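The linear projection quoted above can be reproduced in a few lines. It is a first-order estimate only: real aging is typically non-linear and often accelerates late in life, so the figure is optimistic.

```python
# Linear end-of-life projection from the single aged-pack data point above.
NOMINAL_AH = 195.0
EOL_FRACTION = 0.70            # end of life defined at 70% of nominal

km_driven = 40_000
aged_capacity_ah = 188.155
fade_fraction = (NOMINAL_AH - aged_capacity_ah) / NOMINAL_AH  # ~3.5%

fade_per_km = fade_fraction / km_driven
km_to_eol = (1.0 - EOL_FRACTION) / fade_per_km
print(f"fade so far: {fade_fraction:.1%}; "
      f"projected life: {km_to_eol:,.0f} km")
```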

More revealing is the change in internal resistance. Under identical test conditions (25°C, 50% SOC), the DCR of the aged EV battery pack was measured at 55.372 mΩ, compared to 53.112 mΩ for the new sample. This constitutes an increase of approximately 4.3%. This correlation between capacity loss and increased internal resistance is a hallmark of lithium-ion battery aging. Mechanisms such as solid electrolyte interface (SEI) layer growth, electrode particle cracking, and loss of active material contribute simultaneously to increased impedance and reduced lithium inventory, manifesting as higher DCR and lower capacity. Monitoring the DCR of an EV battery pack in service can therefore be a powerful prognostic tool for estimating remaining useful life (RUL).
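The resistance-growth figure follows directly from the two measurements, and the same ratio is the kind of simple prognostic metric a BMS could track over a pack's service life.

```python
# DCR growth after 40,000 km as a simple aging indicator,
# using the two 25 C, 50% SOC measurements reported above.
r_new_mohm = 53.112
r_aged_mohm = 55.372

growth_pct = 100.0 * (r_aged_mohm - r_new_mohm) / r_new_mohm
print(f"DCR growth after 40,000 km: {growth_pct:.1f}%")  # ~4.3%
```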

The comprehensive analysis leads to several overarching conclusions regarding the high-energy ternary lithium-ion EV battery pack. First, ambient temperature is a dominant external factor governing discharge performance. The usable capacity of the EV battery pack is maximized at moderate to high temperatures (25-40°C) and declines significantly as temperature drops, with a particularly severe reduction below 0°C. This is the fundamental technical reason behind the reduced driving range of EVs in cold weather. Second, the thermal behavior during discharge is intrinsically linked to internal resistance. Low-temperature operation forces the EV battery pack to generate substantial internal heat due to high impedance, leading to large temperature rises during discharge. This creates complex challenges for thermal management systems, which must balance the need to warm the pack for performance with the need to cool it for safety and longevity.

Third, the DC internal resistance is a highly sensitive and informative parameter. It exhibits a strong inverse relationship with temperature, increasing by a factor of four or more from 40°C down to –20°C. This has direct implications for vehicle power delivery: a cold EV battery pack will have higher voltage sag under load, limiting available power and regenerative braking capability. Fourth, long-term cycling leads to concurrent degradation in both capacity and internal resistance. The observed increase in DCR with mileage, though modest in this early-life stage, confirms its utility as a key metric for battery state-of-health assessment in deployed EV battery pack systems.

From an application perspective, these findings underscore the importance of sophisticated battery thermal management systems (BTMS) that actively regulate the EV battery pack temperature within an optimal window. Pre-conditioning the battery while connected to the grid in cold climates can mitigate range loss and protect the pack. Furthermore, BMS algorithms must dynamically adjust range and power estimates based on real-time pack temperature and resistance measurements. For second-life applications, such as stationary energy storage, sorting and grading retired EV battery pack modules based on their DCR and capacity characteristics will be essential for building reliable and safe repurposed systems.

Future work could involve expanding the test matrix to include more SOC points for DCR characterization, investigating the effects of different discharge rates (C-rates) on temperature rise and capacity, and conducting detailed post-mortem analysis on aged cells to correlate measured electrical parameter shifts with physical degradation mechanisms. Integrating this empirical data into electrochemical and thermal models would further enhance the predictive capabilities for EV battery pack performance and lifespan under diverse operating scenarios. In conclusion, mastering the interplay between temperature, capacity, and internal resistance is fundamental to unlocking the full potential of electric mobility, ensuring that the EV battery pack remains a reliable and efficient cornerstone of the transportation energy transition.
