You may have seen my recent blogs on Lithium batteries. When it comes to Lithium batteries, the most important design aspect (and also the most frequently overlooked) is battery management. Lithium cells, like other rechargeable chemistries, require some intelligence while charging. Unlike other chemistries, however, they are much less forgiving of improper charging (risking internal damage or leakage). Maintaining proper cell voltage, temperature, and charging current is critical to ensure that the health of the cell is maintained.
It is imperative that Lithium batteries be maintained within their specified voltage range. Lithium Iron Phosphate (LiFePO4) chemistries typically want to be operated between 2.0 and 4.0 volts. If the cell is allowed to run down below this range, or is charged above it, cell damage may occur. This could come, at the minimum, in the form of diminished capacity, or, at the maximum, in a thermal event that could result in venting, physical damage, or collateral damage to surrounding items. Every chemistry is different, and some of the more volatile chemistries can produce a much larger thermal event similar to what Boeing and others have faced. Typically, an under-voltage event will not cause anything other than a loss of capacity, but over-voltage usually results in over-temperature (if not otherwise controlled), which can lead to thermal runaway and physical damage.
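The voltage-window rule above can be sketched as a simple protective check. This is a hypothetical illustration using the 2.0 V and 4.0 V LiFePO4 figures from the text; real cutoffs must come from your cell's datasheet, and the function names are my own.

```python
# Illustrative voltage-window check for a LiFePO4 cell.
# The 2.0 V / 4.0 V limits are the example figures from the article;
# always substitute the manufacturer's specified min/max.

LIFEPO4_MIN_V = 2.0  # discharge cutoff
LIFEPO4_MAX_V = 4.0  # charge cutoff

def voltage_action(cell_voltage: float) -> str:
    """Return the protective action for a measured cell voltage."""
    if cell_voltage < LIFEPO4_MIN_V:
        return "disable discharge"  # under-voltage: mainly capacity loss
    if cell_voltage > LIFEPO4_MAX_V:
        return "disable charge"     # over-voltage: overheating/runaway risk
    return "ok"

print(voltage_action(3.3))  # → ok
```

In a real system this check would run continuously per cell, with the "disable" actions wired to the charge and discharge FETs rather than returned as strings.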
Temperature monitoring is crucial. Temperature is often the only outward physical indication that something is going wrong inside the cell. The following chart, taken from Electropedia (https://www.mpoweruk.com/lithium_failures.htm), provides a great overview of what can happen when operating at unsafe temperatures:
Some applications may require active cooling (common now in the EV market). Use at higher currents creates heat, in what might already be a warm environment. Regardless of whether there is cooling, monitoring is an absolute must, coupled with controls to be able to disable use or charging when cells begin to reach an unsafe temperature. This is not just to avoid damage to a cell, but to maintain the safety of the system.
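A temperature gate of the kind described above might look like the sketch below. The 45 °C charge limit and 60 °C discharge limit are illustrative placeholders only (charge limits are usually tighter than discharge limits); your manufacturer's specifications govern the real values.

```python
# Hypothetical temperature gate: disable charging and/or discharging
# as cells approach unsafe temperatures. Limits are placeholders,
# NOT datasheet values.

MAX_CHARGE_TEMP_C = 45.0     # assumed: charging limits are typically tighter
MAX_DISCHARGE_TEMP_C = 60.0  # assumed discharge limit

def allowed_modes(cell_temp_c: float) -> set:
    """Return which operating modes are safe at this cell temperature."""
    modes = set()
    if cell_temp_c < MAX_CHARGE_TEMP_C:
        modes.add("charge")
    if cell_temp_c < MAX_DISCHARGE_TEMP_C:
        modes.add("discharge")
    return modes

print(allowed_modes(50.0))  # discharge only: too warm to charge
```

The key design point is that the gate disables *use* as well as charging, since the goal is system safety, not just cell longevity.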
As you may suspect, charging current needs to be monitored (and managed) as well. Charge requirements vary by chemistry and manufacturer, but generally Lithium cells are charged initially at a constant current; then, as the target "charged" voltage is reached, the charge current is dropped. Most Lithium cells can handle a charge rate of ½ C (or more), where C refers to the capacity of the cell in Amp-hours (Ah). A 2 Ah cell could then be charged initially at a rate of 1 A, falling back to a rate of about 0.2 A as the target voltage is reached. Max charge currents vary quite a bit, but exceeding the max recommended current, or operating at the high side of the recommended range, generally results in excess heat.
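The constant-current-then-taper profile described above can be sketched as follows, using the article's 2 Ah example cell. The 3.65 V termination voltage is an assumed, commonly quoted LiFePO4 charge voltage, not a figure from the text; the step from 1 A to 0.2 A is a deliberate simplification of the gradual CV-phase taper a real charger performs.

```python
# Simplified CC/CV charge profile for the article's 2 Ah example cell.
# 3.65 V is an assumed LiFePO4 charge-termination voltage; use the
# datasheet value for your cell.

CAPACITY_AH = 2.0
TARGET_V = 3.65                     # assumed per-cell charge voltage
CC_CURRENT_A = 0.5 * CAPACITY_AH    # 0.5 C → 1 A initial charge
TAPER_A = 0.1 * CAPACITY_AH         # ~0.2 A near termination, as in the text

def charge_current(cell_voltage: float) -> float:
    """Constant current until the target voltage, then reduced current."""
    if cell_voltage < TARGET_V:
        return CC_CURRENT_A  # CC phase: bulk of the charge
    return TAPER_A           # near target: taper toward termination

print(charge_current(3.20))  # → 1.0 (CC phase)
print(charge_current(3.65))  # → 0.2 (taper)
```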
Charge controller ICs (for one or a few cells) or full-blown battery management systems, or BMSs (for larger arrays), are employed to manage charging, voltage and temperature monitoring, and charge status. In some cases these incorporate shunts that allow cells in a series array to divert charge current around themselves once they are full (to better balance the pack). As you may know, in a series array of cells your total charge capacity (in Ah) is only as great as that of the weakest cell. If you continue to draw current from a series array past what that weakest cell can deliver, you may damage it.
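The "weakest cell" rule is simple enough to state in code. This hypothetical sketch just makes the arithmetic explicit: the usable capacity of a series string is the minimum of its cells' capacities, which is exactly why BMS balancing shunts are worth having.

```python
# Illustrative "weakest cell" rule for a series pack: usable capacity
# (in Ah) is limited by the lowest-capacity cell in the string.

def pack_capacity_ah(cell_capacities_ah: list) -> float:
    """Usable Ah of a series string, set by its weakest cell."""
    return min(cell_capacities_ah)

# Three nominally 2 Ah cells, one of which has aged to 1.8 Ah:
print(pack_capacity_ah([2.0, 1.8, 2.1]))  # → 1.8
```

Balancing cannot raise this limit, but it keeps the stronger cells from overcharging while the weak one finishes, so the pack actually delivers the capacity the weakest cell has.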
Before developing (or choosing) a battery management scheme, check to see what is available and what is typically used for your technology or your application. Also, as with any technology, consult your supplier or manufacturer for the proper technical specifications and min/max operating parameters. Your particular component or application may have different requirements or special caveats.