Battery Fuel Gauge: Factual or Fallacy?

Many perceive a battery as an energy storage device similar to a fuel tank dispensing liquid fuel. For simplicity, a battery can be seen as such; however, measuring stored energy in an electrochemical device is far more complex. The process is poorly understood and fraught with confusion, and this article describes the challenges of measuring the energy in a battery.

Before looking deeper into the fuel gauge concept, we define state-of-charge (SoC) as the relative stored energy in a battery that can be released under prevailing conditions. The prevailing conditions are mostly unknown to the battery user; besides SoC, they include the actual battery capacity, the load current and the operating temperature. State-of-function (SoF), the all-encompassing criterion that includes SoC, capacity and delivery, is difficult to measure and remains mostly guesswork. Considering these limitations, one can appreciate why most battery fuel gauges are inaccurate.

Unlike a fuel tank with a known volumetric dimension, the fuel gauge of a battery rests on unconfirmed definitions. Other than the open circuit voltage (OCV), which only approximates SoC, a battery has no fundamental internal parameter that relates directly to SoC. The Ah rating the manufacturer specifies applies only for the short time when the battery is new. In essence, a battery is a shrinking vessel that takes less energy with each subsequent charge, and the stated Ah rating is only a reference to what the battery should be holding. A battery is not an energy container that guarantees a given amount of energy under all conditions; it exhibits an almost human quality, delivering according to prevailing circumstances.

A common error in fuel gauge design is to ignore aging by assuming that the battery will stay perfect. Such an oversight limits reliable service to about two years before the readings become inaccurate. The scaling of most fuel gauges is analogous to liquid fuel: full charge indicates 100 percent and empty is zero percent. Zero is the point when the battery reaches the low-voltage knee at the end of discharge.

Discharging a battery rated at 1Ah should provide a current of 1A for one hour. This only holds true while the battery is new and discharged at room temperature. If the capacity shrinks to 50%, the fuel gauge of a fully charged battery will still show 100% but the expected one-hour runtime is cut to 30 minutes. Running the battery below freezing shortens the time further. For the casual cellphone or laptop user this error causes only inconvenience; the problem becomes more serious with electric vehicles and other critical battery-operated devices that depend on the remaining runtime to reach the destination.
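The arithmetic behind this example can be sketched in a few lines of Python; the function name and parameters are illustrative, not a real gauge API, and temperature effects are ignored:

```python
def runtime_minutes(rated_ah, capacity_fraction, load_a):
    """Expected runtime at a constant load, at room temperature.

    rated_ah: nameplate capacity in Ah when the battery was new
    capacity_fraction: remaining usable capacity (1.0 = new)
    load_a: constant discharge current in amperes
    """
    usable_ah = rated_ah * capacity_fraction
    return usable_ah / load_a * 60.0

# A new 1Ah battery delivers 1A for 60 minutes; faded to 50% capacity,
# the gauge may still read 100% full but the runtime halves to 30 minutes.
```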

Modern fuel gauges adapt to prevailing conditions by “learning” how much energy the battery was able to deliver on the previous discharge. Learning, or trending, may also include charge time because a faded battery charges quicker than a good one. It is also common to measure the internal battery resistance by observing the voltage drop; however, capacity estimation based on rising resistance no longer works well because the modern Li-ion maintains low resistance through most of its service life.
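One simple way to implement such learning, assuming the gauge can record the capacity delivered on each full discharge, is to blend the latest measurement into a running estimate. The smoothing weight below is an illustrative assumption, not a value from any particular gauge:

```python
def update_learned_capacity(learned_ah, delivered_ah, weight=0.25):
    """Blend the capacity delivered on the last full discharge into the
    running estimate; a higher weight trusts the newest discharge more."""
    return (1.0 - weight) * learned_ah + weight * delivered_ah

# A gauge that believed the battery held 1.0Ah, after observing a full
# discharge that delivered only 0.8Ah, moves its estimate part-way down.
```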

Capacity is best measured by discharging a fully charged battery at a constant current and reading the elapsed time. Most rechargeable batteries for portable use are specified at 1C discharge. A battery rated at 1Ah would therefore discharge at 1A. The rated discharge of primary cells, such as alkaline, is much lower. Measuring battery capacity by discharge/charge is impractical and stresses the battery.
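The C-rate convention and the capacity reading from such a discharge test reduce to simple arithmetic; this sketch assumes a constant-current discharge of a fully charged battery:

```python
def c_rate_current(rated_ah, c_rate=1.0):
    """Discharge current for a given C-rate; 1C on a 1Ah battery is 1A."""
    return rated_ah * c_rate

def measured_capacity_ah(discharge_a, elapsed_h):
    """Capacity read from a constant-current discharge: current multiplied
    by the elapsed time until the end-of-discharge voltage is reached."""
    return discharge_a * elapsed_h
```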

Calibration

The fuel gauge has the inherent drawback of needing periodic calibration, also known as capacity re-learning. This is done to correct the tracking error that develops between the chemical and digital battery on repeated charge and discharge cycles. Calibration could be omitted if the battery received a periodic full discharge at constant current followed by a full charge. The battery would reset with each full cycle and the tracking error would be kept at less than one percent per cycle. In real life, however, a battery may be discharged for a few minutes with a load signature that is difficult to capture, then partially recharged and stored with varying levels of self-discharge depending on temperature and age.

Manual calibration is possible by running the battery down until “Low Battery” appears. This can be done in the equipment or with a battery analyzer. A full discharge sets the discharge flag and the subsequent recharge sets the charge flag. Establishing these two markers allows SoC to be calculated by tracking the distance between the flags. For best results, calibrate a device in continuous use every three months or after 40 partial cycles. If the device applies a periodic deep discharge of its own accord, no additional calibration is required. Figure 1 shows the full-discharge and full-charge flags.
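The flag mechanism can be sketched as follows; the class and method names are illustrative, not a real gauge API, and the sketch assumes the charge accepted during the full charge can be measured:

```python
class FlagCalibratedGauge:
    """Flag-based calibration: a full discharge sets the empty marker,
    the next full charge sets the full marker, and SoC is the counted
    charge relative to the distance between the two flags."""

    def __init__(self, rated_ah):
        self.learned_ah = rated_ah   # distance between the two flags
        self.counted_ah = rated_ah   # assume the battery starts full

    def full_discharge(self):
        self.counted_ah = 0.0        # discharge flag: empty marker set

    def full_charge(self, charge_accepted_ah):
        # Charge flag: the charge accepted on a full charge after a full
        # discharge re-learns the actual capacity of the aged battery.
        self.learned_ah = charge_accepted_ah
        self.counted_ah = charge_accepted_ah

    def soc_percent(self):
        return 100.0 * self.counted_ah / self.learned_ah
```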


Figure 1: Full-discharge and full-charge flags set calibration

Calibration occurs by applying a full charge, discharge and charge. This can be done in the equipment or with a battery analyzer as part of battery maintenance.

Courtesy Cadex

What happens if the battery is not calibrated regularly? Can such a battery be used with confidence? Most smart battery chargers obey the dictates of the chemical battery rather than the electronic circuit and there are no safety concerns if out of calibration. The battery will charge fully and function normally but the digital readout may be inaccurate and become a nuisance.

Measure State-of-charge

Measuring state-of-charge by voltage is simple, but it can be inaccurate, and here is why. Batteries within a given chemistry have dissimilar architectures and deliver unique voltage profiles. Temperature also plays a role: heat raises the voltage and a cold ambient lowers it. This phenomenon applies to all chemistries in varying degrees. Furthermore, when a battery is disturbed with a charge or discharge, the open circuit voltage no longer represents the true SoC, and the battery requires a few hours of rest to regain equilibrium; battery manufacturers recommend 24 hours. While lead acid batteries show a gradual voltage drop on discharge, nickel- and lithium-based batteries have a flat profile, making SoC estimation by voltage difficult. Consumer products using voltage-based fuel gauges therefore limit the readout to full charge, mid-range and low charge.
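Such a coarse three-band readout might look like the sketch below. The 70 and 30 percent thresholds are illustrative assumptions, not chemistry-specific values, and a rested open circuit voltage is assumed:

```python
def coarse_soc_band(ocv, full_v, empty_v):
    """Three-band readout: voltage alone cannot resolve SoC on a flat
    discharge curve, so show only full, mid-range or low."""
    span = full_v - empty_v
    if ocv >= empty_v + 0.7 * span:
        return "full"
    if ocv >= empty_v + 0.3 * span:
        return "mid-range"
    return "low"
```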

A more advanced method to measure SoC is coulomb counting. The theory goes back more than 200 years to Charles-Augustin de Coulomb, who first established the “Coulomb Rule.” It works on the principle of measuring the in- and out-flowing current. Figure 2 illustrates the principle graphically.


Figure 2: Principle of a fuel gauge based on coulomb counting

The stored energy represents state-of-charge; a circuit measures the in-and-out flowing current.

Courtesy of Cadex

Coulomb counting experiences errors as well. If a battery is charged for one hour at one ampere, for example, the same amount of energy should be available on discharge. This is not the case. Inefficiencies in charge acceptance, especially towards the end of charge, as well as losses during discharge and storage, reduce the total energy delivered and skew the readings. The available energy is always less than what was fed into the battery. The energy cycle (charging and then discharging) of the Li-ion batteries in the Tesla Roadster, for example, is about 86 percent efficient.
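A minimal coulomb counter can be sketched as below. Applying the 86 percent round-trip figure cited above entirely to the charge leg is a simplification for illustration; real gauges split the losses between charge, discharge and storage:

```python
def coulomb_count(samples_a, dt_s, rated_ah, charge_efficiency=0.86,
                  soc_ah=0.0):
    """Integrate current samples into a charge estimate in Ah.

    samples_a: current readings in amperes (positive = charging)
    dt_s: sampling interval in seconds
    charge_efficiency: fraction of the charge fed in that is later
        recoverable; 0.86 echoes the round-trip figure cited above
    """
    for i_a in samples_a:
        if i_a > 0.0:
            soc_ah += i_a * charge_efficiency * dt_s / 3600.0
        else:
            soc_ah += i_a * dt_s / 3600.0
    return max(0.0, min(soc_ah, rated_ah))

# Charging a 1Ah battery at 1A for one hour books only 0.86Ah of
# recoverable charge; discharging at 1A for 30 minutes removes 0.5Ah.
```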

As with any numeric integration technique, coulomb counting accumulates error over time, which the modern fuel gauge tries to correct using voltage curves. Since these voltage curves harbor inaccuracies themselves, especially as the battery ages, the accuracy will continue to degrade over time.

The Adaptive System on Diffusion (ASOD) by Cadex features a unique “Learn” function that adjusts to battery aging and achieves a capacity estimation of +/-2% across 1,000 battery cycles, the typical life span of a battery. The SoC estimation is within +/-5%, independent of age and polarization. ASOD does not require outside parameters; it is self-learning. When replacing the battery, the learned matrix will gradually adapt to the new battery through use and achieve the previous high accuracy again.

Researchers are exploring new methods to measure battery SoC, and one such innovative technology is quantum magnetism (Q-Mag™). Q-Mag by Cadex does not rely on voltage or current flow; it looks at magnetism. The negative plate of a discharging lead acid battery changes from lead to lead sulfate, which has a different magnetic susceptibility than lead. A sensor based on a quantum mechanical process reads the magnetic field through a process called tunneling. Figure 3 compares the magnetic field response of a fully charged battery to one that is 20 percent charged. A low battery has a three-fold increase in magnetic susceptibility compared to one that is fully charged.


Figure 3: SoC by magnetic field response

The permeability of the plates increases by a factor of 3 from full charge to empty. TMR stands for Tunneling Magneto Resistance, also known as a magnetic tunneling junction (MTJ).

 

It is conceivable that a new technology has been found that can measure battery SoC with an accuracy not imaginable before. Knowing the precise intrinsic SoC enables improved chargers, but more importantly, the technology provides battery diagnostics, including capacity estimation and end-of-life prediction. The immediate benefit, however, lies in building a better and more accurate fuel gauge.

Li-ion, including lithium iron phosphate, has a very flat discharge curve. Figure 4 shows a linear drop of the relative magnetic field units on discharge and a corresponding rise on charge when monitored with Q-Mag. There is no rubber-band effect associated with the voltage method, in which discharge lowers the voltage and charge raises it. Q-Mag takes readings while the battery is being charged or is under load. The SoC accuracy with Li-ion is +/-5% and with lead acid +/-7%; calibration occurs by applying a full charge. The excitation current is less than 1mA, and the system is immune to most interference. Q-Mag works with cells encased in foil, aluminum or stainless steel, but not in ferrous metals. The tests are conducted in the laboratories of Cadex Electronics Inc.
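Because the field response is described as linear, converting a reading to SoC reduces to interpolation between two calibrated endpoints. This sketch is an illustration of that idea, not Cadex's implementation; the endpoint values are assumptions established during the full-charge calibration:

```python
def soc_from_field(field_units, field_full, field_empty):
    """Map a relative magnetic field reading to SoC percent by linear
    interpolation between the full and empty endpoint readings."""
    frac = (field_units - field_empty) / (field_full - field_empty)
    return 100.0 * max(0.0, min(frac, 1.0))

# With illustrative endpoints of 3.0 (full) and 1.0 (empty) relative
# units, a reading of 2.0 sits halfway, i.e. 50% state-of-charge.
```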

Magnetic field measurements of a lithium iron phosphate during charge and discharge

Figure 4: Magnetic field measurements of a lithium iron phosphate during charge and discharge.

Relative magnetic field units provide accurate state-of-charge of lithium- and lead-based batteries.

 

Summary

SoC measurements consist of several readings; the most common are voltage, current and coulomb counting. While the accuracy of these systems is good enough for consumer products, where a false indication causes only mild annoyance, medical and military devices, as well as the electric vehicle, demand more. A motorist stranded by an unexpectedly empty battery would attract more media attention than a dropped call on a dead cell phone or a computer screen going dark too soon.

Although noticeable improvements have been made in SoC accuracy, further advances are needed, and innovative new technologies are promising. They will not only provide better SoC accuracy but also offer state-of-health and end-of-life prediction. Scientists predict that these developments will be available at competitive prices. With these forward-looking technologies in mind, the modern battery fuel gauge will no longer remain a fallacy but become factual.

Comments

On December 16, 2011 at 12:22pm
paul walling wrote:

“Discharging a battery rated at 1Ah should provide a current of 1A for one hour”

Hold on - that’s not true!

Battery capacity is rated on the basis of discharge over several hours - normally 10 or 20 hours. So while a 1AHr battery should provide 1AHr IF discharged over its rated period (say 10 hours at 100mA or 20Hrs at 50mA), it is most unlikely to be capable of providing a current of 1A for an hour.

It’s not like Battery University to get such basic stuff wrong - so hope you don’t mind me pointing out this little oopsie…

On December 16, 2011 at 2:17pm
Noud Vermeulen wrote:

Paul’s reaction is a bit premature, as he forgets to mention the sentence before his quote. The right quote should be: ‘Most rechargeable batteries for portable use are specified at 1C discharge. A battery rated at 1Ah would therefore discharge at 1A.’. And that is true.
Sure, lead-acid batteries are advertised with capacities based on C/20; but that is also a portion of the truth…......
Thanks for the article, it is very informative indeed.

On December 16, 2011 at 5:30pm
Pradeep Chandra Pant wrote:

IT WOULD BE OF IMMENSE HELP TO READERS IF BATTERY UNIVERSITY PROVIDES A TABLE MENTIONING THE NORMAL CONVENTION OF RATING THE CAPACITY OF EACH TYPE OF CELL I.E. CELL TYPE, RATE OF DISCHARGE (AS NORMAL CONVENTION), TEMPERATURE, CUT-OFF VOLTAGE AND CAPACITY.

On December 16, 2011 at 8:43pm
Delmar C. Reynolds wrote:

I am designing a large Solar / Wind Power Project at one of our large Hospitals. As you know, Batteries are the most important part of any Solar or Wind projects in the ability to store Electrical Energy. I would like for your company to provide me with Battery Info that we could use in our specifications.

Delmar C. Reynolds

E-Mail dcragnes@att.net

On December 17, 2011 at 3:54pm
Martin C Nation MD wrote:

Thank you for this article. This is a huge area of relevance in these times. I am intrigued with TMR. Where can I find out more information? It seems it would need to be built into batteries. Or could it be provided as a component for use in assembling batteries at least in research?

Thank you Battery U
Martin C Nation, MD

On December 18, 2011 at 8:36pm
Jonathan Weiss wrote:

Concerning the subtleties stated above related to the state-of-charge of, let us say, a lead-acid battery, would an automated measurement of the concentration of sulfuric acid in the electrolyte be of assistance? Suppose, further, that it is done in all the cells simultaneously?

On December 19, 2011 at 11:50am
Joseph Accetta wrote:

Please note that a direct state of charge sensor for lead acid batteries has been developed by JSA Photonics, www.jsaphotonics.com.

On December 23, 2011 at 3:18am
Juergen Pintaske wrote:

Thanks for the excellent article. It gets even more difficult if you use Li ION cells in series e.g. 4 cells for 14.4V and strings of these in parallel to get higher capacity. I assume then coulomb counting is the only option, at least to get a guidance. Any other option? And you still have no control over the weakest cell ? Are there 6800 voltage measurements to check all cells in the Tesla?

On February 12, 2012 at 1:04am
Rajnikant Jaipuriya wrote:

Dear sir good after noon
I want to know 2voltage 1500ah 24 cell Battery bank capicity on load 150amp
OR flot voltage min.& max. 
BOOST Charging vol. & Flot vol.

THANX & OBEDENTLY
RAJNIKANT JAIPURIYA

On April 13, 2014 at 9:55am
sathishkumar wrote:

thanks for valuable information
i am trying to understand the BMS sampling time for cell voltage and battery pack current. its not yet clear for me. how i can finalize the sampling time.

thanks in advance for your advice.
sathishkumar