
Neural posterior estimation for scalable and accurate inverse parameter inference in Li-ion batteries

Malik Hassanaly, Corey R. Randall, Peter J. Weddle, Paul J. Gasper, Conlain Kelly, Tanvir R. Tanim, Kandler Smith

Abstract

Diagnosing the internal state of Li-ion batteries is critical for battery research, operation of real-world systems, and prognostic evaluation of remaining lifetime. By using physics-based models to perform probabilistic parameter estimation via Bayesian calibration, diagnostics can account for the uncertainty due to model fitness, data noise, and the observability of any given parameter. However, Bayesian calibration in Li-ion batteries using electrochemical data is computationally intensive even when using a fast surrogate in place of physics-based models, requiring many thousands of model evaluations. A fully amortized alternative is neural posterior estimation (NPE). NPE shifts the computational burden from the parameter estimation step to data generation and model training, reducing the parameter estimation time from minutes to milliseconds, enabling real-time applications. The present work shows that NPE calibrates parameters equally or more accurately than Bayesian calibration, and we demonstrate that the higher computational costs for data generation are tractable even in high-dimensional cases (ranging from 6 to 27 estimated parameters), but the NPE method can lead to higher voltage prediction errors. The NPE method also offers several interpretability advantages over Bayesian calibration, such as local parameter sensitivity to specific regions of the voltage curve. The NPE method is demonstrated using an experimental fast charge dataset, with parameter estimates validated against measurements of loss of lithium inventory and loss of active material. The implementation is made available in a companion repository (https://github.com/NatLabRockies/BatFIT).

Paper Structure

This paper contains 36 sections, 13 equations, 11 figures, and 9 tables.

Figures (11)

  • Figure 1: Schematic illustration of the neural net architectures used for CNPE (left) and the model surrogate (right). "CNN" refers to convolutional layers, and "FCNN" refers to fully connected layers.
  • Figure 2: Training results of the discharge surrogate. Left: train (black) and test (black) loss histories. Right: discharge voltage curves obtained with different parameters $\theta$ with the SPM (blue) and the surrogate (red).
  • Figure 3: Train (black) and test (black) loss histories for CNPE trained on the discharge comparison dataset. NLL denotes the batch-averaged negative log-likelihood.
  • Figure 4: Left: example of near-matching posteriors between Bayesian calibration (red) and CNPE (blue). Right: example of posterior mismatch between Bayesian calibration (red) and CNPE (blue). Bayesian calibration done with a more accurate surrogate is also shown (green). Solid lines (black) denote the true values.
  • Figure 5: Conditional average of the relative error for each calibrated parameter evaluated on the discharge dataset (blue) and the charge dataset (red).
  • ...and 6 more figures