A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials

Kim D, Wang X, Vargas S, Zhong P, King DS, Inizan TJ, Cheng B. 2025. A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials. Journal of Chemical Theory and Computation. 21(24), 12709–12724.

Journal Article | Published | English

Scopus indexed
Author
Kim, Dongjin; Wang, Xiaoyu (ISTA); Vargas, Santiago; Zhong, Peichen; King, Daniel S.; Inizan, Theo Jaffrelot; Cheng, Bingqing (ISTA)

Corresponding author has ISTA affiliation

Abstract
Most current machine learning interatomic potentials (MLIPs) rely on short-range approximations, without explicit treatment of long-range electrostatics. To address this, we recently developed the Latent Ewald Summation (LES) method, which infers electrostatic interactions, polarization, and Born effective charges (BECs) solely by learning from energy and force training data. Here, we present LES as a standalone library, compatible with any short-range MLIP, and demonstrate its integration with methods such as MACE, NequIP, Allegro, CACE, CHGNet, and UMA. We benchmark LES-enhanced models on distinct systems, including bulk water, polar dipeptides, and gold dimer adsorption on defective substrates, and show that LES not only captures the correct electrostatics but also improves accuracy. Additionally, we scale LES to large and chemically diverse data by training MACELES-OFF on the SPICE set containing molecules and clusters, yielding a universal MLIP with electrostatics for organic systems, including biomolecules. MACELES-OFF is more accurate than its short-range counterpart (MACE-OFF) trained on the same data set, predicts dipoles and BECs reliably, and provides better descriptions of bulk liquids. By enabling efficient long-range electrostatics without directly training on electrical properties, LES paves the way for electrostatic foundation MLIPs.
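
To illustrate the idea described in the abstract, the sketch below shows, in schematic form, how a latent Ewald summation term can augment a short-range MLIP: the short-range model assigns latent per-atom charges from local environments, and a smooth reciprocal-space Ewald sum over those charges supplies the long-range electrostatic energy. This is a minimal illustration under assumed conventions (Gaussian-unit prefactor, parameter names, and the placeholder short-range model), not the published LES library API.

# Schematic sketch of the latent Ewald summation idea (not the published LES API):
# a reciprocal-space-only Ewald sum over latent per-atom charges, added on top of
# a short-range MLIP energy.
import numpy as np

def latent_ewald_energy(positions, charges, cell, sigma=1.0, k_cutoff=2.0 * np.pi):
    """Smooth reciprocal-space Ewald energy for latent charges (assumed form).

    positions : (N, 3) Cartesian coordinates
    charges   : (N,)   latent charges predicted by the short-range model
    cell      : (3, 3) periodic cell vectors as rows
    sigma     : Gaussian smearing width of the latent charges
    k_cutoff  : reciprocal-space cutoff, |k| < k_cutoff
    """
    volume = np.abs(np.linalg.det(cell))
    recip = 2.0 * np.pi * np.linalg.inv(cell).T          # reciprocal lattice vectors (rows)
    nmax = int(np.ceil(k_cutoff / np.min(np.linalg.norm(recip, axis=1))))

    energy = 0.0
    for n1 in range(-nmax, nmax + 1):
        for n2 in range(-nmax, nmax + 1):
            for n3 in range(-nmax, nmax + 1):
                if n1 == n2 == n3 == 0:
                    continue
                k = n1 * recip[0] + n2 * recip[1] + n3 * recip[2]
                k2 = np.dot(k, k)
                if k2 > k_cutoff ** 2:
                    continue
                # structure factor S(k) = sum_i q_i exp(i k . r_i)
                s_k = np.sum(charges * np.exp(1j * positions @ k))
                energy += np.exp(-0.5 * sigma ** 2 * k2) / k2 * np.abs(s_k) ** 2
    return 2.0 * np.pi / volume * energy

# Usage sketch: total energy = short-range MLIP energy + latent long-range term.
# `short_range_model` and its predict_* methods are hypothetical placeholders.
# energy_total = short_range_model.predict_energy(atoms) + latent_ewald_energy(
#     atoms.positions, short_range_model.predict_latent_charges(atoms), atoms.cell)

In an actual LES-augmented model the latent charges are learned end to end from energies and forces, so no reference charges or electrical properties are needed in the training data.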
Publishing Year
2025
Date Published
2025-12-10
Journal Title
Journal of Chemical Theory and Computation
Publisher
American Chemical Society
Acknowledgement
Research reported in this publication was supported by the National Institute of General Medical Sciences of the National Institutes of Health under Award Number R35GM159986. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. D.K. and B.C. acknowledge funding from the Toyota Research Institute Synthesis Advanced Research Challenge. T.J.I., D.S.K., and P.Z. acknowledge funding from the BIDMaP Postdoctoral Fellowship. T.J.I. used resources of the National Energy Research Scientific Computing Center (NERSC), a Department of Energy Office of Science User Facility, under NERSC award DOE-ERCAP0031751 'GenAI@NERSC'. The authors thank Bowen Deng for valuable discussions on the MatGL implementation, and Gabor Csanyi for stimulating discussions.
Volume
21
Issue
24
Page
12709-12724

Cite this

Kim D, Wang X, Vargas S, et al. A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials. Journal of Chemical Theory and Computation. 2025;21(24):12709-12724. doi:10.1021/acs.jctc.5c01400
Kim, D., Wang, X., Vargas, S., Zhong, P., King, D. S., Inizan, T. J., & Cheng, B. (2025). A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials. Journal of Chemical Theory and Computation. American Chemical Society. https://doi.org/10.1021/acs.jctc.5c01400
Kim, Dongjin, Xiaoyu Wang, Santiago Vargas, Peichen Zhong, Daniel S. King, Theo Jaffrelot Inizan, and Bingqing Cheng. “A Universal Augmentation Framework for Long-Range Electrostatics in Machine Learning Interatomic Potentials.” Journal of Chemical Theory and Computation. American Chemical Society, 2025. https://doi.org/10.1021/acs.jctc.5c01400.
D. Kim et al., “A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials,” Journal of Chemical Theory and Computation, vol. 21, no. 24. American Chemical Society, pp. 12709–12724, 2025.
Kim D, Wang X, Vargas S, Zhong P, King DS, Inizan TJ, Cheng B. 2025. A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials. Journal of Chemical Theory and Computation. 21(24), 12709–12724.
Kim, Dongjin, et al. “A Universal Augmentation Framework for Long-Range Electrostatics in Machine Learning Interatomic Potentials.” Journal of Chemical Theory and Computation, vol. 21, no. 24, American Chemical Society, 2025, pp. 12709–24, doi:10.1021/acs.jctc.5c01400.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]

Link(s) to Main File(s)
Access Level
OA Open Access

Sources

PMID: 41368735
PubMed | Europe PMC

arXiv 2507.14302