{"publisher":"American Chemical Society","corr_author":"1","_id":"20926","external_id":{"pmid":["41368735 "],"arxiv":["2507.14302"]},"day":"10","OA_type":"green","scopus_import":"1","oa_version":"Preprint","date_updated":"2026-01-05T11:34:21Z","citation":{"short":"D. Kim, X. Wang, S. Vargas, P. Zhong, D.S. King, T.J. Inizan, B. Cheng, Journal of Chemical Theory and Computation 21 (2025) 12709–12724.","ista":"Kim D, Wang X, Vargas S, Zhong P, King DS, Inizan TJ, Cheng B. 2025. A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials. Journal of Chemical Theory and Computation. 21(24), 12709–12724.","apa":"Kim, D., Wang, X., Vargas, S., Zhong, P., King, D. S., Inizan, T. J., & Cheng, B. (2025). A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials. Journal of Chemical Theory and Computation. American Chemical Society. https://doi.org/10.1021/acs.jctc.5c01400","chicago":"Kim, Dongjin, Xiaoyu Wang, Santiago Vargas, Peichen Zhong, Daniel S. King, Theo Jaffrelot Inizan, and Bingqing Cheng. “A Universal Augmentation Framework for Long-Range Electrostatics in Machine Learning Interatomic Potentials.” Journal of Chemical Theory and Computation. American Chemical Society, 2025. https://doi.org/10.1021/acs.jctc.5c01400.","mla":"Kim, Dongjin, et al. “A Universal Augmentation Framework for Long-Range Electrostatics in Machine Learning Interatomic Potentials.” Journal of Chemical Theory and Computation, vol. 21, no. 24, American Chemical Society, 2025, pp. 12709–24, doi:10.1021/acs.jctc.5c01400.","ama":"Kim D, Wang X, Vargas S, et al. A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials. Journal of Chemical Theory and Computation. 2025;21(24):12709-12724. doi:10.1021/acs.jctc.5c01400","ieee":"D. Kim et al., “A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials,” Journal of Chemical Theory and Computation, vol. 21, no. 24. American Chemical Society, pp. 12709–12724, 2025."},"year":"2025","status":"public","main_file_link":[{"open_access":"1","url":"https://doi.org/10.48550/arXiv.2507.14302"}],"date_created":"2026-01-04T23:01:33Z","volume":21,"arxiv":1,"oa":1,"page":"12709-12724","pmid":1,"department":[{"_id":"GradSch"},{"_id":"BiCh"}],"month":"12","title":"A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","article_type":"original","article_processing_charge":"No","type":"journal_article","quality_controlled":"1","publication_identifier":{"eissn":["1549-9626"],"issn":["1549-9618"]},"abstract":[{"text":"Most current machine learning interatomic potentials (MLIPs) rely on short-range approximations, without explicit treatment of long-range electrostatics. To address this, we recently developed the Latent Ewald Summation (LES) method, which infers electrostatic interactions, polarization, and Born effective charges (BECs), just by learning from energy and force training data. Here, we present LES as a standalone library, compatible with any short-range MLIP, and demonstrate its integration with methods such as MACE, NequIP, Allegro, CACE, CHGNet, and UMA. We benchmark LES-enhanced models on distinct systems, including bulk water, polar dipeptides, and gold dimer adsorption on defective substrates, and show that LES not only captures correct electrostatics but also improves accuracy. 
Additionally, we scale LES to large and chemically diverse data by training MACELES-OFF on the SPICE set containing molecules and clusters, making a universal MLIP with electrostatics for organic systems, including biomolecules. MACELES-OFF is more accurate than its short-range counterpart (MACE-OFF) trained on the same data set, predicts dipoles and BECs reliably, and has better descriptions of bulk liquids. By enabling efficient long-range electrostatics without directly training on electrical properties, LES paves the way for electrostatic foundation MLIPs.","lang":"eng"}],"language":[{"iso":"eng"}],"issue":"24","intvolume":" 21","doi":"10.1021/acs.jctc.5c01400","acknowledgement":"Research reported in this publication was supported by the National Institute Of General Medical Sciences of the National Institutes of Health under Award Number R35GM159986. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. D.K. and B.C. acknowledge funding from Toyota Research Institute Synthesis Advanced Research Challenge. T.J.I., D.S.K. and P.Z. acknowledge funding from BIDMaP Postdoctoral Fellowship. T.J.I. used resources of the National Energy Research Scientific Computing Center (NERSC), a Department of Energy Office of Science User Facility using NERSC award DOEERCAP0031751 ′GenAI@NERSC’. The authors thank Bowen Deng for valuable discussions on MatGL implementation, and thank Gabor Csanyi for stimulating discussions.","OA_place":"repository","publication_status":"published","author":[{"first_name":"Dongjin","last_name":"Kim","full_name":"Kim, Dongjin"},{"id":"8dff9c62-32b0-11ee-9fa8-fc73025e10f3","last_name":"Wang","first_name":"Xiaoyu","full_name":"Wang, Xiaoyu"},{"last_name":"Vargas","first_name":"Santiago","full_name":"Vargas, Santiago"},{"full_name":"Zhong, Peichen","first_name":"Peichen","last_name":"Zhong"},{"first_name":"Daniel S.","last_name":"King","full_name":"King, Daniel S."},{"last_name":"Inizan","first_name":"Theo Jaffrelot","full_name":"Inizan, Theo Jaffrelot"},{"full_name":"Cheng, Bingqing","last_name":"Cheng","id":"cbe3cda4-d82c-11eb-8dc7-8ff94289fcc9","first_name":"Bingqing","orcid":"0000-0002-3584-9632"}],"publication":"Journal of Chemical Theory and Computation","date_published":"2025-12-10T00:00:00Z"}