Efficient prediction of potential energy surface and physical properties with Kolmogorov-Arnold Networks

Rui Wang , Hongyu Yu , Yang Zhong , Hongjun Xiang

Journal of Materials Informatics ›› 2024, Vol. 4 ›› Issue (4): 32. DOI: 10.20517/jmi.2024.46

Research Article

Abstract

The application of machine learning (ML) methods for predicting potential energy surfaces and physical properties in materials science has garnered significant attention. Among recent advancements, Kolmogorov-Arnold Networks (KANs) have emerged as a promising alternative to traditional Multi-Layer Perceptrons (MLPs). This study evaluates the impact of substituting MLPs with KANs within four established ML frameworks: Allegro, Neural Equivariant Interatomic Potentials (NequIP), the Higher Order Equivariant Message Passing Neural Network (MACE), and the Edge-Based Tensor Prediction Graph Neural Network. Our results demonstrate that integrating KANs enhances prediction accuracy, especially for complex datasets such as the HfO2 structures. Notably, using KANs exclusively in the output block yields the largest accuracy gains while enabling faster inference and better computational efficiency than employing KANs throughout the entire model. The optimal choice of basis functions for KANs depends on the specific problem. These results demonstrate the strong potential of KANs for enhancing machine learning potentials and material property predictions, and the proposed methodology offers a generalizable framework applicable to other ML architectures.
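The core idea behind the substitution described above is that a KAN layer replaces an MLP's fixed activations with a learnable one-dimensional function on every input-output edge, each expanded in a chosen basis. The following is a minimal, illustrative sketch of one such layer, not the authors' implementation: the Chebyshev basis, the array shapes, and the function name `kan_layer` are all assumptions made for clarity.

```python
import numpy as np

def kan_layer(x, coeffs, degree=4):
    """One Kolmogorov-Arnold layer (illustrative sketch, not the paper's code).

    Every input-output edge carries its own learnable 1-D function,
    here expanded in a Chebyshev polynomial basis T_0..T_degree.

    x      : (batch, d_in) inputs, assumed scaled into [-1, 1]
    coeffs : (d_in, d_out, degree + 1) per-edge basis coefficients
    returns: (batch, d_out), the sum over inputs of the edge functions
    """
    # Chebyshev recurrence: T_0 = 1, T_1 = x, T_k = 2 x T_{k-1} - T_{k-2}
    T = [np.ones_like(x), x]
    for _ in range(2, degree + 1):
        T.append(2.0 * x * T[-1] - T[-2])
    basis = np.stack(T, axis=-1)          # (batch, d_in, degree + 1)
    # phi[b, o] = sum_i sum_k coeffs[i, o, k] * T_k(x[b, i])
    return np.einsum("bik,iok->bo", basis, coeffs)

rng = np.random.default_rng(0)
x = np.tanh(rng.normal(size=(8, 3)))      # squash inputs into [-1, 1]
coeffs = 0.1 * rng.normal(size=(3, 2, 5))
y = kan_layer(x, coeffs)
print(y.shape)  # (8, 2)
```

In a real model the coefficients would be trained by gradient descent, and other bases (B-splines, radial basis functions, wavelets, sinusoids) can be swapped in; as the abstract notes, which basis works best is problem-dependent.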

Keywords

Machine learning / property prediction / Kolmogorov-Arnold Networks

Cite this article

Rui Wang, Hongyu Yu, Yang Zhong, Hongjun Xiang. Efficient prediction of potential energy surface and physical properties with Kolmogorov-Arnold Networks. Journal of Materials Informatics, 2024, 4(4): 32. DOI: 10.20517/jmi.2024.46


