• Young Scholar Review •

Modeling Interatomic Interactions by Machine Learning

Wang Han1,2

  1. Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing 100094, China;
    2. Center for Applied Physics and Technology, College of Engineering, Peking University, Beijing 100871, China
  • Received: 2021-07-04  Online: 2021-08-15  Published: 2021-08-20
  • About the author: Wang Han is a distinguished research fellow and doctoral supervisor at the Institute of Applied Physics and Computational Mathematics. His main research interests are multiscale modeling and computational methods in molecular dynamics simulation. With collaborators, he developed deep-learning-based methods for modeling and computing interatomic interactions, pushing molecular dynamics simulation with first-principles accuracy to the scale of hundreds of millions of atoms. In 2019 he was supported by the Young Talent Sponsorship Program of the Beijing Association for Science and Technology and received the Youth Innovation Award of the Computational Mathematics Branch of the Chinese Mathematical Society. In 2020 he received the ACM Gordon Bell Prize, and the associated work was selected by academicians of the Chinese Academy of Sciences and the Chinese Academy of Engineering as one of China's Top Ten Scientific and Technological Advances of that year.
  • Funding:
    Supported by the National Natural Science Foundation of China (11871110).

Wang Han. Modeling Interatomic Interactions by Machine Learning[J]. 计算数学 (Mathematica Numerica Sinica), 2021, 43(3): 261-278.

Wang Han. MOLECULAR MODELING BY MACHINE LEARNING[J]. Mathematica Numerica Sinica, 2021, 43(3): 261-278.

MOLECULAR MODELING BY MACHINE LEARNING

Wang Han1,2   

  1. Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Fenghao East Road 2, Beijing 100094, China;
    2. HEDPS, CAPT, Peking University, Beijing 100871, China
  • Received: 2021-07-04  Online: 2021-08-15  Published: 2021-08-20
Modeling the interatomic potential is one of the central problems in molecular dynamics simulation. For a long time, the community has faced a dilemma: first-principles calculations are accurate but slow, while empirical force fields are fast but inaccurate. Machine learning is a promising way out of this dilemma, because it achieves accuracy comparable to first-principles calculations at a much lower computational cost. This review presents a general framework for constructing machine learning interatomic potentials, surveys the main modeling efforts of recent years, and discusses the advantages and disadvantages of the reviewed approaches.
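The general framework surveyed in this article writes the total potential energy as a sum of atomic contributions, each a learned function of a symmetry-invariant descriptor of that atom's local environment. The following is a toy sketch of that idea only, not the Deep Potential method or any published model: the radial descriptor, cutoff, and network sizes are invented for illustration.

```python
import numpy as np

def descriptor(coords, i, cutoff=6.0):
    """Toy local-environment descriptor for atom i: Gaussian-smeared
    neighbor distances on a radial grid, with a smooth cutoff. Depending
    only on pair distances makes it invariant under translation,
    rotation, and permutation of neighbors."""
    rij = np.linalg.norm(coords - coords[i], axis=1)
    rij = rij[(rij > 1e-8) & (rij < cutoff)]        # neighbors within cutoff
    centers = np.linspace(1.0, cutoff, 8)           # illustrative radial grid
    fc = 0.5 * (np.cos(np.pi * rij / cutoff) + 1.0) # smooth cutoff function
    return np.array([np.sum(np.exp(-(rij - c) ** 2) * fc) for c in centers])

def atomic_energy(d, params):
    """One-hidden-layer network mapping a descriptor to an atomic energy."""
    W1, b1, W2, b2 = params
    h = np.tanh(d @ W1 + b1)
    return float(h @ W2 + b2)

def total_energy(coords, params):
    """E = sum_i E_i(D_i): the additive ansatz used by most ML potentials."""
    return sum(atomic_energy(descriptor(coords, i), params)
               for i in range(len(coords)))
```

Because each atomic contribution depends only on interatomic distances within a cutoff and sums over neighbors, the predicted total energy is automatically invariant under translations, rotations, and permutations of identical atoms, which is the key structural requirement discussed in the review.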

[1] Abrams C, Bussi G. Enhanced sampling in molecular dynamics using metadynamics, replica exchange, and temperature-acceleration[J]. Entropy, 2014, 16(1):163-199.
[2] Bernardi R C, Melo M C, Schulten K. Enhanced sampling techniques in molecular dynamics simulations of biological systems[J]. Biochimica et Biophysica Acta (BBA)-General Subjects, 2015, 1850(5):872-877.
[3] Yang Y I, Shao Q, Zhang J, Yang L, Gao Y Q. Enhanced sampling in molecular dynamics[J]. The Journal of chemical physics, 2019, 151(7):070902.
[4] Born M, Oppenheimer R. Zur quantentheorie der molekeln[J]. Annalen der physik, 1927, 389(20):457-484.
[5] Schrödinger E. An undulatory theory of the mechanics of atoms and molecules[J]. Physical review, 1926, 28(6):1049.
[6] Hohenberg P, Kohn W. Inhomogeneous electron gas[J]. Physical review, 1964, 136(3B):B864.
[7] Kohn W, Sham L J. Self-consistent equations including exchange and correlation effects[J]. Physical review, 1965, 140(4A):A1133.
[8] Møller C, Plesset M. Note on an approximation treatment for many-electron systems[J]. Physical review, 1934, 46(7):618.
[9] Čížek J. On the correlation problem in atomic and molecular systems. Calculation of wavefunction components in Ursell-type expansion using quantum-field theoretical methods[J]. The Journal of Chemical Physics, 1966, 45(11):4256-4266.
[10] Car R, Parrinello M. Unified approach for molecular dynamics and density-functional theory[J]. Physical review letters, 1985, 55(22):2471.
[11] Daura X, Mark A E, Gunsteren W F V. Parametrization of aliphatic CHn united atoms of GROMOS96 force field[J]. Journal of Computational Chemistry, 1998, 19(5):535-547.
[12] Schuler L D, Daura X, Gunsteren W F V. An improved GROMOS96 force field for aliphatic hydrocarbons in the condensed phase[J]. Journal of Computational Chemistry, 2001, 22(11):1205-1218.
[13] MacKerell A, Bashford D, Bellott M, Dunbrack R, Evanseck J, Field M, Fischer S, Gao J, Guo H, Ha S, Joseph-McCarthy D, Kuchnir L, Kuczera K, Lau F, Mattos C, Michnick S, Ngo T, Nguyen D, Prodhom B, Reiher W E III, Roux B, Schlenkrich M, Smith J, Stote R, Straub J, Watanabe M, Wiórkiewicz-Kuczera J, Yin D, Karplus M. All-atom empirical potential for molecular modeling and dynamics studies of proteins[J]. The Journal of Physical Chemistry B, 1998, 102(18):3586-3616.
[14] Vanommeslaeghe K, Hatcher E, Acharya C, Kundu S, Zhong S, Shim J, Darian E, Guvench O, Lopes P, Vorobyov I, MacKerell A D Jr. CHARMM general force field: A force field for drug-like molecules compatible with the CHARMM all-atom additive biological force fields[J]. Journal of computational chemistry, 2010, 31(4):671-690.
[15] MacKerell A D, Feig M, Brooks C L III. Extending the treatment of backbone energetics in protein force fields: Limitations of gas-phase quantum mechanics in reproducing protein conformational distributions in molecular dynamics simulations[J]. Journal of computational chemistry, 2004, 25(11):1400-1415.
[16] Yu Y, Krämer A, Venable R M, Brooks B R, Klauda J B, Pastor R W. CHARMM36 Lipid Force Field with Explicit Treatment of Long-Range Dispersion:Parametrization and Validation for Phosphatidylethanolamine, Phosphatidylglycerol, and Ether Lipids[J]. Journal of Chemical Theory and Computation, 2021.
[17] Wang J, Wolf R M, Caldwell J W, Kollman P A, Case D A. Development and testing of a general amber force field[J]. Journal of computational chemistry, 2004, 25(9):1157-1174.
[18] Hornak V, Abel R, Okur A, Strockbine B, Roitberg A, Simmerling C. Comparison of multiple Amber force fields and development of improved protein backbone parameters[J]. Proteins:Structure, Function, and Bioinformatics, 2006, 65(3):712-725.
[19] Lindorff-Larsen K, Piana S, Palmo K, Maragakis P, Klepeis J L, Dror R O, Shaw D E. Improved side-chain torsion potentials for the Amber ff99SB protein force field[J]. Proteins:Structure, Function, and Bioinformatics, 2010, 78(8):1950-1958.
[20] Jorgensen W, Maxwell D, Tirado-Rives J. Development and testing of the OPLS all-atom force field on conformational energetics and properties of organic liquids[J]. Journal of the American Chemical Society, 1996, 118(45):11225-11236.
[21] Duin A C V, Dasgupta S, Lorant F, Goddard W A. ReaxFF:a reactive force field for hydrocarbons[J]. The Journal of Physical Chemistry A, 2001, 105(41):9396-9409.
[22] Chenoweth K, Duin A C V, Goddard W A. ReaxFF reactive force field for molecular dynamics simulations of hydrocarbon oxidation[J]. The Journal of Physical Chemistry A, 2008, 112(5):1040-1053.
[23] Senftle T P, Hong S, Islam M M, Kylasa S B, Zheng Y, Shin Y K, Junkermeier C, Engel-Herbert R, Janik M J, Aktulga H M, et al. The ReaxFF reactive force-field: development, applications and future directions[J]. npj Computational Materials, 2016, 2(1):1-14.
[24] Daw M S, Baskes M I. Embedded-atom method:Derivation and application to impurities, surfaces, and other defects in metals[J]. Physical Review B, 1984, 29(12):6443.
[25] Daw M S, Foiles S M, Baskes M I. The embedded-atom method:a review of theory and applications[J]. Materials Science Reports, 1993, 9(7-8):251-310.
[26] Lee B J, Baskes M. Second nearest-neighbor modified embedded-atom-method potential[J]. Physical Review B, 2000, 62(13):8564.
[27] Tchipev N, Seckler S, Heinen M, Vrabec J, Gratl F, Horsch M, Bernreuther M, Glass C W, Niethammer C, Hammer N, et al. TweTriS:Twenty trillion-atom simulation[J]. The International Journal of High Performance Computing Applications, 2019, 33(5):838-854.
[28] Behler J, Parrinello M. Generalized neural-network representation of high-dimensional potential-energy surfaces[J]. Physical review letters, 2007, 98(14):146401.
[29] Schütt K T, Arbabzadah F, Chmiela S, Müller K R, Tkatchenko A. Quantum-chemical insights from deep tensor neural networks[J]. Nature communications, 2017, 8:13890.
[30] Bartók A P, Payne M C, Kondor R, Csányi G. Gaussian approximation potentials:The accuracy of quantum mechanics, without the electrons[J]. Physical review letters, 2010, 104(13):136403.
[31] Rupp M, Tkatchenko A, Müller K R, Lilienfeld O A V. Fast and accurate modeling of molecular atomization energies with machine learning[J]. Physical review letters, 2012, 108(5):058301.
[32] Shapeev A V. Moment tensor potentials:A class of systematically improvable interatomic potentials[J]. Multiscale Modeling & Simulation, 2016, 14(3):1153-1173.
[33] Smith J S, Isayev O, Roitberg A E. ANI-1:an extensible neural network potential with DFT accuracy at force field computational cost[J]. Chemical Science, 2017, 8(4):3192-3203.
[34] Chmiela S, Tkatchenko A, Sauceda H E, Poltavsky I, Schütt K T, Müller K R. Machine learning of accurate energy-conserving molecular force fields[J]. Science Advances, 2017, 3(5):e1603015.
[35] Chmiela S, Sauceda H E, Müller K R, Tkatchenko A. Towards exact molecular dynamics simulations with machine-learned force fields[J]. Nature communications, 2018, 9(1):1-10.
[36] Yao K, Herr J E, Toth D W, Mckintyre R, Parkhill J. The TensorMol-0.1 model chemistry:a neural network augmented with long-range physics[J]. Chemical science, 2018, 9(8):2261-2269.
[37] Han J, Zhang L, Car R, et al. Deep Potential:A General Representation of a Many-Body Potential Energy Surface[J]. Communications in Computational Physics, 2018, 23(3).
[38] Zhang L, Han J, Wang H, Car R, Weinan E. Deep potential molecular dynamics:a scalable model with the accuracy of quantum mechanics[J]. Physical review letters, 2018, 120(14):143001.
[39] Zhang L, Han J, Wang H, Saidi W A, Car R, Weinan E. End-to-end symmetry preserving inter-atomic potential energy model for finite and extended systems[J]. Advances in Neural Information Processing Systems, 2018, 2018:4436-4446.
[40] Lubbers N, Smith J S, Barros K. Hierarchical modeling of molecular energies using a deep neural network[J]. The Journal of Chemical Physics, 2018, 148(24):241715.
[41] Unke O T, Meuwly M. PhysNet:a neural network for predicting energies, forces, dipole moments and partial charges[J]. Journal of chemical theory and computation, 2019.
[42] Bartok A P, Kermode J, Bernstein N, Csanyi G. Machine learning a general purpose interatomic potential for silicon[J]. arXiv preprint arXiv:1805.01568, 2018.
[43] Podryabinkin E V, Shapeev A V. Active learning of linearly parametrized interatomic potentials[J]. Computational Materials Science, 2017, 140:171-180.
[44] Smith J S, Nebgen B, Lubbers N, Isayev O, Roitberg A E. Less is more:Sampling chemical space with active learning[J]. The Journal of Chemical Physics, 2018, 148(24):241733.
[45] Uteva E, Graham R S, Wilkinson R D, Wheatley R J. Active learning in Gaussian process interpolation of potential energy surfaces[J]. The Journal of chemical physics, 2018, 149(17):174114.
[46] Gubaev K, Podryabinkin E V, Shapeev A V. Machine learning of molecular properties:Locality and active learning[J]. The Journal of Chemical Physics, 2018, 148(24):241727.
[47] Zhang L, Lin D Y, Wang H, Car R, Weinan E. Active learning of uniformly accurate interatomic potentials for materials simulation[J]. Physical Review Materials, 2019, 3(2):023804.
[48] Schran C, Behler J, Marx D. Automated Fitting of Neural Network Potentials at Coupled Cluster Accuracy:Protonated Water Clusters as Testing Ground[J]. Journal of chemical theory and computation, 2019.
[49] Jinnouchi R, Lahnsteiner J, Karsai F, Kresse G, Bokdam M. Phase Transitions of Hybrid Perovskites Simulated by Machine-Learning Force Fields Trained on the Fly with Bayesian Inference[J]. Physical Review Letters, 2019, 122(22):225701.
[50] Zhang Y, Wang H, Chen W, Zeng J, Zhang L, Wang H, Weinan E. DP-GEN:A concurrent learning platform for the generation of reliable deep learning based potential energy models[J]. Computer Physics Communications, 2020, 253:107206.
[51] Lin Q, Zhang L, Zhang Y, Jiang B. Searching Configurations in Uncertainty Space:Active Learning of High-Dimensional Neural Network Reactive Potentials[J]. Journal of Chemical Theory and Computation, 2021.
[52] Imbalzano G, Zhuang Y, Kapil V, Rossi K, Engel E A, Grasselli F, Ceriotti M. Uncertainty estimation for molecular dynamics and sampling[J]. The Journal of Chemical Physics, 2021, 154(7):074102.
[53] Gartner T E, Zhang L, Piaggi P M, Car R, Panagiotopoulos A Z, Debenedetti P G. Signatures of a liquid-liquid transition in an ab initio deep neural network model for water[J]. Proceedings of the National Academy of Sciences, 2020, 117(42):26040-26046.
[54] Zhang L, Wang H, Car R, Weinan E. Phase Diagram of a Deep Potential Water Model[J]. Physical Review Letters, 2021, 126(23):236001.
[55] Artrith N, Morawietz T, Behler J. High-dimensional neural-network potentials for multicomponent systems:Applications to zinc oxide[J]. Physical Review B, 2011, 83(15):153101.
[56] Bereau T, Andrienko D, von Lilienfeld O A. Transferable atomic multipole machine learning models for small organic molecules[J]. Journal of Chemical Theory and Computation, 2015.
[57] Bereau T, Jr R A D, Tkatchenko A, Lilienfeld O A V. Non-covalent interactions across organic and biological subsets of chemical space:Physics-based potentials parametrized from machine learning[J]. The Journal of Chemical Physics, 2018, 148(24):241706.
[58] Nebgen B, Lubbers N, Smith J S, Sifain A, Lokhov A, Isayev O, Roitberg A, Barros K, Tretiak S. Transferable Molecular Charge Assignment Using Deep Neural Networks[J]. arXiv preprint arXiv:1803.04395, 2018.
[59] Sifain A E, Lubbers N, Nebgen B T, Smith J S, Lokhov A Y, Isayev O, Roitberg A E, Barros K, Tretiak S. Discovering a transferable charge assignment model using machine learning[J]. The journal of physical chemistry letters, 2018, 9(16):4495-4501.
[60] Ghasemi S A, Hofstetter A, Saha S, Goedecker S. Interatomic potentials for ionic systems with density functional accuracy based on charge densities obtained by a neural network[J]. Physical Review B, 2015, 92(4):045131.
[61] Ko T W, Finkler J A, Goedecker S, Behler J. A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer[J]. Nature communications, 2021, 12(1):1-11.
[62] Ko T W, Finkler J A, Goedecker S, Behler J. General-Purpose Machine Learning Potentials Capturing Nonlocal Charge Transfer[J]. Accounts of Chemical Research, 2021, 54(4):808-817.
[63] Grisafi A, Ceriotti M. Incorporating long-range physics in atomic-scale machine learning[J]. The Journal of chemical physics, 2019, 151(20):204105.
[64] Grisafi A, Nigam J, Ceriotti M. Multi-scale approach for the prediction of atomic scale properties[J]. Chemical Science, 2021, 12(6):2078-2090.
[65] Mahoney M W, Drineas P. CUR matrix decompositions for improved data analysis[J]. Proceedings of the National Academy of Sciences, 2009, 106(3):697-702.
[66] Imbalzano G, Anelli A, Giofré D, Klees S, Behler J, Ceriotti M. Automatic selection of atomic fingerprints and reference configurations for machine-learning potentials[J]. The Journal of chemical physics, 2018, 148(24):241730.
[67] Krizhevsky A, Sutskever I, Hinton G E. Imagenet classification with deep convolutional neural networks[J]. Advances in neural information processing systems, 2012, 25:1097-1105.
[68] Wolf T, Chaumond J, Debut L, Sanh V, Delangue C, Moi A, Cistac P, Funtowicz M, Davison J, Shleifer S, et al. Transformers:State-of-the-art natural language processing[C]. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing:System Demonstrations, 2020, 38-45.
[69] Senior A W, Evans R, Jumper J, Kirkpatrick J, Sifre L, Green T, Qin C, Žídek A, Nelson A W, Bridgland A, et al. Improved protein structure prediction using potentials from deep learning[J]. Nature, 2020, 1-5.
[70] LeCun Y, Bengio Y, Hinton G. Deep learning[J]. Nature, 2015, 521(7553):436-444.
[71] Hahnloser R H, Sarpeshkar R, Mahowald M A, Douglas R J, Seung H S. Digital selection and analogue amplification coexist in a cortex-inspired silicon circuit[J]. Nature, 2000, 405(6789):947-951.
[72] He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition[C]. In Proceedings of the IEEE conference on computer vision and pattern recognition, 2016, 770-778.
[73] Barron A R. Universal approximation bounds for superpositions of a sigmoidal function[J]. IEEE Transactions on Information theory, 1993, 39(3):930-945.
[74] Barron A R. Approximation and estimation bounds for artificial neural networks[J]. Machine learning, 1994, 14(1):115-133.
[75] Liang S, Srikant R. Why deep neural networks for function approximation?[J]. arXiv preprint arXiv:1610.04161, 2016.
[76] Telgarsky M. Benefits of depth in neural networks[C]. In Conference on Learning Theory, 2016, 1517-1539.
[77] Yarotsky D. Error bounds for approximations with deep ReLU networks[J]. Neural Networks, 2017, 94:103-114.
[78] Lu J, Shen Z, Yang H, Zhang S. Deep network approximation for smooth functions[J]. arXiv preprint arXiv:2001.03040, 2020.
[79] E W, Ma C, Wu L. A priori estimates of the population risk for two-layer neural networks[J]. Communications in Mathematical Sciences, 2019, 17(5):1407-1425.
[80] E W, Ma C, Wu L. The Barron Space and the Flow-Induced Function Spaces for Neural Network Models[J]. Constructive Approximation, 2021, 1-38.
[81] Behler J. Perspective:Machine learning potentials for atomistic simulations[J]. The Journal of Chemical Physics, 2016, 145(17):170901.
[82] Hajinazar S, Shao J, Kolmogorov A N. Stratified construction of neural network based interatomic models for multicomponent materials[J]. Physical Review B, 2017, 95(1):014114.
[83] Singraber A, Behler J, Dellago C. Library-Based LAMMPS Implementation of High-Dimensional Neural Network Potentials[J]. Journal of chemical theory and computation, 2019, 15(3):1827-1840.
[84] Lee K, Yoo D, Jeong W, Han S. SIMPLE-NN:An efficient package for training and executing neural-network interatomic potentials[J]. Computer Physics Communications, 2019, 242:95-103.
[85] Himanen L, Jäger M O, Morooka E V, Canova F F, Ranawat Y S, Gao D Z, Rinke P, Foster A S. DScribe:Library of descriptors for machine learning in materials science[J]. Computer Physics Communications, 2020, 247:106949.
[86] Bartók A P, Kondor R, Csányi G. On representing chemical environments[J]. Physical Review B, 2013, 87(18):184115.
[87] Schütt K T, Sauceda H E, Kindermans P J, Tkatchenko A, Müller K R. SchNet-A deep learning architecture for molecules and materials[J]. The Journal of Chemical Physics, 2018, 148(24):241722.
[88] Schütt K, Kindermans P J, Felix H E S, Chmiela S, Tkatchenko A, Müller K R. SchNet:A continuous-filter convolutional neural network for modeling quantum interactions[C]. In Advances in Neural Information Processing Systems, 2017, 991-1001.
[89] Pozdnyakov S N, Willatt M J, Bartók A P, Ortner C, Csányi G, Ceriotti M. On the Completeness of Atomic Structure Representations[J]. arXiv preprint arXiv:2001.11696, 2020.
[90] Ruddigkeit L, Deursen R V, Blum L C, Reymond J L. Enumeration of 166 billion organic small molecules in the chemical universe database GDB-17[J]. Journal of chemical information and modeling, 2012, 52(11):2864-2875.
[91] Ramakrishnan R, Dral P O, Rupp M, Lilienfeld O A V. Quantum chemistry structures and properties of 134 kilo molecules[J]. Scientific data, 2014, 1(1):1-7.
[92] Deng J, Dong W, Socher R, Li L J, Li K, Fei-Fei L. Imagenet:A large-scale hierarchical image database[C]. In 2009 IEEE conference on computer vision and pattern recognition, Ieee, 2009, 248-255.
[93] Grisafi A, Wilkins D M, Csányi G, Ceriotti M. Symmetry-Adapted Machine Learning for Tensorial Properties of Atomistic Systems[J]. Physical Review Letters, 2018, 120(3):036002.
[94] Pereira F, de Sousa J A. Machine learning for the prediction of molecular dipole moments obtained by density functional theory[J]. Journal of cheminformatics, 2018, 10(1):43.
[95] Glick Z L, Koutsoukas A, Cheney D L, Sherrill C D. Cartesian message passing neural networks for directional properties:Fast and transferable atomic multipoles[J]. The Journal of Chemical Physics, 2021, 154(22):224103.
[96] Zhang L, Chen M, Wu X, Wang H, Weinan E, Car R. Deep neural network for the dielectric response of insulators[J]. Physical Review B, 2020, 102(4):041121.
[97] Wilkins D M, Grisafi A, Yang Y, Lao K U, DiStasio R A, Ceriotti M. Accurate molecular polarizabilities with coupled cluster theory and machine learning[J]. Proceedings of the National Academy of Sciences, 2019, 116(9):3401-3406.
[98] Raimbault N, Grisafi A, Ceriotti M, Rossi M. Using Gaussian process regression to simulate the vibrational Raman spectra of molecular crystals[J]. New Journal of Physics, 2019, 21(10):105001.
[99] Sommers G M, Andrade M F C, Zhang L, Wang H, Car R. Raman spectrum and polarizability of liquid water from deep neural networks[J]. Physical Chemistry Chemical Physics, 2020, 22(19):10592-10602.
[100] Brockherde F, Vogt L, Li L, Tuckerman M E, Burke K, Müller K R. Bypassing the Kohn-Sham equations with machine learning[J]. Nature communications, 2017, 8(1):872.
[101] Ryczko K, Strubbe D A, Tamblyn I. Deep learning and density-functional theory[J]. Physical Review A, 2019, 100(2):022512.
[102] Borda E J L, Samanta A. A practical approach to Hohenberg-Kohn maps based on many-body correlations:learning the electronic density[J]. arXiv preprint arXiv:2004.14442, 2020.
[103] Chandrasekaran A, Kamal D, Batra R, Kim C, Chen L, Ramprasad R. Solving the electronic structure problem with machine learning[J]. npj Computational Materials, 2019, 5(1):22.
[104] Zepeda-Núñez L, Chen Y, Zhang J, Jia W, Zhang L, Lin L. Deep Density:circumventing the Kohn-Sham equations via symmetry preserving neural networks[J]. arXiv preprint arXiv:1912.00775, 2019.
[105] Bogojeski M, Brockherde F, Vogt-Maranto L, Li L, Tuckerman M E, Burke K, Müller K R. Efficient prediction of 3D electron densities using machine learning[J]. arXiv preprint arXiv:1811.06255, 2018.
[106] Fabrizio A, Grisafi A, Meyer B, Ceriotti M, Corminboeuf C. Electron density learning of noncovalent systems[J]. Chemical science, 2019, 10(41):9424-9432.
[107] Zhang L, Han J, Wang H, Car R, E W. DeePCG:Constructing coarse-grained models via deep neural networks[J]. The Journal of chemical physics, 2018, 149(3):034101.
[108] Wang W, Gómez-Bombarelli R. Coarse-graining auto-encoders for molecular dynamics[J]. npj Computational Materials, 2019, 5(1):1-9.
[109] Wang J, Olsson S, Wehmeyer C, Pérez A, Charron N E, De Fabritiis G, Noé F, Clementi C. Machine learning of coarse-grained molecular dynamics force fields[J]. ACS central science, 2019, 5(5):755-767.