Learning Molecular Dynamics with Simple Language Model built upon Long Short-Term Memory Neural Network.
2020
Recurrent neural networks (RNNs) have led to breakthroughs in natural language processing and speech recognition, and hundreds of millions of people use such tools daily through smartphones, email servers, and other avenues. In this work, we show that such RNNs, specifically Long Short-Term Memory (LSTM) neural networks, can also capture the temporal evolution of typical trajectories arising in chemical and biological physics. Specifically, we use a character-level language model based on an LSTM, which learns a probabilistic model from 1-dimensional stochastic trajectories generated from molecular dynamics simulations of a higher-dimensional system. We show that the model not only captures the Boltzmann statistics of the system but also reproduces kinetics across a large spectrum of timescales. We demonstrate how the embedding layer, originally introduced to represent the contextual meaning of words or characters, here exhibits a nontrivial connectivity between different metastable states in the underlying physical system. We demonstrate the reliability of our model and interpretations through different benchmark systems. We anticipate that our work represents a stepping stone in the use of RNNs for modeling and predicting the dynamics of complex stochastic molecular systems.
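To make the setup concrete, the sketch below shows the data-preparation step the abstract describes: a continuous 1-dimensional trajectory is discretized into a sequence of integer "characters", and sliding windows of that sequence form (context, next-character) training pairs for a character-level language model. This is a hedged illustration, not the authors' code; the function names, the bin count, the window length, and the toy two-state trajectory are all assumptions for demonstration, and the LSTM training itself is omitted.

```python
import numpy as np

def discretize(traj, n_states=3):
    # Bin a continuous 1-D trajectory into integer "characters" (0..n_states-1).
    edges = np.linspace(traj.min(), traj.max(), n_states + 1)[1:-1]
    return np.digitize(traj, edges)

def make_training_pairs(symbols, seq_len=4):
    # Sliding windows: each window of seq_len characters is the context,
    # and the character immediately after it is the prediction target,
    # as in standard character-level language modeling.
    X = np.array([symbols[i:i + seq_len] for i in range(len(symbols) - seq_len)])
    y = symbols[seq_len:]
    return X, y

# Hypothetical toy trajectory: noisy hopping between two metastable states.
rng = np.random.default_rng(0)
traj = np.concatenate([rng.normal(-1.0, 0.1, 500), rng.normal(1.0, 0.1, 500)])
sym = discretize(traj, n_states=3)
X, y = make_training_pairs(sym, seq_len=4)
print(X.shape, y.shape)  # one (context, target) pair per window
```

An LSTM trained on such (X, y) pairs learns the transition probabilities between the discretized states, which is how the model can recover both equilibrium (Boltzmann) statistics and kinetics from the symbol stream.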