marcel.science / glp
Stress and heat flux via automatic differentiation
Hello! 👋 This website is a supplement to our preprint (see below) discussing how to compute derivative quantities for machine-learning potentials using automatic differentiation. It is accompanied by an example implementation in jax, which we call glp.
Paper
- Title: Stress and heat flux via automatic differentiation
- Preprint: arXiv:2305.01401 (2023)
- Abstract: Machine-learning potentials provide computationally efficient and accurate approximations of the Born-Oppenheimer potential energy surface. This potential determines many materials properties and simulation techniques usually require its gradients, in particular forces and stress for molecular dynamics, and heat flux for thermal transport properties. Recently developed potentials feature high body order and can include equivariant semi-local interactions through message-passing mechanisms. Due to their complex functional forms, they rely on automatic differentiation (AD), overcoming the need for manual implementations or finite-difference schemes to evaluate gradients. This study demonstrates a unified AD approach to obtain forces, stress, and heat flux for such potentials, and provides a model-independent implementation. The method is tested on the Lennard-Jones potential, and then applied to predict cohesive properties and thermal conductivity of tin selenide using an equivariant message-passing neural network potential.
Please contact Marcel for questions about the paper. You can also say hi on twitter: 🐦 marceldotsci, 🐦 frankthorben, and 🐦 flokno_phys.
Results & Data
- glp-archive: Supporting data repository for the paper, containing software, the models used, input files for production runs, resulting data, plotting scripts, as well as all first-principles calculations, including those for training data.
Software
- glp: jax-based implementation of the results in the paper. Implements forces, stress, and heat flux for any potential with the signature of a function mapping a graph to potential energies (see the sketch after this list).
- gkx: Tool for running Green-Kubo workflows with jax, in particular the required equilibrium molecular dynamics simulations.
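To make that signature concrete, here is a minimal sketch of a potential in this form. The "graph" here is simplified to a flat array of edge vectors, and all names are our own illustrative assumptions, not the actual glp API (the real graph also carries neighbor indices and masks):

```python
import jax.numpy as jnp

# Hypothetical, simplified "graph": a (n_edges, 3) array of displacement
# vectors between neighboring atoms, standing in for the full graph structure.
def pair_potential(edges, epsilon=1.0, sigma=1.0):
    # Lennard-Jones energy per edge; summing over edges (and halving for
    # double counting, if edges are directed) gives the total energy.
    r = jnp.linalg.norm(edges, axis=-1)
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)
```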
Summary
Machine-learning interatomic potentials (MLPs) aim to approximate potential-energy surfaces, which would otherwise require very expensive computational methods, in a more efficient way: a few such expensive "first-principles" calculations are used as training data, and the trained models then replace the first-principles methods. Such models are being developed rapidly, often in machine-learning frameworks like pytorch, tensorflow, and jax, which support automatic differentiation (AD).
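For instance, since forces are the negative gradient of the total energy with respect to positions, AD frameworks make them available essentially in one line. A minimal sketch with a toy all-pairs Lennard-Jones energy (ε = σ = 1, no cutoff, no periodic boundary conditions); the function names are our own:

```python
import jax
import jax.numpy as jnp

def total_energy(positions):
    # Toy all-pairs Lennard-Jones energy (epsilon = sigma = 1).
    diffs = positions[:, None, :] - positions[None, :, :]
    r2 = jnp.sum(diffs ** 2, axis=-1)
    mask = ~jnp.eye(positions.shape[0], dtype=bool)
    # "Double where" keeps the gradient free of NaNs from the r = 0 diagonal.
    r2_safe = jnp.where(mask, r2, 1.0)
    inv_r6 = jnp.where(mask, 1.0 / r2_safe ** 3, 0.0)
    # Factor 2 instead of 4 compensates for counting each pair twice.
    return jnp.sum(2.0 * (inv_r6 ** 2 - inv_r6))

positions = jnp.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
forces = -jax.grad(total_energy)(positions)  # shape (2, 3)
```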
To be used in practice, derivatives of MLPs, such as forces, stress, and heat flux, are often required. In principle they can be obtained with AD, but in practice doing so is not always straightforward. This work provides a general overview of how to do it: we note that, from the perspective of AD, many recently developed models can be seen as a simple mapping between a graph structure and a set of potential energies, and once this abstract description is in hand, unified formulations can be written down and implemented. Using jax, we do this in the glp package.
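As one example of such a derivative quantity beyond forces, the stress can be obtained by differentiating the energy with respect to a symmetric deformation of all positions. The sketch below shows the textbook strain-derivative idea for a finite system, with names of our own choosing; periodic systems also require deforming the cell and displacement vectors, details that glp takes care of:

```python
import jax
import jax.numpy as jnp

def stress(total_energy, positions, volume):
    # sigma = (1/V) dE/d(strain) at zero strain, with positions deformed
    # as r -> (1 + strain) @ r. Cell deformation for periodic systems
    # is omitted here.
    def strained_energy(strain):
        deformed = positions @ (jnp.eye(3) + strain).T
        return total_energy(deformed)
    return jax.grad(strained_energy)(jnp.zeros((3, 3))) / volume
```

This reuses the toy `total_energy` from the previous sketch; any differentiable energy function of positions would work in its place.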
We verify the implementation against the Lennard-Jones potential, for which analytical derivatives are easy to obtain, and then test our framework with a state-of-the-art neural network potential, so3krates, on tin selenide, computing its thermal conductivity and cohesive properties.
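The Lennard-Jones check boils down to comparing AD derivatives with closed-form ones. A minimal illustration of that idea for a dimer (ε = σ = 1), not the actual test suite:

```python
import jax
import jax.numpy as jnp

def lj_energy(r):
    return 4.0 * (r ** -12 - r ** -6)

def lj_force(r):
    # Analytical force, F = -dE/dr.
    return 4.0 * (12.0 * r ** -13 - 6.0 * r ** -7)

distances = jnp.linspace(1.0, 2.5, 16)
force_ad = jax.vmap(lambda r: -jax.grad(lj_energy)(r))(distances)
print(jnp.allclose(force_ad, lj_force(distances), atol=1e-6))  # True
```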