Stress and heat flux via automatic differentiation

Hello! 👋 This website is a supplement to our preprint (see below) discussing how to compute derivative quantities for machine-learning potentials using automatic differentiation. It is accompanied by an example implementation in jax, which we call glp.


Please contact Marcel for questions about the paper. You can also say hi on twitter: 🐦 marceldotsci, 🐦 frankthorben, and 🐦 flokno_phys.

Results & Data



Machine-learning interatomic potentials (MLPs) approximate potential-energy surfaces, which would otherwise require very expensive "first-principles" calculations, in a computationally efficient way: A few such expensive calculations are used as training data, and the trained models then replace the first-principles methods. Such models are being developed rapidly, often in machine-learning frameworks like pytorch, tensorflow, and jax, which support automatic differentiation (AD).

To be used in practice, the derivatives of MLPs, such as forces, stress, and heat flux, are often required. In principle, they can be obtained with AD, but in practice, doing so is not always straightforward. This work provides a general overview of how to do it: From the perspective of AD, many recently developed models can be seen as a simple mapping between a graph structure and a set of potential energies, and once this abstract description is in place, unified formulations can be written down and implemented. Using jax, we do this in the glp package.
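To illustrate the idea, here is a minimal sketch (not glp's actual API) of how AD yields forces and stress from nothing but an energy function. The `energy` below is a toy placeholder, not one of the MLPs discussed; the stress is obtained with one common formulation, by differentiating the energy with respect to a strain deformation of the positions at zero strain:

```python
import jax
import jax.numpy as jnp

def energy(positions):
    # toy "potential": sum of squared pair distances (placeholder model,
    # standing in for any differentiable positions -> energy mapping)
    diffs = positions[:, None, :] - positions[None, :, :]
    return jnp.sum(diffs ** 2)

# Forces: one call to jax.grad, F = -dE/dR.
forces = jax.grad(lambda R: -energy(R))

# Stress (unnormalized convention for this sketch): differentiate the
# energy with respect to a strain that deforms all positions,
# R -> R @ (I + strain), evaluated at zero strain and divided by volume.
def stress(positions, volume):
    def strained_energy(strain):
        return energy(positions @ (jnp.eye(3) + strain))
    return jax.grad(strained_energy)(jnp.zeros((3, 3))) / volume

R = jnp.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
F = forces(R)        # forces on both atoms
S = stress(R, 1.0)   # 3x3 stress tensor (unit volume for the sketch)
```

The point of the abstraction in the text is that once any model fits this positions-to-energy (or graph-to-energies) mold, the same derivative machinery applies unchanged.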

We then verify the implementation against the Lennard-Jones potential, for which analytical derivatives are easily obtained, and test our framework with a state-of-the-art neural network potential, so3krates, on tin selenide, calculating its thermal conductivity and cohesive properties.
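A sketch of the kind of check described above: for a Lennard-Jones pair, the force has a closed form, so the AD result can be compared against it directly (a minimal one-dimensional setup, with no cutoff or periodic boundaries, and not the paper's actual validation code):

```python
import jax
import jax.numpy as jnp

def lj_energy(r, epsilon=1.0, sigma=1.0):
    # Lennard-Jones pair energy: E(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6)
    s6 = (sigma / r) ** 6
    return 4.0 * epsilon * (s6 ** 2 - s6)

def lj_force_analytical(r, epsilon=1.0, sigma=1.0):
    # closed form: F(r) = -dE/dr = 24*eps*(2*(sigma/r)^12 - (sigma/r)^6)/r
    s6 = (sigma / r) ** 6
    return 24.0 * epsilon * (2.0 * s6 ** 2 - s6) / r

# AD recovers the same force from the energy alone.
lj_force_ad = jax.grad(lambda r: -lj_energy(r))

r = jnp.asarray(1.3)
```

As a sanity check, both expressions should also vanish at the potential minimum, r = 2^(1/6) * sigma, where the energy equals -epsilon.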