marcel.science / gknet
Heat flux for semilocal machine-learning potentials

Hello! šŸ‘‹ This website is a supplement to our paper (see below), which extends the Green-Kubo method to semi-local machine-learning potentials. Here, we present an overview of the data, software, and other supplementary materials created for this project.

NOTE: The methods developed in this work have recently been implemented in JAX: the glp package implements the heat flux (and other derivative quantities) for local and semi-local potentials, and gkx puts it all together to run full Green-Kubo workflows.

Paper

Please contact Marcel or Florian for questions about the paper. You can also say hi on twitter: šŸ¦ marceldotsci and šŸ¦ flokno_phys.

Results & Data

Software

The gknet-archive contains the implementations of all heat flux variations investigated in the paper. However, this infrastructure is based on a previous version of schnetpack, and we have decided not to develop it further. Instead, a more general implementation that no longer relies on a particular potential is in development and will be released soon.

Summary

The thermal conductivity quantifies how well heat can be transported through a material. Predicting it for new materials through simulation, rather than experiment, is of great research interest: it avoids having to synthesise new materials just to check their thermal properties, and it helps explain the mechanisms behind thermal transport.

The Green-Kubo (GK) method is an approach to compute the thermal conductivity of materials. It is based on the idea of observing fluctuations of the energy density during molecular dynamics simulations, which are quantified by the heat flux. While it is possible to run the molecular dynamics simulations with density-functional theory (DFT) alone, this is computationally very expensive and limits the applicability of the method. Machine-learning potentials, which approximate the potential-energy surface of DFT, promise to combine the accuracy of DFT with linear-scaling computational cost.
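For orientation, the GK relation expresses the thermal conductivity through the time-autocorrelation of the heat flux J. In one common convention (prefactors differ between conventions, depending on how J is normalised):

```latex
\kappa^{\alpha\beta}
  = \frac{V}{k_\mathrm{B} T^{2}}
    \int_{0}^{\infty} \left\langle J^{\alpha}(t)\, J^{\beta}(0) \right\rangle \mathrm{d}t
```

where V is the simulation-cell volume, T the temperature, and k_B the Boltzmann constant.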

Many different machine-learning potentials have been proposed, and some have already been used with the GK method. Recently, models based on message-passing neural networks have become a standard choice. In this work, we provide the missing piece needed to use such models with the GK method: we derive, implement, and test a formulation of the heat flux that applies to this type of potential.
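As a rough illustration of what ā€œsemi-localā€ means here, the toy sketch below shows a generic message-passing update (not SchNet's actual architecture; all names are placeholders): a single step only mixes information between atoms within the model's cutoff, but stacking M such steps lets an atom's energy depend on atoms up to roughly M times the cutoff away.

```python
import jax.numpy as jnp

def message_passing_step(features, pairs, weights):
    """One generic message-passing update (illustrative, not SchNet).

    features: (n_atoms, n_features) per-atom feature vectors
    pairs:    (n_pairs, 2) integer array of (receiver, sender) atoms
              that lie within the model's cutoff of each other
    weights:  (n_features, n_features) learned mixing matrix
    """
    messages = jnp.zeros_like(features)
    # each atom sums the features of its neighbours within the cutoff
    messages = messages.at[pairs[:, 0]].add(features[pairs[:, 1]])
    # after M stacked steps, information has travelled up to M * cutoff
    return jnp.tanh((features + messages) @ weights)
```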

The heat flux is a challenging quantity to define and implement correctly, for both conceptual and technical reasons. In this paper, we go back to the fundamental definition of the heat flux, re-derive a standard expression, and then provide different ways to implement the result with automatic differentiation (AD). One particular difficulty is dealing with the semi-local interactions introduced by message passing: atoms can then interact beyond their local atomic environments, which standard formulations of the heat flux do not account for. Trying to adapt these formulations leads to quadratic scaling of the computational cost.
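To give a flavour of the AD route (a minimal sketch for a toy, non-periodic Lennard-Jones cluster, not the formulation derived in the paper): the potential term of a Hardy-type heat flux, J_pot = Σ_i r_i dE_i/dt with dE_i/dt = Σ_j ∂E_i/∂r_j Ā· v_j, can be contracted with the velocities in a single Jacobian-vector product instead of forming the quadratically large Jacobian ∂E_i/∂r_j.

```python
import jax
import jax.numpy as jnp

def atomic_energies(positions):
    """Toy per-atom energies of a non-periodic Lennard-Jones cluster
    (epsilon = sigma = 1, pair energy split evenly between partners).
    Stands in for the per-atom energies of an ML potential."""
    n = positions.shape[0]
    diff = positions[:, None, :] - positions[None, :, :]
    r2 = jnp.where(jnp.eye(n, dtype=bool), 1.0, (diff ** 2).sum(-1))
    inv6 = 1.0 / r2 ** 3
    pair = 4.0 * (inv6 ** 2 - inv6) * (1.0 - jnp.eye(n))  # no self-interaction
    return 0.5 * pair.sum(axis=1)                          # E_i, shape (n_atoms,)

def potential_heat_flux(positions, velocities):
    """Potential term of a Hardy-type heat flux, sum_i r_i * dE_i/dt.
    A single jvp contracts dE_i/dr_j with the velocities, avoiding the
    quadratically large Jacobian dE_i/dr_j."""
    _, de_dt = jax.jvp(atomic_energies, (positions,), (velocities,))
    return (positions * de_dt[:, None]).sum(axis=0)

# tiny 2x2x2 cluster near the LJ equilibrium spacing, with random velocities
pos = 1.2 * jnp.stack(jnp.meshgrid(*[jnp.arange(2.0)] * 3), -1).reshape(-1, 3)
vel = 0.1 * jax.random.normal(jax.random.PRNGKey(0), pos.shape)
print(potential_heat_flux(pos, vel))   # -> (3,) heat flux vector
```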

We show that an alternative way to compute the heat flux, which is based on ā€œunfoldingā€ the simulation cell rather than using periodic boundary conditions, applies to semi-local interactions and can be implemented efficiently with AD. This is then verified by investigating zirconium dioxide (zirconia) with the SchNet neural network.
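The ā€œunfoldingā€ idea can be sketched roughly as follows (an illustrative sketch assuming an orthorhombic cell and hypothetical numbers; the implementation in the paper is more general): atoms from neighbouring periodic images that can still interact with atoms in the original cell are added as explicit replicas, so that interactions across the boundary become ordinary terms and no minimum-image convention is needed.

```python
import numpy as np

def unfold(positions, cell_lengths, cutoff):
    """Add explicit replicas of atoms from neighbouring periodic images that
    lie within `cutoff` of the original (orthorhombic) box. Returns the
    unfolded positions and, for each one, the index of the original atom it
    replicates. Illustrative sketch; assumes positions are wrapped into the box."""
    unfolded = [positions]
    parent = [np.arange(len(positions))]
    for shift in np.ndindex(3, 3, 3):           # the 26 neighbouring images
        s = np.array(shift) - 1
        if not s.any():
            continue                            # skip the original image
        replicas = positions + s * cell_lengths
        # keep only replicas that reach into the padded box [-cutoff, L + cutoff]
        inside = np.all((replicas > -cutoff) & (replicas < cell_lengths + cutoff), axis=1)
        unfolded.append(replicas[inside])
        parent.append(np.flatnonzero(inside))
    return np.concatenate(unfolded), np.concatenate(parent)

# example: 4 atoms in a 10 x 10 x 10 box with a 2.5 cutoff (hypothetical numbers)
pos = np.array([[0.5, 5.0, 5.0], [9.8, 5.0, 5.0], [5.0, 5.0, 5.0], [5.0, 0.2, 9.9]])
ext, idx = unfold(pos, np.array([10.0, 10.0, 10.0]), 2.5)
print(len(ext), idx)
```

For a message-passing model with M interaction steps, the relevant interaction range is roughly M times the cutoff, so the padding, and with it the number of replicas, grows accordingly.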