marcel.science / home
Hello! 👋 I’m Marcel, a postdoc in the COSMO Lab at EPFL in Lausanne, Switzerland. Previously, I was a doctoral student at the Fritz Haber Institute (in the NOMAD Laboratory) and TU Berlin (in the Machine Learning group).
My current research focuses on the design, implementation, and application of machine-learning interatomic potentials, in particular for materials science. I've worked on a few different aspects of this overall topic, starting with representations of molecules and materials (repbench), through thermal transport with message-passing neural networks (gknet, glp), to the investigation of the role of physical priors like equivariance (eqt) and derivative-based forces (nc). Recently, I've been focusing on including long-range effects in these models (lorem). I'm also currently dabbling in closing the loop to predict experimental observables... ⚗️
Most of what I do is build computer programs that simulate the movement of atoms: The kind of simulations chemists, physicists, and materials scientists use to understand and design everything from batteries to thermal insulators to new drugs. The difficulty here is this: You can either solve the underlying quantum mechanics directly, which is very accurate but also very expensive (even tiny systems for short times need a supercomputer), or you can use a hand-crafted approximation (a "force field") that runs fast but encodes a lot of guesses about how atoms will behave. Neither is great if you actually want to predict, say, how heat flows through a real material at a real temperature, or if you want to look at materials that have never been studied before. I work on using machine learning to bridge these extremes: Train a model on a relatively small set of expensive quantum calculations, and then let it stand in for them. This is much cheaper, almost as accurate, and (crucially) flexible enough to keep getting better as you feed it more data. My concrete everyday work involves mostly prototyping new models, arguing with people about derivatives, and trying to write software that is somewhere in the sweet spot between "well-engineered/efficient enough to work" and "flexible/hackable enough for research".
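To make the "simulate the movement of atoms" part concrete, here is a minimal sketch in plain numpy: a hand-crafted Lennard-Jones potential standing in for whatever supplies energies and forces (a quantum calculation, a force field, or a trained ML surrogate), driven by a velocity-Verlet integrator. This is a generic textbook illustration, not code from any of the projects mentioned here; all names are made up for the example.

```python
import numpy as np

def lj_energy_forces(positions, epsilon=1.0, sigma=1.0):
    """Lennard-Jones energy and analytic forces for a small cluster.

    This plays the role that a quantum calculation or an ML surrogate
    plays in practice: positions in, energy and forces out.
    """
    n = len(positions)
    energy = 0.0
    forces = np.zeros_like(positions)
    for i in range(n):
        for j in range(i + 1, n):
            rij = positions[i] - positions[j]
            r = np.linalg.norm(rij)
            sr6 = (sigma / r) ** 6
            energy += 4 * epsilon * (sr6**2 - sr6)
            # dE/dr, projected onto the bond direction: F = -dE/dr * r_hat
            dEdr = 4 * epsilon * (-12 * sr6**2 + 6 * sr6) / r
            fij = -dEdr * rij / r
            forces[i] += fij
            forces[j] -= fij
    return energy, forces

def velocity_verlet(positions, velocities, masses, dt, steps):
    """Integrate Newton's equations with the velocity-Verlet scheme."""
    _, forces = lj_energy_forces(positions)
    for _ in range(steps):
        velocities += 0.5 * dt * forces / masses[:, None]
        positions += dt * velocities
        _, forces = lj_energy_forces(positions)
        velocities += 0.5 * dt * forces / masses[:, None]
    return positions, velocities
```

Swapping `lj_energy_forces` for an ML model trained on quantum data is, in caricature, the whole programme: the integrator does not care where the forces come from, only that they are accurate and consistent with the energy.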
Here are some other places on the internet where you can find me:
- Twitter or Bluesky for banter and updates,
- notes.marcel.science for some technical notes,
- Gitlab and Github for code,
- Google Scholar or ORCID for official scientific activity,
- marcel.computer for my personal (online) life!
If you have any questions, or just want to say hi, you should be able to reach me on Twitter, or at mail@marcel.science.
Thanks for stopping by! 🚀
Publications
- Title: Learning long-range representations with equivariant messages
- Published: Transactions on Machine Learning Research (TMLR 2026)
- Preprint: arxiv:2507.19382 (2025)
- Title: Simultaneous learning of static and dynamic charges
- Preprint: arxiv:2601.03656 (2026)
- Title: Metatensor and metatomic: foundational libraries for interoperable atomistic machine learning
- Preprint: arxiv:2508.15704 (2025)
- Title: Roadmap on advancements of the FHI-aims software package
- Preprint: arxiv:2505.00125 (2025)
- Title: The dark side of the forces: assessing non-conservative force models for atomistic machine learning
- Published: Oral presentation at the Forty-second International Conference on Machine Learning (ICML 2025)
- Preprint: arxiv:2412.11569 (2024)
- Title: Fast and flexible long-range models for atomistic machine learning
- Published: J. Chem. Phys. 162, 142501 (2025)
- DOI: 10.1063/5.0251713
- Preprint: arxiv:2412.03281 (2024)
- Code: torch-pme and jax-pme
- Title: Probing the effects of broken symmetries in machine learning
- Published: Mach. Learn.: Sci. Technol. 5 04LT01 (2024)
- DOI: 10.1088/2632-2153/ad86a0
- Preprint: arxiv:2406.17747 (2024)
- Title: Machine learning for atomistic modeling: representations and thermal transport (Dissertation)
- DOI: 10.14279/depositonce-18647 (2023)
- Title: Stress and heat flux via automatic differentiation
- Published: J. Chem. Phys. 159, 174105 (2023)
- DOI: 10.1063/5.0155760
- Preprint: arxiv:2305.01401 (2023)
- Code: glp
- Title: Heat flux for semilocal machine-learning potentials
- Published: Phys. Rev. B 108, L100302 (2023)
- DOI: 10.1103/PhysRevB.108.L100302
- Preprint: arxiv:2303.14434 (2023)
- Title: Representations of molecules and materials for interpolation of quantum-mechanical simulations via machine learning
- Published: npj Comput. Mater. 8, 41 (2022)
- DOI: 10.1038/s41524-022-00721-x
- Preprint: arxiv:2003.12081 (2020)
Note that * indicates shared first authorship.
Projects
These are the projects I’m working on (or have worked on):
- lorem (2025-now): Developing a long-range message passing scheme and model,
- marathon (2025-now): Modular training infrastructure for machine-learning interatomic potentials in jax,
- torch-pme (2024-now) & jax-pme (2024-now): Fast, modular, and differentiable implementation of long-range interactions,
- eqt (2024) & nc (2024-now): Testing to what extent breaking physical constraints such as rotational invariance (eqt) or energy conservation (nc) is useful in machine-learning models,
- gkx (2023-now): Green-Kubo molecular dynamics in jax,
- glp (2023-now): jax-based implementation of stress and heat flux for graph-based machine-learning potentials,
- gknet (2020-2023): Adaptation of the Green-Kubo method for computing thermal conductivities to message-passing graph neural networks, and application to zirconium dioxide, a strongly anharmonic material,
- cmlkit (2018-2020): Minimalist, modular framework to specify, optimise, and evaluate machine learning models for condensed matter physics, developed largely for the repbench project,
- repbench (2018-2022): Review and benchmark of different representations of molecules and materials, evaluating them as features for regression,
- doctor (2017-2023): My doctoral dissertation.
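The Green-Kubo method behind gknet and gkx obtains the thermal conductivity from the time integral of the heat-flux autocorrelation function, κ = V/(k_B T²) ∫₀^∞ ⟨J(0)·J(t)⟩ dt. The following toy numpy sketch (in arbitrary units, with made-up function names; not code from those projects) shows the basic post-processing step once a heat-flux time series has been sampled:

```python
import numpy as np

def autocorrelation(flux):
    """Time-averaged autocorrelation <J(0) J(t)> of a scalar series."""
    n = len(flux)
    return np.array([np.mean(flux[: n - t] * flux[t:]) for t in range(n)])

def green_kubo_kappa(flux, dt, volume, temperature, k_b=1.0):
    """Thermal conductivity via the Green-Kubo relation (toy units).

    kappa = V / (k_B T^2) * integral of <J(0) J(t)> dt, approximated
    by a trapezoidal sum over the sampled autocorrelation.
    """
    acf = autocorrelation(flux)
    integral = dt * (acf[0] / 2 + acf[1:-1].sum() + acf[-1] / 2)
    return volume / (k_b * temperature**2) * integral
```

In practice the hard part is not this integral but obtaining a well-defined heat flux for semilocal and message-passing potentials in the first place, which is what the glp and gknet papers address.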
I've also written some tutorials for the NOMAD Analytics Toolkit:
- cmlkit: Toolkit for Machine Learning in Computational Condensed Matter Physics and Quantum Chemistry, which covers the basics of the cmlkit framework, in a hopefully beginner-friendly way.
- krr4mat: Kernel Ridge Regression for Materials Property Prediction: A Tutorial Introduction, which gives a short, and very pragmatic, introduction to kernel ridge regression.
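Kernel ridge regression, the method the krr4mat tutorial introduces, fits a function as a kernel-weighted sum over training points by solving one regularised linear system. A minimal numpy sketch (toy 1D data, Gaussian kernel; function names are illustrative, not from krr4mat):

```python
import numpy as np

def gaussian_kernel(a, b, length_scale=1.0):
    """Gaussian (RBF) kernel matrix between two sets of 1D samples."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-d2 / (2 * length_scale**2))

def krr_fit(x_train, y_train, length_scale=1.0, reg=1e-6):
    """Solve (K + reg * I) alpha = y for the regression weights."""
    k = gaussian_kernel(x_train, x_train, length_scale)
    return np.linalg.solve(k + reg * np.eye(len(x_train)), y_train)

def krr_predict(x_test, x_train, alpha, length_scale=1.0):
    """Predict as a kernel-weighted sum over the training points."""
    return gaussian_kernel(x_test, x_train, length_scale) @ alpha
```

The same structure carries over to materials: the 1D inputs become representations of atomic structures, and the kernel compares those representations instead of raw coordinates.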
During my PhD, I also started the (now defunct) Fritz Sessions, an intermittent series of lectures at the Fritz Haber Institute, covering topics related to the future.