In-Person Poster presentation / poster accept
A Self-Attention Ansatz for Ab-initio Quantum Chemistry
Ingrid von Glehn · James Spencer · David Pfau
MH1-2-3-4 #110
Keywords: [ machine learning for physics ] [ quantum physics ] [ self-generative learning ] [ machine learning for molecules ] [ machine learning for chemistry ] [ Machine learning for science ] [ monte carlo ] [ transformers ] [ chemistry ] [ mcmc ] [ attention ] [ Machine Learning for Sciences ]
We present a novel neural network architecture using self-attention, the Wavefunction Transformer (PsiFormer), which can be used as an approximation (or "Ansatz") for solving the many-electron Schrödinger equation, the fundamental equation for quantum chemistry and material science. This equation can be solved from first principles, requiring no external training data. In recent years, deep neural networks like the FermiNet and PauliNet have been used to significantly improve the accuracy of these first-principles calculations, but they lack an attention-like mechanism for gating interactions between electrons. Here we show that the PsiFormer can be used as a drop-in replacement for these other neural networks, often dramatically improving the accuracy of the calculations. On larger molecules especially, the ground state energy can be improved by dozens of kcal/mol, a qualitative leap over previous methods. This demonstrates that self-attention networks can learn complex quantum mechanical correlations between electrons, and are a promising route to reaching unprecedented accuracy in chemical calculations on larger systems.
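As a rough illustration of the idea described in the abstract (not the authors' implementation), the JAX sketch below shows how self-attention over per-electron features can be combined with a determinant to give an antisymmetric wavefunction ansatz: because self-attention is permutation-equivariant, exchanging two electrons swaps two rows of the orbital matrix and flips the sign of the determinant. All names (attention_ansatz, layer widths, the single determinant, the omission of Jastrow factors, spin channels, and decaying envelopes) are simplifying assumptions for this sketch.

```python
import jax
import jax.numpy as jnp

def attention_ansatz(params, electrons):
    """Map electron positions (n, 3) to (sign, log|psi|) for a toy attention ansatz."""
    h = electrons @ params["embed"]                        # (n, d) per-electron features
    q, k, v = (h @ params[w] for w in ("wq", "wk", "wv"))  # self-attention projections
    att = jax.nn.softmax(q @ k.T / jnp.sqrt(q.shape[-1]))  # (n, n) electron-electron weights
    h = h + att @ v                                        # attention gates interactions between electrons
    orbitals = h @ params["worb"]                          # (n, n) generalized orbital matrix
    # Determinant changes sign when two electrons (rows) are exchanged,
    # so the ansatz is antisymmetric, as required for fermions.
    return jnp.linalg.slogdet(orbitals)

# Tiny example with 4 electrons and hypothetical layer widths.
n, d = 4, 16
keys = jax.random.split(jax.random.PRNGKey(0), 6)
params = {
    "embed": 0.1 * jax.random.normal(keys[0], (3, d)),
    "wq": 0.1 * jax.random.normal(keys[1], (d, d)),
    "wk": 0.1 * jax.random.normal(keys[2], (d, d)),
    "wv": 0.1 * jax.random.normal(keys[3], (d, d)),
    "worb": 0.1 * jax.random.normal(keys[4], (d, n)),
}
electrons = jax.random.normal(keys[5], (n, 3))
sign, log_abs_psi = attention_ansatz(params, electrons)
print(sign, log_abs_psi)
```

In a variational quantum Monte Carlo setting such as the one the abstract refers to, an ansatz like this would be trained from first principles by sampling electron configurations from |psi|^2 (e.g. with MCMC) and minimizing the energy expectation value, with no external training data.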