

Poster

Crystalformer: Infinitely Connected Attention for Periodic Structure Encoding

Tatsunori Taniai · Ryo Igarashi · Yuta Suzuki · Naoya Chiba · Kotaro Saito · Yoshitaka Ushiku · Kanta Ono

Halle B
Thu 9 May 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

Predicting physical properties of materials from their crystal structures is a fundamental problem in materials science. In peripheral areas such as the prediction of molecular properties, fully connected attention networks have been shown to be successful. However, unlike these finite atom arrangements, crystal structures are infinitely repeating, periodic arrangements of atoms, for which fully connected attention results in infinitely connected attention. In this work, we show that this infinitely connected attention can lead to a computationally tractable and physically interpretable formulation. We then propose a simple yet effective transformer-based encoder architecture for crystal structures called Crystalformer. Compared with an existing transformer-based model, the proposed model requires only 38% of the number of parameters per attention block. Despite its architectural simplicity, the proposed method outperforms state-of-the-art methods for various property regression tasks on the Materials Project and JARVIS-DFT datasets.
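The abstract's central idea is attention between each unit-cell atom and all periodic images of every other atom, made tractable by folding the lattice sum into the attention scores. The following minimal sketch illustrates one way this can look in practice, assuming a Gaussian distance decay over atom-image distances and a truncated sum over lattice translations; the function name `periodic_attention`, the `sigma` and `n_max` parameters, and the specific decay form are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch (not the authors' reference implementation): attention over
# a crystal's unit-cell atoms where each key/value atom j implicitly stands for
# all of its periodic images j(n) = p_j + n1*a1 + n2*a2 + n3*a3. The infinite
# lattice sum is folded into a pairwise bias b_ij = log sum_n exp(phi_ij(n)),
# here with an assumed Gaussian distance decay phi and a truncated translation sum.
import numpy as np

def periodic_attention(x, pos, lattice, Wq, Wk, Wv, sigma=1.0, n_max=2):
    """x: (N, d) atom features, pos: (N, 3) Cartesian positions,
    lattice: (3, 3) matrix whose rows are the lattice vectors a1, a2, a3."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                        # (N, N) content term

    # Enumerate integer translations n in [-n_max, n_max]^3 (truncated lattice sum).
    rng = np.arange(-n_max, n_max + 1)
    shifts = np.array(np.meshgrid(rng, rng, rng)).reshape(3, -1).T @ lattice  # (M, 3)

    # phi_ij(n) = -||p_j + t_n - p_i||^2 / (2 sigma^2); bias = logsumexp over n.
    diff = pos[None, :, None, :] + shifts[None, None, :, :] - pos[:, None, None, :]
    phi = -np.sum(diff ** 2, axis=-1) / (2.0 * sigma ** 2)  # (N, N, M)
    m = phi.max(axis=-1, keepdims=True)
    bias = m.squeeze(-1) + np.log(np.exp(phi - m).sum(axis=-1))  # (N, N)

    # Standard softmax attention on the combined (content + periodic bias) score.
    logits = scores + bias
    logits -= logits.max(axis=-1, keepdims=True)
    alpha = np.exp(logits)
    alpha /= alpha.sum(axis=-1, keepdims=True)
    return alpha @ v

# Tiny usage example: random features in a 3 Angstrom cubic cell.
if __name__ == "__main__":
    gen = np.random.default_rng(0)
    N, d = 4, 8
    x = gen.normal(size=(N, d))
    pos = gen.uniform(0.0, 3.0, size=(N, 3))
    lattice = 3.0 * np.eye(3)
    Wq, Wk, Wv = (0.1 * gen.normal(size=(d, d)) for _ in range(3))
    out = periodic_attention(x, pos, lattice, Wq, Wk, Wv)
    print(out.shape)  # (4, 8)
```

Because the translation-dependent part enters only through the pairwise bias, the result keeps the cost and shape of ordinary N-by-N attention; truncating the sum at `n_max` is a practical approximation in this sketch, justified when the assumed decay falls off quickly with distance.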
