Research Paper · Researchia:202604.03012 [Artificial Intelligence > AI]

Crystalite: A Lightweight Transformer for Efficient Crystal Modeling

Tin Hadži Veljković

Abstract

Generative models for crystalline materials often rely on equivariant graph neural networks, which capture geometric structure well but are costly to train and slow to sample. We present Crystalite, a lightweight diffusion Transformer for crystal modeling built around two simple inductive biases. The first is Subatomic Tokenization, a compact, chemically structured atom representation that replaces high-dimensional one-hot encodings and is better suited to continuous diffusion. The second is the Geometry Enhancement Module (GEM), which injects periodic minimum-image pair geometry directly into attention through additive geometric biases. Together, these components preserve the simplicity and efficiency of a standard Transformer while making it better matched to the structure of crystalline materials. Crystalite achieves state-of-the-art results on crystal structure prediction benchmarks and strong de novo generation performance, attaining the best S.U.N. discovery score among the evaluated baselines while sampling substantially faster than geometry-heavy alternatives.
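The abstract's two geometric ingredients can be sketched concretely. Below is a minimal NumPy illustration of (a) pairwise minimum-image distances under periodic boundary conditions and (b) softmax attention with an additive per-pair geometric bias, in the spirit of GEM. This is an assumption-laden sketch, not the paper's actual architecture: the function names, the choice of bias function, and the single-head layout are all illustrative.

```python
import numpy as np

def min_image_distances(frac_coords, lattice):
    """Pairwise minimum-image distances under periodic boundary conditions.

    frac_coords: (N, 3) fractional coordinates in [0, 1)
    lattice:     (3, 3) cell matrix with lattice vectors as rows
    """
    diff = frac_coords[:, None, :] - frac_coords[None, :, :]  # (N, N, 3), fractional
    diff -= np.round(diff)                                    # wrap to nearest periodic image
    cart = diff @ lattice                                     # fractional -> Cartesian
    return np.linalg.norm(cart, axis=-1)                      # (N, N) distance matrix

def attention_with_geometric_bias(q, k, v, dists, bias_fn):
    """Single-head softmax attention with an additive geometric bias on the logits.

    bias_fn maps the (N, N) distance matrix to an (N, N) additive bias,
    e.g. lambda r: -r so that nearby atoms attend to each other more strongly.
    (Illustrative only; GEM's exact bias parameterization is not given here.)
    """
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d)
    logits = logits + bias_fn(dists)                 # inject geometry additively
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)               # row-wise softmax
    return w @ v

# Tiny usage example: two atoms near opposite corners of a 5 Å cubic cell
frac = np.array([[0.1, 0.1, 0.1],
                 [0.9, 0.9, 0.9]])
lattice = 5.0 * np.eye(3)
D = min_image_distances(frac, lattice)  # off-diagonal distance is sqrt(3) Å, not ~6.9 Å

rng = np.random.default_rng(0)
q = rng.normal(size=(2, 8)); k = rng.normal(size=(2, 8)); v = rng.normal(size=(2, 8))
out = attention_with_geometric_bias(q, k, v, D, bias_fn=lambda r: -r)
```

The key point the sketch makes is that geometry enters as a cheap additive term on the attention logits, leaving the rest of the Transformer unchanged; the minimum-image wrap (`diff -= np.round(diff)`) is what makes the bias respect the crystal's periodicity.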


Source: arXiv:2604.02270v1 - http://arxiv.org/abs/2604.02270v1
PDF: https://arxiv.org/pdf/2604.02270v1

Submission: 4/3/2026
Subjects: AI; Artificial Intelligence

