Accelerating Superconductivity Discovery through Deep Learning

Jun 25, 2025
Figure. BETE-NET architecture for predicting the Eliashberg function. First, the model converts a crystal structure into a graph with nodes representing atoms. The nodes are embedded with a one-hot-encoded vector of the constituent atoms' atomic number multiplied by the atomic mass. The edges, connecting atoms within a 4 Å radius, embed the interatomic distance. A sequence of convolutions and gated-block operations are then applied to the graph. Finally, a pooling operation is applied to the node embedding, yielding the prediction of α2F(ω). This is the base network for the crystal-structure-only variant of BETE-NET.
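The crystal-to-graph conversion described in the caption can be sketched as follows. This is a minimal, non-periodic illustration (the actual BETE-NET implementation operates on periodic crystals with equivariant convolutions); the function name, the `max_z` parameter, and the plain pairwise distance loop are assumptions for clarity, not the authors' code.

```python
import numpy as np

CUTOFF = 4.0  # edge cutoff radius in angstroms, per the figure caption


def structure_to_graph(positions, atomic_numbers, atomic_masses, max_z=100):
    """Convert a structure to a graph (non-periodic sketch).

    Node features: one-hot of the atomic number scaled by the atomic mass.
    Edges: all atom pairs closer than CUTOFF, embedded by their distance.
    """
    positions = np.asarray(positions, dtype=float)
    n = len(atomic_numbers)

    # Node embedding: one-hot(Z) multiplied by the atomic mass.
    node_feats = np.zeros((n, max_z))
    for i, (z, m) in enumerate(zip(atomic_numbers, atomic_masses)):
        node_feats[i, z - 1] = m

    # Directed edge list: pairs within the cutoff, with distance as the edge feature.
    edges, edge_feats = [], []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = np.linalg.norm(positions[i] - positions[j])
            if d < CUTOFF:
                edges.append((i, j))
                edge_feats.append(d)
    return node_feats, np.array(edges), np.array(edge_feats)
```

For example, two atoms 2 Å apart yield two directed edges of length 2.0, and each node's feature vector is nonzero only at the index of its atomic number.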

Integrating deep learning with the search for new electron-phonon superconductors represents a burgeoning field of research, where the primary challenge lies in the computational intensity of calculating the electron-phonon spectral function, α2F(ω), the essential ingredient of the Migdal-Eliashberg theory of superconductivity. To overcome this challenge, a two-step approach was adopted. First, α2F(ω) was computed for 818 dynamically stable materials. A deep-learning model was then trained to predict α2F(ω).
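Once α2F(ω) is known (whether computed or predicted), a standard route to Tc is the Allen-Dynes modified McMillan formula, which uses two moments of the spectral function: the coupling strength λ = 2∫ α2F(ω)/ω dω and the logarithmic-average phonon frequency ω_log. The sketch below illustrates this downstream step; the choice μ* = 0.10 is a commonly assumed Coulomb pseudopotential, not a value stated in this article.

```python
import numpy as np


def allen_dynes_tc(omega, a2f, mu_star=0.10):
    """Estimate Tc from the Eliashberg function via the Allen-Dynes
    (modified McMillan) formula. Tc is returned in the units of omega
    (e.g. kelvin if omega is in kelvin)."""
    omega = np.asarray(omega, dtype=float)
    a2f = np.asarray(a2f, dtype=float)

    # Electron-phonon coupling strength: lambda = 2 * integral a2F(w)/w dw
    lam = 2.0 * np.trapz(a2f / omega, omega)

    # Logarithmic-average phonon frequency
    w_log = np.exp((2.0 / lam) * np.trapz(a2f / omega * np.log(omega), omega))

    # McMillan exponent with Allen-Dynes prefactor w_log / 1.2
    exponent = -1.04 * (1.0 + lam) / (lam - mu_star * (1.0 + 0.62 * lam))
    return w_log / 1.2 * np.exp(exponent)
```

Because Tc depends exponentially on λ, small errors in the predicted α2F(ω) are amplified, which is one reason predicting the full spectral function, rather than Tc directly, is advantageous.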

Domain knowledge of the site-projected phonon density of states was incorporated into the model's node attributes to introduce an inductive bias and enhance predictions. The practical application of this model was illustrated in high-throughput screening for high-Tc materials. The model achieves an average precision nearly five times higher than random screening, highlighting the potential of machine learning (ML) in accelerating superconductor discovery. This neural net accelerates the search for high-Tc superconductors while setting a precedent for applying ML in materials discovery, particularly when data is limited.
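The screening metric quoted above, average precision, rewards a model for ranking true high-Tc materials near the top of the candidate list; for a random ranking it reduces to roughly the fraction of positives in the pool, which is the baseline the factor-of-five improvement is measured against. A minimal sketch of the metric (the function below is an assumed helper, equivalent in spirit to `sklearn.metrics.average_precision_score`):

```python
import numpy as np


def average_precision(scores, labels):
    """Average precision for ranking-based screening: sort candidates by
    predicted score (descending) and average the precision at the rank of
    each true positive."""
    order = np.argsort(scores)[::-1]
    labels = np.asarray(labels)[order]
    hits = np.cumsum(labels)                  # positives found so far
    ranks = np.arange(1, len(labels) + 1)     # candidates screened so far
    precisions = hits / ranks
    return precisions[labels == 1].mean()
```

A perfect ranking that places every true superconductor first scores 1.0; a random ordering scores about the positive fraction, so a pool with 10% positives has a random baseline near 0.1.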

Authors

Peter Hirschfeld and Richard Hennig (University of Florida)

Additional Materials

Designing Materials to Revolutionize and Engineer our Future (DMREF)