TY  - JOUR
SP  - 1048
VL  - 3
JF  - Digital Discovery
A1  - Anselmi, Marco
A1  - Slabaugh, Greg
A1  - Crespo-Otero, Rachel
A1  - Di Tommaso, Devis
PB  - Royal Society of Chemistry
Y1  - 2024/05/01/
UR  - http://dx.doi.org/10.1039/d4dd00014e
ID  - discovery10196206
N2  - Graph Neural Networks (GNNs) have revolutionized material property prediction by learning directly from the structural information of molecules and materials. However, conventional GNN models rely solely on local atomic interactions, such as bond lengths and angles, neglecting the long-range electrostatic forces that affect certain properties. To address this, we introduce the Molecular Graph Transformer (MGT), a novel GNN architecture that combines global attention mechanisms with message passing on both bond graphs and their line graphs, explicitly capturing long-range interactions. Benchmarking on the MatBench and Quantum MOF (QMOF) datasets demonstrates that MGT's improved treatment of electrostatic interactions significantly enhances the prediction accuracy of properties such as exfoliation energy and refractive index, while maintaining state-of-the-art performance on all other properties. This advance paves the way for the development of highly accurate and efficient materials design tools across diverse applications.
N1  - This article is licensed under a Creative Commons Attribution 3.0 Unported Licence.
KW  - Science & Technology
KW  - Physical Sciences
KW  - Technology
KW  - Chemistry, Multidisciplinary
KW  - Computer Science, Interdisciplinary Applications
KW  - Chemistry
KW  - Computer Science
KW  - PREDICTION
KW  - FRAMEWORK
AV  - public
TI  - Molecular graph transformer: stepping beyond ALIGNN into long-range interactions
SN  - 2635-098X
EP  - 1057
IS  - 5
ER  -