AI Method Radically Speeds Up Predictions of Thermal Properties of Materials

A new method can help predict a material’s thermal properties, for example by revealing the dynamics of atoms in crystals, as shown here. Credit: Massachusetts Institute of Technology

It is estimated that around 70% of the energy produced worldwide ends up as waste heat.

If scientists could better predict how heat moves through semiconductors and insulators, they could design more efficient energy-generating systems. However, the thermal properties of materials can be extremely difficult to model.

The problem comes from phonons, the quasiparticles that carry heat. Some thermal properties of a material depend on a measure called the phonon dispersion relation, which can be extremely difficult to obtain, let alone use in designing a system.

A team of researchers from MIT and other universities has taken on this challenge by rethinking the problem from the ground up. The result of their work is a new machine learning framework that can predict phonon dispersion relations up to 1,000 times faster than other AI-based techniques, with comparable or even better accuracy. Compared to more traditional, non-AI approaches, this system could be 1 million times faster.

The method could help engineers design more efficient and effective power generation systems. It could also be used to develop more efficient microelectronics, as heat management remains a major obstacle to accelerating electronics.

“Phonons are responsible for thermal loss, but obtaining their properties is notoriously difficult, both computationally and experimentally,” says Mingda Li, associate professor of nuclear science and engineering and senior author of a paper on the technique.

Li is joined on the paper by co-lead authors Ryotaro Okabe, a graduate student in chemistry; and Abhijatmedhi Chotrattanapituk, a graduate student in electrical engineering and computer science; Tommi Jaakkola, the Thomas Siebel Professor of Electrical Engineering and Computer Science at MIT; and others from MIT, Argonne National Laboratory, Harvard University, the University of South Carolina, Emory University, the University of California, Santa Barbara, and Oak Ridge National Laboratory.

The research is published in Nature Computational Science.

Predicting phonons

Heat-carrying phonons are difficult to predict because they have an extremely wide frequency range and the particles interact and move at different speeds.

The phonon dispersion relation of a material is the relationship between the energy and momentum of phonons in its crystal structure. For years, researchers have tried to predict phonon dispersion relations using machine learning, but the high-precision calculations involved are so numerous that the models get bogged down.
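For readers who want the standard notation (the article itself stays qualitative), the dispersion relation is usually written as a family of branch frequencies over the phonon wavevector. This is the textbook form, not an equation taken from the paper:

```latex
% Each branch s maps a wavevector q in the Brillouin zone to a vibrational
% frequency; phonon energy and crystal momentum follow directly. A crystal
% with n atoms per unit cell has 3n such branches, which is part of what
% makes the quantity so high-dimensional.
\omega = \omega_s(\mathbf{q}), \qquad
E_s(\mathbf{q}) = \hbar\,\omega_s(\mathbf{q}), \qquad
\mathbf{p} = \hbar\,\mathbf{q}
```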

“If you have 100 processors and a few weeks, you can probably calculate the phonon dispersion relation for a material. The whole community is really interested in finding a more efficient way to do this,” Okabe says.

The machine learning models that scientists often use for these calculations are known as graph neural networks (GNNs). A GNN converts the atomic structure of a material into a crystal graph consisting of multiple nodes, which represent atoms, connected by edges, which represent the interatomic bonding between atoms.
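As a rough illustration of that data structure, a crystal graph can be stored as a list of atom nodes plus an edge list connecting atoms that sit within a bonding cutoff. This is a hypothetical sketch, not the authors' code, and it ignores periodic boundary conditions:

```python
# Minimal crystal-graph sketch: nodes are atoms, edges connect atom pairs
# closer than a cutoff distance. Illustrative only; not the authors' code.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class CrystalGraph:
    species: list[str]                       # chemical symbol per node (atom)
    positions: np.ndarray                    # (n_atoms, 3) coordinates in angstroms
    edges: list[tuple[int, int]] = field(default_factory=list)

def build_crystal_graph(species, positions, cutoff=3.0):
    """Connect every pair of atoms within `cutoff` angstroms by an edge."""
    positions = np.asarray(positions, dtype=float)
    edges = []
    for i in range(len(species)):
        for j in range(i + 1, len(species)):
            if np.linalg.norm(positions[i] - positions[j]) < cutoff:
                edges.append((i, j))
    return CrystalGraph(list(species), positions, edges)

# Toy two-atom cell; a real crystal graph would also include edges to
# periodic images of neighboring unit cells.
graph = build_crystal_graph(["Ga", "As"], [[0.0, 0.0, 0.0], [1.4, 1.4, 1.4]])
print(len(graph.edges))  # -> 1
```

In a GNN, message passing then updates each node's feature vector using information from its neighbors along these edges.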

Although GNNs work well for computing many quantities, such as magnetization or electric polarization, they are not flexible enough to efficiently predict a very high-dimensional quantity such as the phonon dispersion relation. Since phonons can travel around atoms along the X, Y, and Z axes, their momentum space is difficult to model with a fixed graph structure.

To achieve the flexibility they needed, Li and his collaborators designed virtual nodes.

They create what they call a virtual node graph neural network (VGNN) by adding a series of flexible virtual nodes to the fixed crystal structure to represent phonons. The virtual nodes allow the output of the neural network to vary in size, so that it is not limited by the fixed crystal structure.

Virtual nodes are connected to the graph in such a way that they can only receive messages from real nodes. Although virtual nodes are updated as the model updates real nodes during computation, they do not affect the accuracy of the model.

“Our encoding method is very efficient. You just have to generate a few extra nodes in your GNN. The physical location doesn’t matter, and the real nodes don’t even know that the virtual nodes are there,” Chotrattanapituk says.
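The one-directional flow of information described above can be sketched in a few lines. The feature sizes, weights, and update rules below are made up for illustration; this is not the published VGNN implementation:

```python
# One round of message passing in which virtual nodes receive messages from
# real nodes but never send any back, so the real-node updates are unchanged
# by their presence. Illustrative sketch only; not the published VGNN code.
import numpy as np

rng = np.random.default_rng(0)

n_real, n_virtual, dim = 4, 6, 8            # e.g. 4 atoms, 6 virtual phonon nodes
real = rng.normal(size=(n_real, dim))       # real-node (atom) features
virtual = rng.normal(size=(n_virtual, dim)) # virtual-node features (phonon placeholders)

W_real = rng.normal(size=(dim, dim))        # real -> real message weights
W_virt = rng.normal(size=(dim, dim))        # real -> virtual message weights

# Real nodes exchange messages only among themselves (fully connected here).
real = np.tanh(real + (real @ W_real).mean(axis=0, keepdims=True))

# Each virtual node aggregates messages from the real nodes; nothing flows back.
virtual = np.tanh(virtual + (real @ W_virt).mean(axis=0, keepdims=True))

# A readout over the virtual nodes can then be sized to the target quantity,
# e.g. one output per phonon branch, independent of the number of atoms.
print(real.shape, virtual.shape)   # (4, 8) (6, 8)
```

Because the real-node update never reads the virtual features, adding or removing virtual nodes only changes the size of the output, which mirrors the flexibility described in the quote above.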

Eliminating complexity

Because it has virtual nodes to represent phonons, a VGNN can skip many complex calculations when estimating phonon dispersion relations, which makes the method more efficient than a standard GNN.

The researchers proposed three versions of VGNNs with increasing complexity. Each can be used to predict phonons directly from a material’s atomic coordinates.

Thanks to the flexibility of VGNNs in rapidly modeling high-dimensional quantities, the researchers can also use them to estimate phonon dispersion relations in alloy systems. These complex combinations of metals and nonmetals are particularly difficult to model with traditional approaches.

The researchers also found that VGNNs offered slightly better accuracy in predicting a material’s heat capacity. In some cases, prediction errors were two orders of magnitude lower with their technique.

A VGNN could be used to calculate phonon dispersion relations for a few thousand materials in just a few seconds with a personal computer, Li says.

This efficiency could allow scientists to explore a wider space when looking for materials with certain thermal properties, such as superior heat storage, energy conversion or superconductivity.

Moreover, the virtual node technique is not exclusive to phonons and could also be used to predict challenging optical and magnetic properties.

In the future, the researchers want to refine the technique so that the virtual nodes have greater sensitivity to capture small changes that can affect the structure of phonons.

“Researchers have gotten used to using graph nodes to represent atoms, but we can rethink that. Graph nodes can be anything. And virtual nodes are a very generic approach that you can use to predict a lot of high-dimensional quantities,” Li says.

“The authors’ innovative approach significantly augments the graph neural network description of solids by incorporating key physics-informed elements via virtual nodes, for example, informing wavevector-dependent band structures and dynamical matrices,” says Olivier Delaire, associate professor in the Thomas Lord Department of Mechanical Engineering and Materials Science at Duke University, who was not involved in this work.

“I find the level of acceleration in predicting complex phonon properties to be astonishing, several orders of magnitude faster than a state-of-the-art universal machine learning interatomic potential. Impressively, the advanced neural network captures fine features and obeys physical rules.

“There is great potential to extend the model to describe other important properties of materials: electronic, optical and magnetic spectra as well as band structures come to mind.”

More information:
Ryotaro Okabe et al., Virtual node graph neural network for full phonon prediction, Nature Computational Science (2024). DOI: 10.1038/s43588-024-00661-0

Provided by the Massachusetts Institute of Technology


This story is republished with kind permission from MIT News (web.mit.edu/newsoffice/), a popular site that covers MIT research, innovation, and education news.

Citation: AI method radically speeds up predictions of materials’ thermal properties (July 17, 2024). Retrieved July 17, 2024 from https://phys.org/news/2024-07-ai-method-radically-materials-thermal.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.