
Mint Explainer: The implications of AI creating new materials in seconds


Using artificial intelligence models to create new materials is not new. Researchers typically take existing materials that are stable and substitute elements in their molecular structure to create new ones. Computational approaches led by the Materials Project and other groups, for instance, have helped develop 28,000 new materials till date.

But this is an expensive and time-consuming process, and researchers may find it difficult to develop radically different structures since they mostly work with existing materials. Recently, though, Google unit DeepMind Technologies’ AI tool, called Graph Networks for Materials Exploration (GNoME), predicted 2.2 million new crystals, of which 380,000 are stable materials. This is significant progress, as it could help researchers develop greener technologies such as better batteries for electric cars, photovoltaics, superconductors and more efficient computing.

GNoME is a deep-learning tool that predicts the stability of new materials, allowing researchers to discover and create materials faster and at scale. Google says its new discovery is “equivalent to nearly 800 years’ worth of knowledge”. GNoME’s predictions are accessible to scientists around the world.

A team of researchers at the Lawrence Berkeley National Laboratory, in partnership with DeepMind, has published a paper revealing how AI predictions can be leveraged for autonomous material synthesis. The lab, called A-Lab, uses machine learning and robotic arms to create new materials. How is all this scaling up? Other researchers have independently created 736 of GNoME’s new materials in their labs, according to Google.

The AI tool has identified 52,000 new layered compounds similar to graphene, which could be used instead of silicon to make superconductors. Previously, about 1,000 such materials had been identified. GNoME also found 528 potential lithium-ion conductors, 25 times more than a previous study, which could be used to improve the performance of rechargeable batteries.

Google describes GNoME as a graph neural network (GNN) model, in which the input data takes the form of a graph. The AI model was originally trained on data about crystal structures and their stability, openly available through the Materials Project. Google continuously evaluated the model’s performance using Density Functional Theory (DFT), a computational technique used in physics, chemistry and materials science to understand the structures of atoms and assess the stability of crystals.
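To make the idea concrete, here is a minimal, purely illustrative Python sketch of a graph neural network step over a “crystal graph” of atoms and bonds. This is not GNoME’s actual architecture; all features, weights and dimensions below are made up for illustration.

```python
# Illustrative sketch only: one toy message-passing step over a "crystal graph".
# Atoms are nodes, nearby atoms are edges; the model outputs a single stability
# score. This is NOT GNoME's architecture, just the general idea of a GNN
# applied to a crystal structure.
import numpy as np

rng = np.random.default_rng(0)

# Toy crystal: 4 atoms, each with an 8-dimensional feature vector
# (in practice features would encode element type, charge, local geometry, etc.).
node_features = rng.normal(size=(4, 8))

# Edges as (source, target) pairs for atoms that are close enough to interact.
edges = [(0, 1), (1, 0), (1, 2), (2, 1), (2, 3), (3, 2), (0, 3), (3, 0)]

# Randomly initialised weights for one message-passing layer and a readout head.
W_msg = rng.normal(size=(8, 8)) * 0.1
W_out = rng.normal(size=(8, 1)) * 0.1

def message_passing(h, edge_list, W):
    """Each atom sums transformed features from its neighbours."""
    out = np.zeros_like(h)
    for src, dst in edge_list:
        out[dst] += h[src] @ W
    return np.tanh(h + out)                    # residual update + nonlinearity

h = message_passing(node_features, edges, W_msg)
stability_score = float(np.mean(h @ W_out))    # pooled, graph-level prediction
print(f"predicted stability score (toy example): {stability_score:.3f}")
```

In practice, models of this kind stack many such message-passing layers and learn their weights from stability labels computed with DFT, as described above.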

It also used a training process called ‘active learning’ that boosted GNoME’s performance. GNoME would generate predictions for the structures of novel, stable crystals, which were then tested using DFT. The results were fed back into the model as additional training data.
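A rough Python sketch of that active-learning loop is below. Every function here is a simplified stand-in (candidate generation, stability prediction, DFT verification, retraining), not DeepMind’s actual pipeline; it only shows the shape of the loop the paragraph describes.

```python
# Sketch of an active-learning loop of the kind described above: the model
# proposes candidate crystals, DFT verifies the most promising ones, and the
# verified results are fed back into training. All functions are stand-ins.
import random

random.seed(0)

def generate_candidates(n):
    """Stand-in for candidate-structure generation (e.g. element substitution)."""
    return [f"candidate_{random.randint(0, 10_000)}" for _ in range(n)]

def predict_stability(model, candidates):
    """Stand-in for the GNN's stability prediction (higher = more stable)."""
    return {c: random.random() + model["bias"] for c in candidates}

def run_dft(candidate):
    """Stand-in for a DFT calculation labelling a structure stable/unstable."""
    return random.random() > 0.5

def retrain(model, labelled):
    """Stand-in for retraining; here we just nudge the model with new labels."""
    if labelled:
        model["bias"] += 0.01 * sum(labelled.values()) / len(labelled)
    return model

model = {"bias": 0.0}
training_data = {}

for round_no in range(3):                        # a few active-learning rounds
    candidates = generate_candidates(100)
    scores = predict_stability(model, candidates)
    # Send only the most promising predictions to (expensive) DFT verification.
    top = sorted(scores, key=scores.get, reverse=True)[:10]
    labelled = {c: run_dft(c) for c in top}
    training_data.update(labelled)               # feed results back as training data
    model = retrain(model, labelled)
    print(f"round {round_no}: {sum(labelled.values())}/{len(labelled)} verified stable")
```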

“Our research boosted the discovery rate of materials stability prediction from around 50% to 80% based on MatBench Discovery, an external benchmark set by previous state-of-the-art models,” Google said in a blog post. “We also managed to scale up the efficiency of our model by improving the discovery rate from under 10% to over 80%–such efficiency increases could have significant impact on how much compute is required per discovery.” That said, the AI models do have limitations.
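One way to see why that matters: if a fraction r of the candidates sent for DFT verification turn out to be stable, then roughly 1/r expensive DFT calculations are needed per discovered material. The back-of-envelope sketch below is our own illustration of that point, using no figures beyond the quoted discovery rates.

```python
# Back-of-envelope illustration (not from the source) of why a higher
# "discovery rate" (fraction of DFT-verified candidates that prove stable)
# reduces the compute needed per discovered material: each discovery costs
# roughly 1 / discovery_rate DFT verifications.
for discovery_rate in (0.10, 0.50, 0.80):
    dft_runs_per_discovery = 1.0 / discovery_rate
    print(f"discovery rate {discovery_rate:.0%}: "
          f"~{dft_runs_per_discovery:.1f} DFT runs per stable material found")
```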

Berkeley’s A-Lab, for instance, failed to make 17 of the 58 materials it targeted. Some of the materials needed to be heated to higher temperatures or ground more finely, standard laboratory steps that are outside AI’s current purview. Experts also point out that the models, like other AI systems, do not explain how they arrive at their decisions, which makes it harder for other researchers to understand the process.

So can AI solve the mystery of life? It was only in July 2022 that DeepMind Technologies released 3D-predicted structures of more than 200 million proteins found in plants, bacteria, animals and humans. Generated by DeepMind’s AI system, AlphaFold, the structures are helping researchers around the world advance their work. This discovery was critical since proteins, along with nucleic acids (DNA and RNA), lipids and glycans, are the building blocks of life, and folding allows a protein to adopt a functional shape or conformation.

If researchers can better predict how proteins fold (a challenge known as the protein-folding problem), they will be able to better understand how cells function and how misfolded proteins can cause diseases. AlphaFold uses an AI system to predict a protein’s 3D structure from its amino acid chain. AlphaFold2 was announced in 2020; in 2021, DeepMind open-sourced it and released the multi-terabyte AlphaFold Protein Structure Database (AlphaFold DB), which the company likens to a ‘Google search’ for protein structures.
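For researchers who simply want to use the predictions, AlphaFold DB exposes a public REST API. The sketch below assumes the documented endpoint https://alphafold.ebi.ac.uk/api/prediction/<UniProt accession> and the Python requests library; the accession P69905 (human haemoglobin subunit alpha) is only an example, and field names should be checked against the current API documentation.

```python
# Sketch: fetch an AlphaFold-predicted structure from the public AlphaFold DB.
# Assumes the documented endpoint /api/prediction/<UniProt accession>;
# P69905 (human haemoglobin subunit alpha) is used purely as an example.
import requests

accession = "P69905"
resp = requests.get(
    f"https://alphafold.ebi.ac.uk/api/prediction/{accession}", timeout=30
)
resp.raise_for_status()
entry = resp.json()[0]          # the API returns a list of prediction entries

# Field names as per the current API; verify against the API docs if they change.
print("UniProt ID:   ", entry.get("uniprotAccession"))
print("PDB file URL: ", entry.get("pdbUrl"))

# Download the predicted 3D structure (PDB format) for use in other tools.
pdb = requests.get(entry["pdbUrl"], timeout=30)
with open(f"AF-{accession}.pdb", "wb") as f:
    f.write(pdb.content)
```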

A year later, DeepMind, along with EMBL’s European Bioinformatics Institute (EMBL-EBI), released predicted structures for more than 200 million proteins, covering almost every catalogued protein known to science. “One day, it might even help unlock the mysteries of how life itself works,” reads a Google blog.

The importance of reaching atomic accuracy

According to Google, 1.4 million users in at least 190 countries have accessed the AlphaFold database till date. Scientists around the world have used AlphaFold’s predictions to help advance research on everything from accelerating new malaria vaccines and advancing cancer drug discovery to developing plastic-eating enzymes for tackling pollution. The Centre for Enzyme Innovation at the University of Portsmouth is using the AI system to help develop faster enzymes to recycle single-use plastics, while the University of California has used the predictions to better understand the biology of the Covid-19 (SARS-CoV-2) virus.

AlphaFold’s latest model can generate predictions for nearly all molecules in the Protein Data Bank, frequently reaching atomic accuracy, according to Google DeepMind. Levinthal’s paradox, named after molecular biologist Cyrus Levinthal, notes that even though a protein folds in seconds or even milliseconds, it would take longer than the age of the known universe (about 13.8 billion years) to calculate all possible configurations of a typical protein using brute force.
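The arithmetic behind Levinthal’s paradox is easy to reproduce. The numbers below (about three conformations per residue, a 100-residue protein and a very generous sampling rate of 10^13 conformations per second) are the usual illustrative assumptions, not figures from the article.

```python
# Illustrative Levinthal's-paradox arithmetic. Assumptions (not from the
# article): ~3 backbone conformations per residue, a 100-residue protein,
# and a very generous brute-force rate of 1e13 conformations per second.
conformations = 3 ** 100            # possible configurations to search
rate_per_second = 1e13              # brute-force sampling rate
seconds_per_year = 3.15e7

years_needed = conformations / rate_per_second / seconds_per_year
age_of_universe_years = 13.8e9

print(f"configurations to check:  ~{conformations:.2e}")
print(f"brute-force search time:  ~{years_needed:.2e} years")
print(f"age of the universe:      ~{age_of_universe_years:.2e} years")
```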

AI systems like AlphaFold are leveraging advances in the application of AI to dramatically improve how drugs are discovered and developed. AlphaFold can do in seconds “what used to take many months or years”, according to Eric Topol, founder and director of the Scripps Research Translational Institute.


From: livemint
URL: https://www.livemint.com/science/mint-explainer-the-implications-of-ai-creating-new-materials-in-seconds-11702352155757.html
