Hamartia Antidote

Google DeepMind's AI Dreamed Up 380,000 New Materials. The Next Challenge Is Making Them
Google DeepMind researchers say they've expanded the number of known stable materials tenfold. Some could be useful for everything from batteries to superconductors, if they make it out of the lab.
The robotic line cooks were deep in their recipe, toiling away in a room tightly packed with equipment. In one corner, an articulated arm selected and mixed ingredients, while another slid back and forth on a fixed track, working the ovens. A third was on plating duty, carefully shaking the contents of a crucible onto a dish. Gerbrand Ceder, a materials scientist at Lawrence Berkeley National Lab and UC Berkeley, nodded approvingly as a robotic arm delicately pinched and capped an empty plastic vial, an especially tricky task, and one of his favorites to observe. "These guys can work all night," Ceder said, giving two of his grad students a wry look.
Stocked with ingredients like nickel oxide and lithium carbonate, the facility, called the A-Lab, is designed to make new and interesting materials, especially ones that might be useful for future battery designs. The results can be unpredictable. Even a human scientist usually gets a new recipe wrong the first time. So sometimes the robots produce a beautiful powder. Other times it's a melted gluey mess, or it all evaporates and there's nothing left. "At that point, the humans would have to make a decision: What do I do now?" Ceder says.
The robots are meant to do the same. They analyze what they've made, adjust the recipe, and try again. And again. And again. "You give them some recipes in the morning and when you come back home you might have a nice new soufflé," says materials scientist Kristin Persson, Ceder's close collaborator at LBNL (and also spouse). Or you might just return to a burned-up mess. "But at least tomorrow they'll make a much better soufflé."
Recently, the range of dishes available to Ceder's robots has grown exponentially, thanks to an AI program developed by Google DeepMind. Called GNoME, the software was trained using data from the Materials Project, a free-to-use database of 150,000 known materials overseen by Persson. Using that information, the AI system came up with designs for 2.2 million new crystals, of which 380,000 were predicted to be stable (not likely to decompose or explode, and thus the most plausible candidates for synthesis in a lab), expanding the range of known stable materials nearly 10-fold. In a paper published today in Nature, the authors write that the next solid-state electrolyte, or solar cell materials, or high-temperature superconductor, could hide within this expanded database.
Finding those needles in the haystack starts off with actually making them, which is all the more reason to work quickly and through the night. In a recent set of experiments at LBNL, also published today in Nature, Ceder's autonomous lab was able to create 41 of the theorized materials over 17 days, helping to validate both the AI model and the lab's robotic techniques.
When deciding if a material can actually be made, whether by human hands or robot arms, among the first questions to ask is whether it is stable. Generally, that means that its atoms are arranged into the lowest possible energy state. Otherwise, the crystal will want to become something else. For thousands of years, people have steadily added to the roster of stable materials, initially by observing those found in nature or discovering them through basic chemical intuition or accidents. More recently, candidates have been designed with computers.
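In computational materials science, that "lowest possible energy state" test is commonly formalized as the energy above the convex hull: a compound counts as stable if no mixture of competing phases can reach a lower energy at its composition. Here is a toy, one-dimensional sketch for a hypothetical binary A-B system, with made-up numbers and a hand-rolled hull check, not the actual machinery used by the Materials Project or GNoME:

```python
# Toy stability test: a compound is "stable" when it sits on the lower convex
# hull of formation energy versus composition, so no tie-line between two
# competing phases undercuts it. Illustrative only.

def energy_above_hull(phases, target):
    """phases: (composition x, formation energy) pairs, endpoints included.
    Returns the target's energy minus the hull energy at its composition;
    a value <= 0 means the candidate lies on the hull, i.e. is stable."""
    x_t, e_t = target
    others = [p for p in phases if p != target]
    best = float("inf")
    for i, (x1, e1) in enumerate(others):
        if x1 == x_t:  # a competing phase at exactly the same composition
            best = min(best, e1)
        for x2, e2 in others[i + 1:]:
            lo, hi = sorted((x1, x2))
            if lo <= x_t <= hi and hi > lo:
                f = (x_t - lo) / (hi - lo)
                e_lo, e_hi = (e1, e2) if x1 < x2 else (e2, e1)
                best = min(best, (1 - f) * e_lo + f * e_hi)  # tie-line energy
    return e_t - best

# Pure A (x=0) and pure B (x=1) define zero formation energy. The phase at
# x=0.25 is undercut by mixing A with the deeper phase at x=0.5, so it is
# unstable and would tend to decompose into that mixture.
phases = [(0.0, 0.0), (0.5, -1.0), (1.0, 0.0), (0.25, -0.3)]
print(round(energy_above_hull(phases, (0.25, -0.3)), 3))  # 0.2 -> unstable
```

The 380,000 "stable" GNoME predictions are, in this language, structures computed to sit on or very near the hull.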
The problem, according to Persson, is bias: Over time, that collective knowledge has come to favor certain familiar structures and elements. Materials scientists call this the "Edison effect," referring to his rapid trial-and-error quest to deliver a lightbulb filament, testing thousands of types of carbon before arriving at a variety derived from bamboo. It took another decade for a Hungarian group to come up with tungsten. "He was limited by his knowledge," Persson says. "He was biased, he was convinced."
DeepMind's approach is meant to look beyond those biases. The team started with 69,000 materials from Persson's library, which is free to use and funded by the US Department of Energy. That was a good start, because the database contains the detailed energetic information needed to understand why some materials are stable and others aren't. But it wasn't enough data to overcome what Google DeepMind researcher Ekin Dogus Cubuk calls a "philosophical contradiction" between machine learning and empirical science. Like Edison, AI struggles to generate truly novel ideas beyond what it has seen before. "In physics, you never want to learn a thing that you already know," he says. "You almost always want to generalize out of domain," whether that's to discover a different class of battery material or a new superconductivity theory.
GNoME relies on an approach called active learning. First, an AI called a graph neural network, or GNN, uses the database to learn patterns in the stable structures and figure out how to minimize the energy in the atomic bonds within new structures. Using the whole range of the periodic table, it then produces thousands of potentially stable candidates. The next step is to verify and adjust them, using a quantum mechanics technique called density-functional theory, or DFT. These refined results are then plugged back into the training data and the process is repeated.
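The loop described above can be condensed into a few lines. In this sketch the "GNN," the candidate generator, and the DFT step are all stand-in stubs invented for illustration; the real GNoME pipeline is vastly more involved, but the generate-filter-verify-retrain cycle has this shape:

```python
# A minimal sketch of an active-learning loop: train a cheap surrogate model,
# use it to shortlist promising candidates, verify the shortlist with an
# expensive "ground truth" calculation, and fold the results back in.
import random

random.seed(0)

def train_surrogate(dataset):
    """Stand-in 'GNN': scores a structure as the mean training energy
    plus a deterministic structure-dependent offset."""
    mean = sum(e for _, e in dataset) / len(dataset)
    return lambda s: mean + (hash(s) % 100) / 1000.0

def dft_verify(structure):
    """Stand-in for a density-functional theory calculation (the slow,
    accurate step that validates and corrects the surrogate's guesses)."""
    return (hash(structure) % 200) / 100.0 - 1.0

dataset = [(f"known_{i}", random.uniform(-1.0, 0.0)) for i in range(10)]
for generation in range(3):
    model = train_surrogate(dataset)                          # learn from current data
    candidates = [f"g{generation}_c{i}" for i in range(100)]  # propose new structures
    shortlist = sorted(candidates, key=model)[:10]            # keep predicted-lowest-energy ones
    dataset += [(s, dft_verify(s)) for s in shortlist]        # verify, fold back into training set
print(len(dataset))  # 40: each of 3 generations adds 10 verified structures
```

Each pass enlarges the training set with verified examples, which is why repeated iterations let the model reach structures more complex than anything in the original data.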

The structures of 12 compounds in the Materials Project database. Illustration: Jenny Nuss/Berkeley Lab
The researchers found that, with multiple repetitions, this approach could generate more complex structures than were initially in the Materials Project data set, including some that were composed of five or six unique elements. (The data set used to train the AI largely capped out at four.) Those types of materials involve so many complex atomic interactions that they generally escape human intuition. "They were hard to find," Cubuk says. "But now they're not so hard to find anymore."
But DFT is only a theoretical validation. The next step is actually making something. So Ceder's team picked 58 crystals to create in the A-Lab. Aside from accounting for the lab's capabilities and the available precursors, the selection was random. And at first, as expected, the robots failed, then repeatedly adjusted their recipes. After 17 days of experiments, the A-Lab managed to produce 41 of the materials, or 71 percent, sometimes after trying a dozen different recipes.
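The reported hit rate checks out as simple arithmetic:

```python
# Back-of-envelope check of the A-Lab numbers reported above.
made, attempted = 41, 58
print(f"{made / attempted:.0%}")  # 71%
```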
Taylor Sparks, a materials scientist at the University of Utah who wasn't involved in the research, says that it's promising to see automation at work for new types of materials synthesis. But using AI to propose thousands of new hypothetical materials, and then chasing after them with automation, just isn't practical, he adds. GNNs are becoming widely used to develop new ideas for materials, but usually researchers want to tailor their efforts to produce materials with useful properties, not blindly produce hundreds of thousands of them. "We've already had way too many things that we've wanted to investigate than we physically could," he says. "I think the challenge is, is this scaled synthesis approaching the scale of the predictions? Not even close."