
Researchers Aim to Solve the Unsolvable to Predict the Unseeable

November 18, 2019
Professors Christine Isborn and Harish Bhat, and not Schrödinger’s cat.

A pair of UC Merced researchers are combining computational chemistry and machine learning principles to solve what seems to be an intractable problem at the heart of quantum mechanics: predicting the movement of electrons, also known as electron dynamics.

Over the next three years with a $1.2 million grant from the federal Department of Energy, School of Natural Sciences professors Christine Isborn and Harish Bhat hope to generate new models to describe electronic motion in molecules.

If they succeed, their work will help others accurately predict how molecules will respond to electromagnetic fields, which will greatly advance the design and development of materials for photovoltaics and photocatalysts that provide renewable energy, as well as energy storage technologies such as renewable batteries.

“This emerging combination of machine learning and chemistry is at the forefront of science and is gaining a lot of momentum in different areas of chemistry,” Isborn said.

"We want to understand the memory for this simple system, and then figure out how to scale it up to molecules with more than two electrons... This is high-risk, high-reward."

Professor Christine Isborn

Electron dynamics are governed by Schrödinger’s wave equation. You’ve heard of Schrödinger’s Cat, the one in the box that, until observed, exists in a ‘superposition state’: theoretically, it is both alive and dead at the same time. Superposition is part of the weirdness of quantum mechanics that comes from using a wave equation, and the math for solving that equation becomes intractable for systems with many electrons (or many cats).

When the Schrödinger’s Cat picture, the wave function that accounts for superposition, is mapped onto a simpler problem, the potential that determines the electron motion is forced to have a memory of the past. Much like the “muscle memory” humans rely on to complete everyday tasks, such as breathing or picking up keys, without consciously directing our bodies, these potentials carry a memory that guides the electrons along the proper paths.
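The contrast between a potential with and without memory can be sketched in a few lines. This is purely illustrative, with a made-up exponential-decay kernel standing in for history dependence; it is not the researchers’ actual formulation.

```python
import numpy as np

def potential_no_memory(density_history):
    """A memory-less potential depends only on the density right now."""
    return density_history[-1]

def potential_with_memory(density_history, decay=0.5):
    """A potential with memory also weighs the density's past values,
    here through a decaying kernel so the recent past counts more."""
    t = np.arange(len(density_history))
    kernel = np.exp(-decay * (t[-1] - t))
    return np.sum(kernel * density_history) / np.sum(kernel)

# Density at three time steps (illustrative numbers only).
history = np.array([0.1, 0.9, 0.2])

print(potential_no_memory(history))    # 0.2 — the past is ignored
print(potential_with_memory(history))  # differs: the past contributes
```

The second potential responds differently to two histories that end in the same present-day density, which is exactly the behavior current approximations throw away.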

Current approximations completely ignore this memory, which means simulations are not accurate enough to really understand molecules and materials, Isborn said. The electronic potentials that Bhat and Isborn are trying to discover with machine learning methods will have memory. If successful, this will fix fundamental problems in time-dependent density functional theory, which powers dynamic simulations of many-electron systems.

Isborn and Bhat want to exactly solve Schrödinger’s wave equation with two electrons, capturing all the weirdness of quantum mechanics, then allow data science to ‘learn’ about the memory of the electronic potential. To do this, the researchers will apply supervised learning methods, which learn by example. Feed such a method enough input-output pairs and it will start to understand how to transform the inputs into the outputs.
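Supervised learning of this kind can be shown with a minimal sketch, in plain NumPy and purely for illustration (the hidden rule and the linear model here are made up; the researchers’ actual models are far more sophisticated): fit a function to example input-output pairs, then apply it to inputs it has never seen.

```python
import numpy as np

# Illustrative input-output pairs: the outputs follow a hidden rule
# (y = 3x - 1) that the learner does not know in advance.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
y = 3.0 * x - 1.0

# Supervised learning: fit a model (here, a line) to the example
# pairs via least squares, learning to transform inputs into outputs.
A = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

# The fitted model recovers the hidden rule from examples alone.
print(round(slope, 3), round(intercept, 3))  # 3.0 -1.0
```

The same learn-by-example principle scales up: instead of a line, the model can be a deep network, and instead of numbers, the input-output pairs can be simulated electron densities and potentials.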

“We want to understand the memory for this simple system, and then figure out how to scale it up to molecules with more than two electrons,” Isborn said. “Our first try at predicting electron dynamics might not work, but there are many techniques to try and any insight could be very important for the field. This is high-risk, high-reward.”

Isborn is a computational chemist in the Department of Chemistry, and Bhat is a data scientist in the Department of Applied Mathematics. Together with a collaborator at Stanford University and several graduate students and postdoctoral researchers, they will use data from hundreds of thousands of simulations and the deep-learning power of recurrent neural networks (RNNs) to more accurately predict electron movement in complex molecular systems.

An RNN is an artificial neural network that uses memory to process information; it is the kind of model behind facial- and handwriting-recognition software. RNNs can learn the memory of a system to predict what it will do next.
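The role memory plays in an RNN can be sketched in a few lines. This is a hypothetical, untrained toy network with hand-picked weights (a real RNN learns its weights from data): each step’s output depends not only on the current input but on a hidden state carried forward from every previous step.

```python
import numpy as np

def rnn_step(x, h, w_x=0.5, w_h=0.8):
    """One recurrent step: the new hidden state mixes the current
    input x with the previous hidden state h (the network's memory)."""
    return np.tanh(w_x * x + w_h * h)

# Feed the same final input (0.2) after two different histories.
h = 0.0
for x in [1.0, 1.0, 0.2]:
    h = rnn_step(x, h)
h_after_history_a = h

h = 0.0
for x in [-1.0, -1.0, 0.2]:
    h = rnn_step(x, h)
h_after_history_b = h

# The responses differ: the hidden state remembers past inputs.
print(h_after_history_a != h_after_history_b)  # True
```

That history dependence is why RNNs are a natural fit for learning potentials whose behavior depends on the system’s past.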

Bhat and Isborn will use these and other machine learning methods to advance the prediction of electron dynamics.

“We’ll be able to simulate lots of Schrödinger’s cats in lots of different situations,” Bhat said. “Then we can use that to build something predictive.”