The development of sustainable energy depends on our ability to design valuable molecules and materials. Combining theoretical and experimental studies is a particularly effective strategy in this respect. Using computational chemistry and materials science, researchers can assemble pieces of the puzzle that experiments alone cannot provide. Conversely, theory could not make accurate predictions without input from the powerful experimental techniques available today.

To facilitate this two-way exchange, Volker Blum, with the Center for Hybrid Organic-Inorganic Semiconductors for Energy (CHOISE) Energy Frontier Research Center (EFRC), and Yosuke Kanai, with the Alliance for Molecular Photoelectrode Design for Solar Fuels (AMPED) EFRC, organized a two-day training course. The workshop, funded by several agencies, was held at Duke University in September.

Blum, Kanai, and their colleagues introduced theoretical principles and practical computations to early career scientists.

“Computational modeling is a powerful tool to understand experimental phenomena and to rationally accelerate scientific discovery of useful materials,” said Blum. At CHOISE, the computational approach shines new light on organic-inorganic crystals that show promise for solar energy technologies.

“To advance sciences broadly, we need to have better and closer communication between theorists and experimentalists about what the computation is capable of and what it is not good at,” said Kanai. Theoretical chemistry at AMPED is a key part of the effort to study the dynamics of charge carriers, providing insight into the performance of solar fuel systems.

Working with experimental collaborators in EFRCs and involving them in the theoretical modeling is as important as improving the computational methods.

**Getting inside the black box**

Theoretical simulation is done using computational software. Materials-modeling programs are not merely the black boxes many experimentalists take them for: a program will always return results, and without further scrutiny those results may look plausible even in the presence of coding errors. It is dangerous to interpret a simulation without understanding its underlying principles and limitations.

To raise awareness, AMPED and CHOISE researchers organized the workshop around getting inside the black-box simulations, with a particular focus on density functional theory (DFT).

DFT was designed to simplify solving the famous Schrödinger equation of quantum mechanics, which can, in principle, tell scientists everything about the molecules they are interested in, including their energies, interactions, and evolution. However, finding the solution to the equation, the wavefunction, is prohibitively complex. Moreover, the wavefunction is not something scientists can measure in the lab. The problem is especially acute in materials research because of the complexity and vast number of atoms in materials.

The wavefunction is a wild mathematical beast. Taming the beast is difficult.

**Getting around the unsolvable beast**

The solution to the conundrum? Instead of directly solving the equation, researchers use the electron density, the probability of finding electrons at a given location, to represent the wavefunction while preserving the essential information about electron interactions. Because the electron distribution can be measured physically, experimental observations can be used to validate and refine theoretical predictions.
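
The idea that the density carries the electron count, and other measurable information, can be illustrated with a short numerical sketch (not from the workshop; a textbook example using the hydrogen-atom 1s orbital in atomic units):

```python
import numpy as np

# Hydrogen 1s wavefunction in atomic units: psi(r) = exp(-r) / sqrt(pi)
r = np.linspace(1e-6, 20.0, 20000)
psi = np.exp(-r) / np.sqrt(np.pi)

# The electron density |psi|^2 is the measurable quantity DFT works with.
density = psi**2

# Integrating the density over all space recovers the electron count:
# integral of n(r) * 4*pi*r^2 dr over r = number of electrons
n_electrons = np.trapz(density * 4.0 * np.pi * r**2, r)
print(round(n_electrons, 3))  # -> 1.0
```

Real DFT codes work in the opposite direction, solving for the density itself, but the principle is the same: the density, unlike the wavefunction, is a quantity one can both compute and measure.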

Thus far, DFT has gained popularity for its ability to deal accurately with systems containing hundreds or even thousands of electrons. In practical calculations, scientists use approximate functionals to describe the bonding between atoms and its associated energy. One of the workshop lecturers, Weitao Yang, from the Center for Complex Materials from First Principles (CCM) EFRC, leads one of the pioneering groups improving such approximations, which play a crucial role in DFT calculations.

**Computing infinitely repeated atoms**

After laying the foundations of the DFT basics, the lectures on day two turned to practical ways in which theorists simulate solids and condensed materials. These methods exploit the periodicity of extended systems, such as solids and liquids, to handle their effectively infinite number of atoms efficiently, and are particularly useful for studies of complex surfaces and interfaces.
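
The trick behind such simulations is periodic boundary conditions: a small cell of atoms is repeated infinitely in every direction, so an atom near one face of the cell interacts with the periodic image of an atom near the opposite face. A minimal sketch of the standard minimum-image convention (the cell size `L` here is an arbitrary, hypothetical value):

```python
import numpy as np

# Hypothetical cubic simulation cell of edge length L (arbitrary units).
L = 10.0

def minimum_image(r_ij, box=L):
    """Shortest displacement between two atoms, accounting for periodic images."""
    return r_ij - box * np.round(r_ij / box)

# Two atoms near opposite faces of the cell are actually close neighbors:
d = minimum_image(np.array([9.5, 0.2, 0.1]) - np.array([0.5, 0.0, 0.0]))
print(d)  # each component now lies within [-L/2, L/2]
```

With this convention, a cell of a few hundred atoms can stand in for an effectively infinite solid or liquid.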

The approaches can also be applied in molecular dynamics simulations, which track the motions of interacting atoms and molecules and are an important tool in bioenergy research. Yaroslava Yingling, from the Center for Lignocellulose Structure and Formation (CLSF) EFRC, spoke at the workshop and illustrated how the computationally more affordable molecular dynamics method can visualize changing intermolecular interactions.
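
At its core, a molecular dynamics simulation repeatedly computes forces on atoms and advances their positions in small time steps. A toy sketch (not the workshop's tutorial material) of the velocity Verlet integrator, a common choice in MD codes, applied to a single particle on a harmonic "bond":

```python
# Toy molecular dynamics: one particle on a harmonic "bond"
# (spring constant k = 1, mass m = 1), integrated with the
# velocity Verlet scheme widely used in MD codes.
dt, steps = 0.01, 1000
x, v = 1.0, 0.0                          # initial displacement and velocity

def force(pos):
    return -pos                          # Hooke's law: F = -k * x

for _ in range(steps):
    a = force(x)
    x += v * dt + 0.5 * a * dt**2        # update position
    v += 0.5 * (a + force(x)) * dt       # update velocity with averaged force

total_energy = 0.5 * v**2 + 0.5 * x**2   # should stay close to 0.5
print(round(total_energy, 4))
```

The same loop, with realistic force fields and millions of atoms, is what lets researchers watch intermolecular interactions evolve over time.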

At CLSF, molecular dynamics simulations reveal how cellulose fibers form in plant cell walls, knowledge that, in turn, can help scientists better develop biorenewable resources.

**Walking the talk**

Both days of the workshop ended with hands-on tutorials, giving the participants practical experience in computation. The tutorials walked participants through example sets: calculating the energy of a single atom, then of multi-atom molecules, and finally of more complex crystalline structures. The goal was not to teach the participants how to write code but to provide a flavor of how such computations work.

But setting up such an event wasn’t easy. It was a team effort from the Blum and Kanai research groups that made it possible. “One needs a motivated group and group members who can assist in computer setup and tutoring. Without them, the tutorials would not be possible and, thus, neither would a successful workshop,” said Blum.

The workshop was quite popular with early career scientists. “We were surprised by receiving a much larger number of applications [than we expected]. We had to select the participants due to the limited seats available in the tutorial room,” said Kanai. “Although we wanted to accommodate all different groups, not only Ph.D. students but also post-docs and advanced undergrads, this might have been a little too ambitious.”

The event was full of engaging conversations between participants and lecturers. In response to one question about the accuracy of simulations, the lecturers advised running at least two independent simulation programs, or codes, to confirm that the results are consistent. This might seem trivial, but it is important if the results are to be trusted as reliable.

For those who are inspired and starting to appreciate computation, Kanai suggested, “Start by reading books and then by running some calculations.” Blum recommended, “Look out for experienced people and ask them for advice. The learning process in computation is similar to learning to run an experiment: start with something simple and accessible and then execute more difficult and complex tasks.”

**Looking ahead**

The workshop gave early career scientists a glimpse into the interplay between theory and experiment. The knowledge acquired there will enable these young scientists to navigate their research with the boost of computational modeling. To the hosts, holding workshops is an important part of outreach, as theoretical efforts are often underrepresented. Overall, interdisciplinary collaboration and communication are key to scientific breakthroughs and better energy solutions across the EFRCs.