Applied/ACMS/absF19: Difference between revisions

From UW-Math Wiki
Revision as of 21:01, 3 September 2019

Leonardo Andrés Zepeda Núñez

Title: Deep Learning for Electronic Structure Computations: A Tale of Symmetries, Locality, and Physics

Abstract: Recently, the surge of interest in deep learning has dramatically improved image and signal processing, fueling breakthroughs in many domains such as drug discovery, genomics, and automatic translation. These advances have since been applied to scientific computing and, in particular, to electronic structure computations. There, the main objective is to compute the electron density directly, which encodes most of the information of the system, thus bypassing the computationally intensive solution of the Kohn-Sham equations. However, as with neural networks for image processing, the performance of these methods depends strongly on the physical and analytical intuition built into the network and on the training stage.

In this talk, I will show how to build networks that respect physical symmetries and locality, how to train them, and how these properties affect the performance of the resulting network. Finally, I will present several examples on small yet realistic chemical systems.
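The abstract does not specify the network architecture, but the general idea of building symmetry and locality into an energy model can be illustrated with a minimal sketch: describe each atom by rotation- and translation-invariant local features (here a hypothetical radial histogram of neighbor distances within a cutoff), pass each descriptor through the same small network, and sum the per-atom outputs so the total is invariant under permutations of atom indices. The descriptor, weights, and cutoff below are illustrative assumptions, not the speaker's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_descriptor(positions, i, cutoff=3.0):
    # Distances from atom i to its neighbors within the cutoff: these are
    # invariant under rotations and translations of the whole configuration,
    # and the cutoff enforces locality (only nearby atoms contribute).
    d = np.linalg.norm(positions - positions[i], axis=1)
    d = d[(d > 0) & (d < cutoff)]
    # Fixed-size descriptor: a radial histogram (an illustrative choice).
    hist, _ = np.histogram(d, bins=8, range=(0.0, cutoff))
    return hist.astype(float)

# Hypothetical shared weights for a one-hidden-layer network applied per atom.
W = rng.normal(size=(8, 16))
b = rng.normal(size=16)
w_out = rng.normal(size=16)

def energy(positions):
    # Summing identical per-atom contributions makes the total energy
    # invariant under permutations of the atom indices.
    per_atom = [np.tanh(local_descriptor(positions, i) @ W + b) @ w_out
                for i in range(len(positions))]
    return float(sum(per_atom))

pos = rng.normal(size=(5, 3))
assert np.isclose(energy(pos), energy(pos[rng.permutation(5)]))   # permutation symmetry
assert np.isclose(energy(pos), energy(pos + np.array([1.0, -2.0, 0.5])))  # translation symmetry
```

The assertions check the two built-in invariances directly: reordering the atoms or rigidly translating the configuration leaves the predicted energy unchanged by construction, rather than as a property the network must learn from data.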