# Colloquia


## Revision as of 10:34, 1 February 2021

**The UW Madison Mathematics Colloquium is ONLINE on Fridays at 4:00 pm.**

# Spring 2021

## January 27, 2021 **[Wed 4-5pm]**, Morgane Austern (Microsoft Research)

(Hosted by Roch)

**Asymptotics of learning on dependent and structured random objects**

Classical statistical inference relies on numerous tools from probability theory to study the properties of estimators. However, these same tools are often inadequate for modern machine learning problems, which frequently involve structured data (e.g., networks) or complicated dependence structures (e.g., dependent random matrices). In this talk, we extend universal limit theorems beyond the classical setting.

Firstly, we consider distributionally “structured” and dependent random objects, i.e., random objects whose distributions are invariant under the action of an amenable group. We show, under mild moment and mixing conditions, a series of universal second- and third-order limit theorems: central limit theorems, concentration inequalities, the Wigner semicircular law, and Berry-Esseen bounds. The utility of these will be illustrated by a series of examples in machine learning, network theory, and information theory. Secondly, building on these results, we establish the asymptotic distribution of the cross-validated risk when the number of folds is allowed to grow at an arbitrary rate. Using this, we study the statistical speed-up of cross-validation compared to a train-test split procedure, which reveals surprising results even for simple estimators.
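The K-fold cross-validated risk at the center of the second result can be written down concretely. Below is a minimal sketch; the function and parameter names are illustrative, not taken from the talk, and the talk's asymptotic analysis concerns a far more general setting:

```python
import numpy as np

def cv_risk(X, y, fit, loss, n_folds):
    """K-fold cross-validated risk: average held-out loss over n_folds splits."""
    n = len(y)
    idx = np.arange(n)
    folds = np.array_split(idx, n_folds)
    risks = []
    for f in folds:
        train = np.setdiff1d(idx, f)        # all indices outside the held-out fold
        model = fit(X[train], y[train])     # fit on the training portion
        risks.append(np.mean(loss(model, X[f], y[f])))  # evaluate on the fold
    return np.mean(risks)
```

The talk studies what happens to the distribution of this quantity when `n_folds` grows with the sample size, rather than staying fixed.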

## January 29, 2021, Isaac Harris (Purdue)

(Hosted by Smith)

**Direct Sampling Algorithms for Inverse Scattering**

In this talk, we will discuss a recent qualitative imaging method for inverse scattering referred to as the Direct Sampling Method. This method recovers a scattering object by evaluating an imaging functional given by the inner product of the far-field data with a known test function. It can be shown that the imaging functional is strictly positive inside the scatterer and decays as the sampling point moves away from the scatterer. The analysis uses the factorization of the far-field operator and the Funk-Hecke formula. The method can also be shown to be stable with respect to perturbations in the scattering data. We will discuss the inverse scattering problem for both acoustic and electromagnetic waves. This is joint work with A. Kleefeld and D.-L. Nguyen.
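As a rough illustration of the kind of imaging functional described above, here is a 2D sketch that pairs far-field data with a plane-wave test function; the exact functional, test function, and normalization used in the talk may differ:

```python
import numpy as np

def imaging_functional(u_far, directions, z, k):
    """Direct-sampling-style indicator at a sampling point z (2D sketch).

    Pairs the measured far-field data u_far, sampled on unit observation
    directions, with the plane-wave test function exp(i k x_hat . z).
    """
    phase = np.exp(1j * k * directions @ z)
    return np.abs(np.sum(u_far * phase)) / len(directions)
```

For a point-like scatterer, this indicator peaks at the scatterer's location and decays as the sampling point moves away, mirroring the behavior of the imaging functional described in the abstract.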

## February 1, 2021 **[Mon 4-5pm]**, Nan Wu (Duke)

(Hosted by Roch)

**From Manifold Learning to Gaussian Process Regression on Manifolds**

In this talk, I will review the concepts of manifold learning and discuss a famous manifold learning algorithm, the Diffusion Map. I will present my recent research results, which theoretically justify that the Diffusion Map reveals the underlying topological structure of a dataset sampled from a manifold in a high-dimensional space. Moreover, I will show how these theoretical results apply to regression problems on manifolds and to real-life ecological problems.
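The Diffusion Map itself admits a compact description: build a Gaussian kernel on the data, normalize it into a Markov transition matrix, and embed the points using the top nontrivial eigenvectors. A minimal sketch follows; the bandwidth, normalization, and truncation choices here are illustrative, and the talk's theoretical setting is more refined:

```python
import numpy as np

def diffusion_map(X, epsilon, n_components=2, t=1):
    """Minimal Diffusion Map sketch: embed points X (n x d) into n_components dims."""
    # Pairwise squared distances and Gaussian kernel
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / epsilon)
    # Row-normalize the kernel into a Markov transition matrix
    P = K / K.sum(axis=1, keepdims=True)
    # Eigendecomposition; eigenvalues of P are real since P is similar to a symmetric matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial constant eigenvector (eigenvalue 1); scale by eigenvalues^t
    return (vals[1:n_components + 1] ** t) * vecs[:, 1:n_components + 1]
```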

## February 5, 2021, Hanbaek Lyu (UCLA)

(Hosted by Roch)

**Dictionary Learning from dependent data samples and networks**

Analyzing group behavior of systems of interacting variables is a ubiquitous problem in many fields including probability, combinatorics, and dynamical systems. This problem also naturally arises when one tries to learn essential features (dictionary atoms) from large and structured data such as networks. For instance, independently sampling some number of nodes in a sparse network hardly detects any edges between adjacent nodes. Instead, we may perform a random walk on the space of connected subgraphs, which produces more meaningful but correlated samples. Just as classical results in probability were first developed for independent variables and then gradually generalized to dependent variables, many algorithms in machine learning first developed for independent data samples now need to be extended to correlated data samples. In this talk, we discuss some new results that accomplish this, including some for online nonnegative matrix and tensor factorization for Markovian data. A unifying technique we develop for handling dependence in data samples is to condition on the distant past, rather than on the recent history. As an application, we present a new approach for learning "basis subgraphs" from network data that can be used for network denoising and edge-inference tasks. We illustrate our method on several synthetic network models, as well as on Facebook, arXiv, and protein-protein interaction networks, achieving state-of-the-art performance on such network tasks compared to several recent methods.
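To fix ideas, a classical online NMF loop on a stream of samples can be sketched as follows. This is the i.i.d.-style baseline only: the talk's contribution is extending convergence guarantees to Markovian streams, and nothing below is the speaker's actual algorithm or code:

```python
import numpy as np

def online_nmf(stream, r, n_code_iter=50, lr=0.1):
    """Online NMF sketch: learn a nonnegative dictionary W (d x r) from a
    stream of data vectors, processing one sample at a time."""
    W = None
    for t, x in enumerate(stream, start=1):
        if W is None:
            # Initialize the dictionary from the first sample's dimension
            W = np.abs(np.random.RandomState(0).randn(len(x), r))
        # Sparse-coding step: projected gradient descent for h >= 0
        h = np.zeros(r)
        L = np.linalg.norm(W.T @ W, 2) + 1e-12  # Lipschitz constant of the gradient
        for _ in range(n_code_iter):
            h = np.maximum(0.0, h - (W.T @ (W @ h - x)) / L)
        # Dictionary step: projected gradient with a decaying step size
        W = np.maximum(0.0, W - (lr / t) * np.outer(W @ h - x, h))
    return W
```

The decaying step size is what lets the dictionary average over the stream; handling a Markovian (rather than i.i.d.) stream is exactly where the talk's "condition on the distant past" technique comes in.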

## February 8, 2021 **[Mon 4-5pm]**, Mohamed Ndaoud (USC)

(Hosted by Roch)

**Scaled minimax optimality in high-dimensional linear regression: a non-convex algorithmic regularization approach**

The question of fast convergence in the classical problem of high-dimensional linear regression has been extensively studied. Arguably, one of the fastest procedures in practice is Iterative Hard Thresholding (IHT). Still, IHT relies strongly on knowledge of the true sparsity parameter s. In this paper, we present a novel fast procedure for estimation in high-dimensional linear regression. Taking advantage of the interplay between estimation, support recovery, and optimization, we achieve both optimal statistical accuracy and fast convergence. The main advantage of our procedure is that it is fully adaptive, making it more practical than state-of-the-art IHT methods. Our procedure achieves optimal statistical accuracy faster than, for instance, classical algorithms for the Lasso. Moreover, we establish sharp optimal results for both estimation and support recovery. As a consequence, we present a new iterative hard thresholding algorithm for high-dimensional linear regression that is scaled minimax optimal (it achieves the estimation error of the oracle that knows the sparsity pattern, whenever possible), fast, and adaptive.
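The baseline IHT procedure the abstract builds on, with the sparsity level s given rather than adapted to, can be sketched as follows; the step size and iteration count here are illustrative, and the talk's adaptive procedure is more involved:

```python
import numpy as np

def iht(X, y, s, step=None, n_iter=200):
    """Iterative Hard Thresholding for sparse linear regression.

    Alternates a gradient step on the least-squares loss with hard
    thresholding that keeps only the s largest-magnitude coefficients."""
    n, p = X.shape
    if step is None:
        step = 1.0 / np.linalg.norm(X, 2) ** 2  # safe step from the spectral norm
    beta = np.zeros(p)
    for _ in range(n_iter):
        beta = beta + step * X.T @ (y - X @ beta)  # gradient step
        keep = np.argsort(np.abs(beta))[-s:]       # indices of the s largest entries
        mask = np.zeros(p, dtype=bool)
        mask[keep] = True
        beta[~mask] = 0.0                          # hard-threshold the rest to zero
    return beta
```

Note how the routine takes s as an input: this is precisely the dependence on the true sparsity that the abstract's fully adaptive procedure removes.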

## February 12, 2021, Bobby Wilson (University of Washington)

(Hosted by Smith)

## February 17, 2021 **[Wed 9-10am]**, Visu Makam (IAS)

(Hosted by Roch)

**Algorithms in invariant theory, connections and applications**

For over a century, computation has played a key role in the development of invariant theory, a subject that studies symmetries captured by group actions. Over the years, major computational advances, such as the advent of the digital computer, the discovery of Gröbner basis techniques, and the development of rigorous notions of computational complexity, have served as a stimulus for invariant theory. The perspective adopted in this talk will be the contrary one: I will explain how developments in invariant theory can inform, and make progress on, fundamental problems in computational subjects such as complexity theory and statistics.

I will discuss how central problems in complexity, such as the celebrated P vs. NP problem, graph isomorphism, and identity testing, arise in the context of invariant theory, focusing on recent results in invariant theory that shed new light on identity testing. I will also outline the challenges going forward in this exciting and rapidly developing field. With regard to statistics, a surprising connection was discovered last year between stability notions in invariant theory and maximum likelihood estimation for a special class of statistical models. This connection allows for invariant-theoretic approaches to statistical questions; for example, we can give exact sample size thresholds for the widely used matrix (and tensor) normal models by utilizing results on quiver representations and castling transforms. I will also briefly point to some exciting current and future directions in this context. No special background will be assumed in this talk.

## February 19, 2021, Maurice Fabien (Brown)

(Hosted by Smith)

## February 26, 2021, Avi Wigderson (Princeton IAS)

(Hosted by Gurevich)

## March 12, 2021, Ivan Losev (Yale)

(Hosted by Gurevich)

**Modular representations of semisimple Lie algebras**

Representation theory seeks to understand the ways in which a given algebraic object (a group, an associative algebra, a Lie algebra, etc.) can be represented via linear operators on a vector space over a field. What the representations look like depends very much on the field in question and, in particular, on its characteristic. The most important questions are settled in characteristic 0, for example, when we work over the complex numbers. But in the case of positive characteristic fields, which the word "modular" refers to, even basic questions are wide open.

In my talk I will concentrate on one of the most important classes of algebraic objects, semisimple Lie algebras, and explain what we know about their irreducible (i.e., having no proper nonzero subrepresentations) modular representations. I will start with the case of sl_2, explaining the results of Rudakov and Shafarevich from 1967 describing the irreducible representations. Then I will talk about recent work on the general case, including my paper with Bezrukavnikov from 2020, where we obtain the most explicit description of irreducible representations available to date. Our primary tool is relating the modular representations of semisimple Lie algebras to the (affine) Hecke category, the most fundamental object of modern representation theory.

## March 26, 2021, []

(Hosted by )

## April 9, 2021 **[8pm]**, Weinan E (Princeton)

**Hans Schneider LAA Lecture** (Hosted by Shen)

## April 23, 2021, []

(Hosted by )