Graduate student reading seminar

(... in probability)


[https://groups.google.com/a/g-groups.wisc.edu/forum/#!forum/grad_prob_seminar Email list]


== 2024 Spring ==




We read the textbook "Denumerable Markov Chains" by Wolfgang Woess.


== 2023 Fall ==
 
 
We read the textbook "Introduction to Malliavin calculus" by  David Nualart and Eulalia Nualart.
 
== 2023 Spring ==
We read parts of the textbook "Statistical Mechanics of Lattice Systems: a Concrete Mathematical Introduction" by Friedli and Velenik (https://www.unige.ch/math/folks/velenik/smbook/).
 
Feb 7: Benedek
 
Feb 14 Jiaming
 
Feb 21 Erik
 
Feb 28 David
 
Mar 7 Jiahong
 
Mar 21 Jiahong
 
Mar 28 Zhengwei
 
April 4 Yahui
 
April 11 Yahui
 
April 18 Zhengwei
 
April 25 Evan
 
May 2 Evan
 
== 2022 Fall ==
We read the following lecture notes of Hugo Duminil-Copin on percolation: https://www.unige.ch/~duminil/publi/2017percolation.pdf
 
== 2022 Spring ==
We read the first couple of chapters of the lecture notes “Gaussian free field, Liouville quantum gravity and Gaussian multiplicative chaos” by Nathanaël Berestycki and Ellen Powell (https://www.ams.org/open-math-notes/omn-view-listing?listingId=111291).
 
== 2021 Fall ==
We discussed the lecture notes "Lectures on integrable probability" by Borodin and Gorin (https://arxiv.org/abs/1212.3351)
 
==2020 Fall==
 
The graduate probability seminar will be on Zoom this semester. Please sign up for the email list if you would like to receive notifications about the talks.
 
==2020 Spring==
 
Tuesday 2:30pm,  901 Van Vleck
 
2/4, 2/11: Edwin
 
2/18, 2/25: Chaojie
 
3/3, 3/10: Yu Sun
 
3/24, 3/31: Tony
 
4/7, 4/14: Tung
 
4/21, 4/28: Tung
 
==2019 Fall==
 
Tuesday 2:30pm,  901 Van Vleck
 
9/24, 10/1: Xiao
 
10/8, 10/15: Jakwang
 
10/22, 10/29: Evan
 
11/5, 11/12: Chaojie
 
12/3, 12/10: Tung
 
==2019 Spring==
 
Tuesday 2:30pm,  901 Van Vleck
 
2/5: Timo
 
2/12, 2/19: Evan
 
2/26, 3/5: Chaojie
 
3/12, 3/26: Kurt
 
4/2, 4/9: Yu
 
4/16, 4/23: Max
 
4/30, 5/7: Xiao
 
==2018 Fall==
 
Tuesday 2:30pm,  901 Van Vleck
 
 
The topic this semester is large deviation theory. Send me (BV) an email if you want access to the shared Box folder with some reading material.
 
 
9/25, 10/2: Dae Han
 
10/9, 10/16: Kurt
 
10/23, 10/30: Jane Davis
 
11/6, 11/13: Brandon Legried
 
11/20, 11/27: Shuqi Yu
 
12/4, 12/11: Yun Li
 
==2018 Spring==
 
Tuesday 2:30pm,  B135 Van Vleck
 
 
Preliminary schedule:
 
2/20, 2/27: Yun
 
3/6, 3/13: Greg
 
3/20, 4/3: Yu
 
4/10, 4/17: Shuqi
 
4/24, 5/1: Tony
 
==2017 Fall==
 
Tuesday 2:30pm,  214 Ingraham Hall
 
 
Preliminary schedule:
 
9/26, 10/3: Hans
 
10/10, 10/17: Guo
 
10/24, 10/31: Chaoji
 
11/7, 11/14: Yun
 
11/21, 11/28: Kurt
 
12/5, 12/12: Christian
 
 
 
 
==2017 Spring==
 
Tuesday 2:25pm, B211
 
1/31, 2/7: Fan
 
I will talk about the Hanson-Wright inequality, which is a large deviation estimate for random variables of the form X^* A X, where X is a random vector with independent subgaussian entries and A is an arbitrary deterministic matrix. In the first talk, I will present a beautiful proof given by Mark Rudelson and Roman Vershynin. In the second talk, I will discuss some applications of this inequality.
 
Reference: M. Rudelson and R. Vershynin, Hanson-Wright inequality and sub-gaussian concentration, Electron. Commun. Probab. Volume 18 (2013).
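For reference, one standard formulation of the inequality (as in the Rudelson–Vershynin paper above, for X with independent, mean-zero entries whose subgaussian norms are bounded by K; c > 0 is an absolute constant, and the two matrix norms are the Hilbert–Schmidt and operator norms):

:<math>\mathbb{P}\bigl(|X^* A X - \mathbb{E}\,X^* A X| > t\bigr) \;\le\; 2\exp\!\left[-c\,\min\!\left(\frac{t^2}{K^4\|A\|_{HS}^2},\ \frac{t}{K^2\|A\|}\right)\right].</math>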
 
3/7, 3/14: Jinsu
 
Title: Donsker's theorem and its applications.
Donsker's theorem roughly says that a normalized random walk with linear interpolation on the time interval [0,1] converges weakly in C([0,1]) to Brownian motion. It is sometimes called Donsker's invariance principle or the functional central limit theorem. I will present the main ideas of the proof in the first talk and show a couple of applications in the second talk.
 
Reference: https://www.math.utah.edu/~davar/ps-pdf-files/donsker.pdf
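For reference, the statement under discussion in one standard form: let ξ_1, ξ_2, ... be i.i.d. with mean zero and variance one, let S_k = ξ_1 + ... + ξ_k, and define the rescaled, linearly interpolated walk

:<math>W_n(t) \;=\; \frac{S_{\lfloor nt\rfloor} + (nt-\lfloor nt\rfloor)\,\xi_{\lfloor nt\rfloor+1}}{\sqrt{n}}, \qquad t\in[0,1].</math>

Then W_n converges weakly in C([0,1]) to standard Brownian motion.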
 
==2016 Fall==
 
9/27 Daniele
 
Stochastic reaction networks.
 
Stochastic reaction networks are continuous time Markov chain models used primarily in biochemistry. I will define them, prove some results that connect them to related deterministic models and introduce some open questions.
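One common way to write such a model (the random time-change representation; this is a standard formulation and not necessarily the one followed in the talk) expresses the state X(t), the vector of molecule counts, as

:<math>X(t) \;=\; X(0) + \sum_{k} Y_k\!\left(\int_0^t \lambda_k(X(s))\,ds\right)\zeta_k,</math>

where the Y_k are independent unit-rate Poisson processes, the λ_k are the reaction intensities (for example, mass-action kinetics), and the ζ_k are the net changes in molecule counts caused by the reactions.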
 
10/4 Jessica
 
10/11, 10/18: Dae Han
 
10/25, 11/1: Jinsu
 
Coupling of Markov processes.
 
Given two distributions, we can construct a pair of random variables on a common probability space whose marginals are the two given distributions.
This pairing can be used to estimate the total variation distance between the two distributions; this idea is called the coupling method.
I am going to introduce basic concepts, ideas, and applications of coupling for Markov processes.
 
References:
 
http://pages.uoregon.edu/dlevin/MARKOV/markovmixing.pdf
 
http://websites.math.leidenuniv.nl/probability/lecturenotes/CouplingLectures.pdf
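For reference, the basic inequality behind the method: if (X, Y) is any coupling of two distributions μ and ν (a pair of random variables on a common probability space with marginals μ and ν), then

:<math>\|\mu-\nu\|_{TV} \;\le\; \mathbb{P}(X\neq Y),</math>

and there is always an optimal (maximal) coupling for which this holds with equality.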
 
11/8, 11/15: Hans
 
11/22, 11/29: Keith
 
Surprisingly Determinantal: DPPs and some asymptotics of ASEP

I'll be reading and presenting some recent papers of Alexei Borodin and a few collaborators, which have uncovered certain equivalences between determinantal point processes and non-determinantal processes.
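For context, a determinantal point process is one whose k-point correlation functions are given by determinants of a single kernel K:

:<math>\rho_k(x_1,\dots,x_k) \;=\; \det\bigl[K(x_i,x_j)\bigr]_{i,j=1}^{k}.</math>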
 
 
==2016 Spring==
 
Tuesday, 2:25pm, B321 Van Vleck
 
 
3/29, 4/5: Fan Yang
 
I will talk about the ergodic decomposition theorem (EDT). More specifically, given a compact metric space X and a continuous transformation T on it, the theorem shows that any T-invariant measure on X can be decomposed into a convex combination of ergodic measures. In the first talk I introduced the EDT and some related facts. In the second talk, I will talk about the conditional measures, and prove that the ergodic measures in EDT are indeed the conditional measures.
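For reference, one way to state the decomposition: writing E_T(X) for the set of ergodic T-invariant measures, every T-invariant Borel probability measure μ can be written as

:<math>\mu(A) \;=\; \int_{E_T(X)} \nu(A)\, d\pi(\nu) \qquad \text{for all Borel sets } A,</math>

for some probability measure π on E_T(X).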
 
 
2/16: Jinsu
 
Lyapunov functions for Markov processes.
 
For ODEs, we can show stability of trajectories using Lyapunov functions.
 
There is an analogue for Markov processes. I'd like to talk about the existence of a stationary distribution via Lyapunov functions.
 
In some cases, it is also possible to show the rate of convergence to the stationary distribution.
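A typical criterion of this type is a Foster–Lyapunov drift condition (stated loosely here as an illustration; the exact assumptions depend on the setting): if L is the generator of the process and there exist a function V ≥ 0, a suitable "small" set C, and constants b, c > 0 with

:<math>LV(x) \;\le\; -c + b\,\mathbf{1}_C(x) \qquad \text{for all } x,</math>

then, under appropriate irreducibility assumptions, a stationary distribution exists; stronger drift conditions such as LV ≤ -cV + b 1_C additionally give exponential rates of convergence.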
 
==2015 Fall==
 
This semester we will focus on tools and methods.
 
[https://www.math.wisc.edu/wiki/images/a/ac/Reading_seminar_2015.pdf Seminar notes] ([https://www.dropbox.com/s/f4km7pevwfb1vbm/Reading%20seminar%202015.tex?dl=1 tex file], [https://www.dropbox.com/s/lg7kcgyf3nsukbx/Reading_seminar_2015.bib?dl=1 bib file])
 
9/15, 9/22: Elnur
 
I will talk about large deviation theory and its applications. For the first talk, my plan is to introduce the Gärtner-Ellis theorem and show a few applications of it to finite-state, discrete-time Markov chains.
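For reference, the rough shape of the theorem (see Dembo and Zeitouni for the precise hypotheses): suppose the limiting logarithmic moment generating function

:<math>\Lambda(\lambda) \;=\; \lim_{n\to\infty}\frac{1}{n}\log \mathbb{E}\,e^{\,n\langle\lambda, Z_n\rangle}</math>

exists and is sufficiently regular (in particular, essentially smooth). Then the sequence (Z_n) satisfies a large deviation principle with rate function given by the Legendre transform

:<math>\Lambda^*(x) \;=\; \sup_{\lambda}\bigl(\langle\lambda,x\rangle - \Lambda(\lambda)\bigr).</math>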
 
9/29, 10/6, 10/13: Dae Han
 
10/20, 10/27, 11/3: Jessica
 
I will first present an overview of concentration of measure and concentration inequalities with a focus on the connection with related topics in analysis and geometry. Then, I will present Log-Sobolev inequalities and their connection to concentration of measure.
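For reference, one form of the implication connecting the two topics (via the Herbst argument; the constant depends on the normalization chosen for the log-Sobolev inequality): if a measure μ satisfies a logarithmic Sobolev inequality with constant C, then every 1-Lipschitz function f satisfies the sub-Gaussian concentration bound

:<math>\mu\!\left(f \ge {\textstyle\int} f\,d\mu + t\right) \;\le\; e^{-t^2/(2C)}, \qquad t>0.</math>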
 
11/10, 11/17: Hao Kai
 
11/24, 12/1, 12/8, 12/15: Chris
 
 
 
 
 
 
2016 Spring:
 
2/2, 2/9: Louis
 
 
2/16, 2/23: Jinsu
 
3/1, 3/8: Hans
 
==2015 Spring==
 
 
2/3, 2/10: Scott
 
An Introduction to Entropy for Random Variables
 
In these lectures I will introduce entropy for random variables and present some simple, finite state-space examples to gain some intuition. We will prove the Shannon–McMillan theorem using entropy and the law of large numbers. Then I will introduce relative entropy and prove the Markov chain convergence theorem. Finally I will define entropy for a discrete-time process. The lecture notes can be found at http://www.math.wisc.edu/~shottovy/EntropyLecture.pdf.
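For reference, the two basic quantities appearing in these talks, for a discrete random variable X with distribution p and a second distribution q on the same space:

:<math>H(X) \;=\; -\sum_x p(x)\log p(x), \qquad D(p\,\|\,q) \;=\; \sum_x p(x)\log\frac{p(x)}{q(x)}.</math>

The Shannon–McMillan theorem then asserts that, for a stationary ergodic source, -(1/n) log p(X_1, ..., X_n) converges to the entropy rate.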
 
2/17, 2/24: Dae Han
 
3/3, 3/10: Hans
 
3/17, 3/24: In Gun
 
4/7, 4/14: Jinsu
 
4/21, 4/28: Chris N.
 
 
 
 
 
 
==2014 Fall==
 
9/23: Dave
 
I will go over Mike Giles’ 2008 paper “Multi-level Monte Carlo path simulation.”  This paper introduced a new Monte Carlo method for approximating expectations of SDEs (driven by Brownian motions) that is significantly more efficient than the previous state of the art. This work opened up a whole new field in the numerical analysis of stochastic processes, as the basic idea is quite flexible and has found a variety of applications, including SDEs driven by Brownian motions, Levy-driven SDEs, SPDEs, and models from biology.
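For reference, the identity that drives the multilevel method: writing P_ℓ for the payoff computed from the level-ℓ discretization (time step h_ℓ), one has the telescoping sum

:<math>\mathbb{E}[P_L] \;=\; \mathbb{E}[P_0] + \sum_{\ell=1}^{L}\mathbb{E}[P_\ell - P_{\ell-1}],</math>

and each term is estimated by an independent Monte Carlo average. Since the variance of the corrections P_ℓ - P_{ℓ-1} decreases as the discretization is refined, most of the samples can be taken on the cheap coarse levels, which is the source of the efficiency gain.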
 
9/30: Benedek
 
A very quick introduction to Stein's method.
 
I will give a brief introduction to Stein's method, mostly based on the first couple of sections of the following survey article:
 
Ross, N. (2011). Fundamentals of Stein’s method. Probability Surveys, 8, 210-293.
 
The following webpage has a huge collection of resources if you want to go deeper: https://sites.google.com/site/yvikswan/about-stein-s-method
 
 
Note that the Midwest Probability Colloquium  (http://www.math.northwestern.edu/mwp/) will have a tutorial program on Stein's method this year.
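For reference, the starting point of the method in the normal case, as presented in the first sections of the survey above: a random variable Z is standard normal if and only if E[f'(Z) - Z f(Z)] = 0 for all sufficiently smooth f. For a test function h one solves the Stein equation

:<math>f_h'(w) - w\,f_h(w) \;=\; h(w) - \mathbb{E}\,h(Z),</math>

so that bounding E[f_h'(W) - W f_h(W)] for a given random variable W bounds |E h(W) - E h(Z)|, i.e. the distance from W to the normal distribution in the metric generated by the test functions h.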
 
10/7, 10/14: Chris J.
[http://www.math.wisc.edu/~janjigia/research/MartingaleProblemNotes.pdf An introduction to the (local) martingale problem.]
 
 
10/21, 10/28: Dae Han
 
11/4, 11/11: Elnur
 
11/18, 11/25: Chris N. Free probability with an emphasis on C* and von Neumann algebras
 
12/2, 12/9: Yun Zhai
 
==2014 Spring==
 
 
1/28: Greg
 
2/04, 2/11: Scott
 
[http://www.math.wisc.edu/~shottovy/BLT.pdf Reflected Brownian motion, Occupation time, and applications.]
 
2/18: Phil-- Examples of structure results in probability theory.
 
2/25, 3/4: Beth-- Derivative estimation for discrete time Markov chains
 
3/11, 3/25: Chris J [http://www.math.wisc.edu/~janjigia/research/stationarytalk.pdf Some classical results on stationary distributions of Markov processes]
 
4/1, 4/8: Chris N 
 
4/15, 4/22: Yu Sun
 
4/29, 5/6: Diane
 
==2013 Fall==


9/24, 10/1: Chris
A light introduction to metastability

10/8: Dae Han
Majorizing multiplicative cascades for directed polymers in random media

10/15, 10/22: no reading seminar

10/29, 11/5: Elnur
Limit fluctuations of last passage times

11/12: Yun
Helffer-Sjöstrand representation and Brascamp-Lieb inequality for stochastic interface models

11/19, 11/26: Yu Sun

12/3, 12/10: Jason

==2013 Spring==

2/13: Elnur

Young diagrams, RSK correspondence, corner growth models, distribution of last passage times.

2/20: Elnur

2/27: Chris

A brief introduction to enlargement of filtration and the Dufresne identity (notes)

3/6: Chris

3/13: Dae Han

An introduction to random polymers

3/20: Dae Han

Directed polymers in a random environment: path localization and strong disorder

4/3: Diane

Scale and speed for honest 1-dimensional diffusions

References:

Rogers & Williams - Diffusions, Markov Processes and Martingales

Ito & McKean - Diffusion Processes and their Sample Paths

Breiman - Probability

http://www.statslab.cam.ac.uk/~beresty/Articles/diffusions.pdf

4/10: Diane

4/17: Yun

Introduction to stochastic interface models

4/24: Yun

Dynamics and Gaussian equilibrium systems

5/1: This reading seminar will be shifted because of a probability seminar.

5/8: Greg, Maso

The Bethe ansatz vs. the replica trick. This lecture is an overview of the two approaches. See [1] for a nice overview.

5/15: Greg, Maso

Rigorous use of the replica trick.
