SIAM Student Chapter Seminar

Fall 2022

Virtual talks: https://uwmadison.zoom.us/j/99844791267?pwd=eUFwM25Hc2Roc1kvSzR3N2tVVlpLQT09

Date (1 PM unless otherwise noted) | Location | Speaker | Title
9/23 | Virtual and 911 Van Vleck | Thomas Anderson (University of Michigan) | A few words on potential theory in modern applied math
9/30 (11 AM) | Virtual and 911 Van Vleck | Jeff Hammond (Principal Engineer at NVIDIA) | Industry talk
10/7 | Virtual and 911 Van Vleck | Jie Wang (Georgia Institute of Technology) | Sinkhorn Distributionally Robust Optimization
10/14 | Virtual and 911 Van Vleck | Matt Reuter (Stony Brook University, https://you.stonybrook.edu/reutergroup/) | Becoming a Ghost Buster
10/19 (Wednesday at 4 PM) | Virtual and 911 Van Vleck | Ying Li |
10/28 | 911 Van Vleck | Yinling Zhang (UW-Madison) |
11/4 | 911 Van Vleck | Haley Colgate (UW-Madison) |
11/11 | 911 Van Vleck | Zinan Wang (UW-Madison) |
11/18 | 911 Van Vleck | Parvathi Kooloth (UW-Madison) |
11/25 | NO TALK (Thanksgiving week) | |
12/2 | Virtual and 911 Van Vleck | Jenny Yeon (Applied Scientist at Amazon) |



Abstracts

9/23 Thomas Anderson: I'll talk a bit about potential theory as it is used today in the solution of linear PDEs via boundary integral equations and the boundary element method. These are not only a numerical approach: I'll also say a few words about how they can be used to do analysis on problems. Then I may say a few things about volumetric potential theory: the problems there that I've been thinking about, and the application studies (in mixing, for example) that they enable. Finally, I'll be happy to talk a bit about my experience so far in academia.
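
For context (a minimal textbook example, not material from the talk itself): the boundary-integral formulations referred to above arise, for instance, in the interior Dirichlet problem for Laplace's equation, Δu = 0 in Ω with u = f on the boundary Γ. Representing the solution as a double-layer potential,

\[
u(x) = \int_{\Gamma} \frac{\partial G(x,y)}{\partial n_y}\, \sigma(y)\, ds(y), \qquad x \in \Omega, \qquad G(x,y) = -\frac{1}{2\pi}\log|x - y|,
\]

and letting x approach Γ, the jump relations for the double-layer potential turn the boundary condition into a second-kind integral equation posed on the boundary alone,

\[
-\frac{1}{2}\,\sigma(x) + \int_{\Gamma} \frac{\partial G(x,y)}{\partial n_y}\, \sigma(y)\, ds(y) = f(x), \qquad x \in \Gamma,
\]

which the boundary element method then discretizes, reducing a volumetric PDE to a problem one dimension lower.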

9/30 Jeff Hammond: Jeff Hammond is a principal engineer with NVIDIA based in Helsinki, Finland, where his focus is developing better ways to write software for numerical algorithms. From 2014 to 2021, Jeff worked for Intel in Portland, Oregon; he started in the research organization and moved to the data center business group. Prior to that he worked for Argonne National Laboratory, first as a postdoc and then as a scientist in the supercomputing facility. Jeff was a graduate student at the University of Chicago and focused on developing open-source chemistry simulation software with Karol Kowalski at Pacific Northwest National Laboratory.  He majored in chemistry and mathematics at the University of Washington.  Details can be found on Jeff's home page: https://jeffhammond.github.io/.

10/7 Jie Wang: We study distributionally robust optimization (DRO) with the Sinkhorn distance, a variant of the Wasserstein distance based on entropic regularization. We derive convex programming dual reformulations when the nominal distribution is an empirical distribution and a general distribution, respectively. Compared with Wasserstein DRO, the approach is computationally tractable for a larger class of loss functions, and its worst-case distribution is more reasonable. We propose an efficient stochastic mirror descent algorithm to solve the dual reformulation with provable convergence guarantees. Finally, we provide various numerical examples, using both synthetic and real data, to demonstrate its competitive performance and low computational cost.
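
For context (standard definitions under one common convention; details such as the reference measure may differ from the speaker's): the Sinkhorn distance adds an entropic penalty, relative to the product of the marginals, to the optimal transport cost,

\[
W_{\varepsilon}(\mu,\nu) = \inf_{\gamma \in \Pi(\mu,\nu)} \; \mathbb{E}_{(x,y)\sim\gamma}\big[c(x,y)\big] + \varepsilon\, H\big(\gamma \,\|\, \mu \otimes \nu\big),
\]

and Sinkhorn DRO hedges over a ball of radius ρ in this distance around the nominal distribution \widehat{\mathbb{P}}:

\[
\min_{\theta} \;\; \sup_{\mathbb{P}\,:\; W_{\varepsilon}(\mathbb{P},\,\widehat{\mathbb{P}}) \le \rho} \;\; \mathbb{E}_{\xi \sim \mathbb{P}}\big[\ell(\theta,\xi)\big].
\]

The dual reformulation mentioned in the abstract collapses the inner supremum to a problem in a scalar dual variable, which is what the stochastic mirror descent algorithm targets.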

10/14 Matt Reuter: As children, most of us didn't know what we wanted to be "when we grew up," and, when asked, might have said "an astronaut" or "a firefighter." I wanted to be a Ghost Buster and, pragmatically, wound up in computational chemistry and applied mathematics. In this talk, I'll discuss the winding path of my career from school to the national laboratory system to tenure-track faculty to teaching-line faculty. Along the way I'll discuss my work exorcising (1) numerical ghosts from nanoscience research and (2) psychological ghosts from students when teaching mathematics.

Past Semesters

Spring 2022