SIAM Student Chapter Seminar
- When: Fridays at 1 PM unless noted otherwise
- Where: 9th floor lounge (virtual talks will also be broadcast in the 9th floor lounge, with refreshments)
- Organizers: Evan Sorensen, Jordan Radke, Peiyi Chen, and Yahui Qu
- Faculty advisers: Jean-Luc Thiffeault, Steve Wright
- To join the SIAM Chapter mailing list: email siam-chapter+join@g-groups.wisc.edu.
- Zoom link: https://uwmadison.zoom.us/j/99844791267?pwd=eUFwM25Hc2Roc1kvSzR3N2tVVlpLQT09
- Passcode: 641156
Fall 2022
Date (1 PM unless otherwise noted) | Location | Speaker | Title |
---|---|---|---|
9/23 | Virtual and 911 Van Vleck | Thomas Anderson (University of Michigan) | A few words on potential theory in modern applied math |
9/30 (11 AM) | Virtual and 911 Van Vleck | Jeff Hammond (Principal Engineer at NVIDIA) | Industry talk |
10/7 | Virtual and 911 Van Vleck | Jie Wang (Georgia Institute of Technology) | Sinkhorn Distributionally Robust Optimization |
10/14 | Virtual and 911 Van Vleck | Matt Reuter (Stony Brook University) | Becoming a Ghost Buster |
10/19 (Wednesday at 4 PM) | Virtual and 911 Van Vleck | Ying Li (Wells Fargo) | Industry talk |
10/28 | 911 Van Vleck | Yinling Zhang (UW-Madison, https://ylzhang2447.github.io/) | A Causality-Based Learning Approach for Discovering the Underlying Dynamics of Complex Systems from Partial Observations with Stochastic Parameterization |
11/4 | 911 Van Vleck | Haley Colgate (UW-Madison) | |
11/11 | 911 Van Vleck | Zinan Wang (UW-Madison) | |
11/18 | 911 Van Vleck | Parvathi Kooloth (UW-Madison) | |
11/25 | NO TALK | THANKSGIVING WEEK | |
12/2 | Virtual and 911 Van Vleck | Jenny Yeon (Applied Scientist at Amazon) | |
Abstracts
9/23 Thomas Anderson: I'll talk a bit about potential theory as it is used today in the solution, via boundary integral equations / the boundary element method, of linear PDEs. These aren't only a numerical approach: I'll also say a few words about how they can be used to do analysis on problems. Then I may say a few things about volumetric potential theory: the problems there I've been thinking about and, for example, the application studies in mixing that they enable. Finally, I'll be happy to talk a bit about my experience so far in academia.
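To make the boundary-integral idea concrete, here is a minimal sketch (not taken from the talk): the interior Laplace Dirichlet problem on the unit disk, solved with a double-layer potential and a Nystrom/trapezoid-rule discretization. On the unit circle the double-layer kernel reduces to the constant -1/(4*pi), which keeps the demo short; all names and parameters are illustrative.

```python
import numpy as np

# N quadrature nodes on the unit circle (periodic trapezoid rule).
N = 64
t = 2 * np.pi * np.arange(N) / N
y = np.column_stack([np.cos(t), np.sin(t)])   # boundary nodes
w = 2 * np.pi / N                             # trapezoid weight

# Dirichlet data: trace of the harmonic function u(x1, x2) = x1^2 - x2^2.
f = y[:, 0]**2 - y[:, 1]**2

# Interior Dirichlet BIE:  -mu/2 + D[mu] = f  (jump relation for the
# double-layer potential with G = -(1/(2*pi)) log|x - y|). On the unit
# circle the kernel dG/dn_y is the constant -1/(4*pi).
A = -0.5 * np.eye(N) - w / (4 * np.pi) * np.ones((N, N))
mu = np.linalg.solve(A, f)

# Evaluate u at an interior point via the double-layer representation;
# n_y = y on the unit circle, hence the y in the kernel numerator.
x0 = np.array([0.3, -0.2])
diff = x0 - y
kernel = np.sum(diff * y, axis=1) / np.sum(diff**2, axis=1) / (2 * np.pi)
print(w * kernel @ mu, x0[0]**2 - x0[1]**2)   # should agree closely
```

The same pattern (represent the solution by a layer potential, solve a boundary equation for the density, then evaluate anywhere) carries over to general smooth curves, where the kernel is no longer constant and higher-order quadrature is needed.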
9/30 Jeff Hammond: Jeff Hammond is a principal engineer with NVIDIA based in Helsinki, Finland, where his focus is developing better ways to write software for numerical algorithms. From 2014 to 2021, Jeff worked for Intel in Portland, Oregon; he started in the research organization and moved to the data center business group. Prior to that he worked for Argonne National Laboratory, first as a postdoc and then as a scientist in the supercomputing facility. Jeff was a graduate student at the University of Chicago and focused on developing open-source chemistry simulation software with Karol Kowalski at Pacific Northwest National Laboratory. He majored in chemistry and mathematics at the University of Washington. Details can be found on Jeff's home page: https://jeffhammond.github.io/.
10/7 Jie Wang: We study distributionally robust optimization with the Sinkhorn distance -- a variant of the Wasserstein distance based on entropic regularization. We derive convex programming dual reformulations when the nominal distribution is an empirical distribution and a general distribution, respectively. Compared with Wasserstein DRO, it is computationally tractable for a larger class of loss functions, and its worst-case distribution is more reasonable. We propose an efficient stochastic mirror descent algorithm to solve the dual reformulation with provable convergence guarantees. Finally, we provide various numerical examples using both synthetic and real data to demonstrate its competitive performance and low computational cost.
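As background, here is a minimal sketch of the Sinkhorn fixed-point iteration for entropic-regularized optimal transport between two discrete distributions, which is what the Sinkhorn distance builds on. The DRO dual reformulation and the stochastic mirror descent solver from the abstract are not reproduced here; all names below are illustrative.

```python
import numpy as np

def sinkhorn_cost(mu, nu, C, eps=0.1, n_iter=1000):
    """Entropic OT cost <P, C> for marginals mu, nu and cost matrix C."""
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iter):           # alternate marginal projections
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    P = u[:, None] * K * v[None, :]   # coupling (approximately feasible)
    return float(np.sum(P * C))

# Example: transport between two point clouds with uniform weights.
rng = np.random.default_rng(0)
x, y = rng.normal(size=(5, 2)), rng.normal(loc=1.0, size=(7, 2))
C = np.sum((x[:, None, :] - y[None, :, :])**2, axis=-1)   # squared distances
mu, nu = np.full(5, 1 / 5), np.full(7, 1 / 7)
print(sinkhorn_cost(mu, nu, C, eps=0.5))
```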
10/14 Matt Reuter: As children, most of us didn't know what we wanted to be "when we grew up," and, when asked, might have said "an astronaut" or "a firefighter." I wanted to be a Ghost Buster and, pragmatically, wound up in computational chemistry and applied mathematics. In this talk, I'll discuss the winding path of my career from school to the national laboratory system to tenure-track faculty to teaching-line faculty. Along the way I'll describe my work exorcising (1) numerical ghosts from nanoscience research and (2) psychological ghosts from students when teaching mathematics.
10/19 Ying Li: I will talk about my math background and my current role as a quantitative analytics specialist at Wells Fargo. I will give a general introduction to the different types of quantitative analytics roles in banking, along with my views on the pros and cons of quantitative analytics jobs in finance for a math student. I will also share my experience moving from academia to industry and the skill sets worth developing when looking for industry jobs.
10/28 Yinling Zhang: Discovering the underlying dynamics of complex systems from data is an important practical topic. In this paper, a new iterative learning algorithm for complex turbulent systems with partial observations is developed that alternates between identifying model structures, recovering unobserved variables, and estimating parameters. First, a causality-based learning approach is utilized for the sparse identification of model structures, which takes into account certain physics knowledge that is pre-learned from data. Next, a systematic nonlinear stochastic parameterization is built to characterize the time evolution of the unobserved variables. Furthermore, the localization of the state variable dependence and the physics constraints are incorporated into the learning procedure. Numerical experiments show that the new algorithm succeeds in identifying the model structure and providing suitable stochastic parameterizations for many complex nonlinear systems.
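As a rough illustration of the "sparse identification of model structure" step, here is a hypothetical sketch using sequentially thresholded least squares (a SINDy-style method, not the causality-based criterion of the talk) on a fully observed system; the talk's handling of partial observations and its stochastic parameterization are not shown.

```python
import numpy as np
from scipy.integrate import solve_ivp

def library(X):
    # Candidate right-hand-side terms: 1, x_i, x_i * x_j.
    n, d = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

def threshold_lstsq(Theta, dX, lam=0.5, rounds=10):
    # Fit, zero out small coefficients, refit on the surviving terms.
    Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(rounds):
        Xi[np.abs(Xi) < lam] = 0.0
        for k in range(dX.shape[1]):
            keep = np.abs(Xi[:, k]) >= lam
            if keep.any():
                Xi[keep, k] = np.linalg.lstsq(Theta[:, keep], dX[:, k],
                                              rcond=None)[0]
    return Xi

# Demo data: the Lorenz-63 system (a standard test problem, not from the talk).
rhs = lambda t, s: [10 * (s[1] - s[0]),
                    s[0] * (28 - s[2]) - s[1],
                    s[0] * s[1] - (8 / 3) * s[2]]
sol = solve_ivp(rhs, (0, 25), [1.0, 1.0, 1.0],
                t_eval=np.linspace(0, 25, 25001))
X = sol.y.T
dX = np.gradient(X, sol.t, axis=0)       # finite-difference derivatives
print(threshold_lstsq(library(X), dX))   # nonzeros ~ Lorenz coefficients
```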