Algebra in Statistics and Computation Seminar
When: 1:30-2:40pm, dates TBD (1:30-1:40 Social Chit-Chat; 1:40-2:20 Lead Presenter; 2:20+ Written Summary)
Where: Virtual
Contact: Jose Israel Rodriguez.
Remark: This informal seminar is held sporadically.
Summer 2023
Seminar abstract: Algebraic statistics studies statistical models through the lens of algebra, geometry, and combinatorics. From model selection to inference, this interdisciplinary field has seen applications across a wide range of statistical procedures and has motivated many questions in algebraic geometry. This seminar series covers new developments in algebraic statistics, making connections to the workshops of the Fall 2023 IMSI program (https://www.imsi.institute/activities/algebraic-statistics-and-our-changing-world/).
Structure: Each seminar has a lead participant who presents for 30-40 minutes to give the background, context, and motivation for a recent result by a speaker at the IMSI Alg Stat Program. By the day before the seminar, the other participants are expected to (1) read as much of the reference as possible and (2) send the lead participant questions, comments, or concerns about it. Toward the end of the meeting, everyone works on an Overleaf document to summarize results and record open questions.
Date (June-Aug) | Lead Participant | Reference(s) Link | Authors | IMSI Workshop
---|---|---|---|---
June 22* | Meet and Greet | | |
June 29* | Ikenna Nometa | | |
July 6* | Joy Zhang | | | Ecological and Biological Systems
July 20 | Max Hill | | |
Tentative dates* | | | |
Spring 2021 Schedule
Date | Speaker | Title
---|---|---
February 11 | Elina Robeva (UBC) | Hidden Variables in Linear Causal Models
March 11 | Carlos Amendola Ceron (ULM) | Likelihood Geometry of Correlation Models
April 8 | Eliana Duarte (OVGU) | Log-linear Models with Rational Maximum Likelihood Estimator in Dimension Two
Abstracts
April 8: Eliana Duarte
Title: Log-linear Models with Rational Maximum Likelihood Estimator in Dimension Two
References: https://arxiv.org/abs/1903.06110
Abstract: A discrete statistical model has a rational maximum likelihood estimator (MLE) whenever the MLE is a rational function of the data. When a model has a rational MLE, this rational function has a very specific form known as the Horn uniformization. I will explain how to obtain the Horn matrix for a wide class of discrete models known as staged trees, and we will characterize the Horn matrices of log-linear models in dimension two. This builds on connections between algebraic statistics and geometric modeling studied by Clarke and Cox. This talk is based on previous work with Bernd Sturmfels and Orlando Marigliano (https://arxiv.org/abs/1903.06110) and ongoing work with Isobel Davies, Irem Portakal, and Stefana Miruna Sorea.
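For intuition on what a rational MLE means (an illustrative aside, not material from the talk): for the independence model of two binary random variables, the maximum likelihood estimator is a rational function of the table of counts. A minimal sympy sketch, assuming the standard parameterization p11 = st, p12 = s(1-t), p21 = (1-s)t, p22 = (1-s)(1-t):

```python
# Toy illustration: the 2x2 independence model has a rational MLE.
# (Standard textbook example; the Horn matrix machinery of the talk is not reproduced here.)
import sympy as sp

u11, u12, u21, u22, s, t = sp.symbols('u11 u12 u21 u22 s t', positive=True)

# Log-likelihood of the independence model p11 = s*t, p12 = s*(1-t), etc.
loglik = (u11*sp.log(s*t) + u12*sp.log(s*(1 - t))
          + u21*sp.log((1 - s)*t) + u22*sp.log((1 - s)*(1 - t)))

# Solve the likelihood equations.
sol = sp.solve([sp.diff(loglik, s), sp.diff(loglik, t)], [s, t], dict=True)[0]

# The MLE of each cell probability is a rational function of the counts.
p11_hat = sp.simplify(sol[s]*sol[t])
print(sol)       # s = (u11 + u12)/(u11 + u12 + u21 + u22), t = (u11 + u21)/(...)
print(p11_hat)   # (u11 + u12)*(u11 + u21)/(u11 + u12 + u21 + u22)**2
```

The Horn uniformization discussed in the talk packages exactly this kind of rational formula into a matrix of exponents; see the reference above for the general construction.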
March 11: Carlos Amendola Ceron
Title: Likelihood Geometry of Correlation Models.
References: https://arxiv.org/pdf/2012.03903.pdf
Abstract: Correlation matrices are standardized covariance matrices. They form an affine space of symmetric matrices defined by setting the diagonal entries to one. In this talk we consider the fascinating geometry of maximum likelihood estimation for this model and for linear submodels that encode additional symmetries. We also consider the problem of minimizing two closely related functions of the covariance matrix, Stein's loss and the symmetrized Stein's loss, which lead naturally to the algebraic statistical concepts of the dual ML degree and the SSL degree. I will also present exciting open problems in this direction.
This is joint work with Piotr Zwiernik.
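For readers unfamiliar with the loss functions mentioned above, here is a small numpy sketch using one common convention for Stein's loss and its symmetrized version (normalizations vary in the literature, and the example matrices below are made up purely for illustration):

```python
# Hedged sketch (one common convention):
#   Stein's loss:             L(A, B) = tr(A B^{-1}) - log det(A B^{-1}) - p
#   symmetrized Stein's loss: L(A, B) + L(B, A) = tr(A B^{-1}) + tr(B A^{-1}) - 2p
import numpy as np

def stein_loss(A, B):
    """Stein's loss of A relative to B (both symmetric positive definite, p x p)."""
    p = A.shape[0]
    M = A @ np.linalg.inv(B)
    return np.trace(M) - np.linalg.slogdet(M)[1] - p

def symmetrized_stein_loss(A, B):
    """Sum of Stein's loss in both directions; the log-det terms cancel."""
    p = A.shape[0]
    return np.trace(A @ np.linalg.inv(B)) + np.trace(B @ np.linalg.inv(A)) - 2 * p

# Made-up example: a covariance matrix and the correlation matrix obtained
# by rescaling it to unit diagonal.
S = np.array([[2.0, 0.6], [0.6, 1.0]])
D = np.diag(1 / np.sqrt(np.diag(S)))
R = D @ S @ D
print(stein_loss(S, R), symmetrized_stein_loss(S, R))
```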
February 11: Elina Robeva
Title: Hidden Variables in Linear Causal Models.
References: https://arxiv.org/abs/1807.07561, https://arxiv.org/abs/2001.10426, and https://arxiv.org/abs/2010.05306
Abstract: Identifying causal relationships between random variables from observational data is an important and hard problem in many areas of data science. The presence of hidden variables, though quite realistic, poses a variety of further problems. Linear structural equation models, which express each variable as a linear combination of all of its parent variables, have long been used for learning causal structure from observational data. Surprisingly, when the variables in a linear structural equation model are non-Gaussian, the full causal structure can be learned without interventions, while in the Gaussian case one can only learn the underlying graph up to a Markov equivalence class. In this talk, we first discuss how one can use higher-order cumulant information to learn the structure of a linear non-Gaussian structural equation model with hidden variables. While prior work posits that each hidden variable is the common cause of two observed variables, we allow each hidden variable to be the common cause of multiple observed variables. Next, we discuss hidden-variable Gaussian causal models and the difficulties that arise in learning them. We show that it is hard to even describe the Markov equivalence classes in this case, and we give a semialgebraic description of a large class of these models.
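To give a flavor of why non-Gaussianity helps (a toy aside, not the hidden-variable method of the talk): in a two-variable linear structural equation model with skewed errors, the third cross-cumulant of the regression residual with the square of the regressor vanishes in the causal direction but generically not in the reverse one. A minimal numpy simulation, with the coefficient 0.8 and the exponential noise chosen purely for illustration:

```python
# Toy demonstration that higher-order cumulants reveal causal direction in a
# linear, non-Gaussian structural equation model X1 -> X2 (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Skewed (non-Gaussian) independent errors, centered.
e1 = rng.exponential(1.0, n) - 1.0
e2 = rng.exponential(1.0, n) - 1.0

# Structural equations: X1 = e1,  X2 = 0.8*X1 + e2.
x1 = e1
x2 = 0.8 * x1 + e2

def residual_cumulant(y, x):
    """Third cross-cumulant E[r * x^2] of the OLS residual r of y on x (data centered)."""
    x = x - x.mean()
    y = y - y.mean()
    r = y - (np.dot(x, y) / np.dot(x, x)) * x   # OLS residual
    return np.mean(r * x**2)

print("causal direction  (X2 on X1):", residual_cumulant(x2, x1))  # close to 0
print("reverse direction (X1 on X2):", residual_cumulant(x1, x2))  # clearly nonzero
```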
Other events to note
Date | Event/Title | Location/Speaker | Info |
---|---|---|---|
Fourth Thursday of the month | Applied Algebra Seminar, UW Madison | In person | |
Second Tuesday of the month, 10am | SIAM SAGA | Virtual: Recordings | Registration needed once. |
Biweekly Mondays | Algebraic Statistics Online Seminar (ASOS) | Virtual: Recordings | Mailing list sign-up for Zoom links |
Fall 2020 | ASC Seminar | Virtual | |
More events | are listed here |