BEGIN:VCALENDAR
VERSION:2.0
PRODID:Mathematics Seminar Calendar
BEGIN:VEVENT
UID:20230322T095258-31529@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:ZOOM TALK (Passcode: the smallest prime > 100): Three uses of semidefinite programming in approximation theory
DESCRIPTION:Speaker\: Simon Foucart, Texas A&M University\r\nIn this talk, modern optimization techniques are presented as fitting computational tools to attack several extremal problems from Approximation Theory that had reached the limits of purely analytical approaches. Three such problems are showcased: the first problem---minimal projections---involves minimization over measures and exploits the moment method; the second problem---constrained approximation---involves minimization over polynomials and exploits the sum-of-squares method; and the third problem---optimal recovery from inaccurate observations---is highly relevant in Data Science and exploits the S-procedure. In each of these problems, one ends up having to solve semidefinite programs.\r\n
LOCATION:C304 Wells Hall
DTSTART:20230112T193000Z
DTEND:20230112T203000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31529
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31530@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:ZOOM TALK (Passcode: the smallest prime > 100): Low rank approximation for faster optimization
DESCRIPTION:Speaker\: Madeleine Udell, Stanford University\r\nLow rank structure is pervasive in real-world datasets. This talk shows how to accelerate the solution of fundamental computational problems, including eigenvalue decomposition, linear system solves, composite convex optimization, and stochastic optimization (including deep learning), by exploiting this low rank structure. We present a simple method based on randomized numerical linear algebra for efficiently computing approximate top eigendecompositions, which can be used to replace large matrices (such as Hessians and constraint matrices) with low rank surrogates that are faster to apply and invert. The resulting solvers for linear systems (NystromPCG), composite convex optimization (NysADMM), and deep learning (SketchySGD) demonstrate strong theoretical and numerical support, outperforming state-of-the-art methods in terms of speed and robustness to hyperparameters.\r\n\r\n
LOCATION:C304 Wells Hall
DTSTART:20230119T193000Z
DTEND:20230119T203000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31530
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31553@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:ZOOM TALK (password: the smallest prime > 100) - Towards Intrinsically Low-Dimensional Models in Wasserstein Space: Geometry, Statistics, and Learning
DESCRIPTION:Speaker\: James Murphy, Tufts University\r\nWe consider the problems of efficient modeling and representation learning for probability distributions in Wasserstein space. We consider a general barycentric coding model in which data are represented as Wasserstein-2 (W2) barycenters of a set of fixed reference measures. Leveraging the Riemannian structure of W2-space, we develop a tractable optimization program to learn the barycentric coordinates when given access to the densities of the underlying measures. We provide a consistent statistical procedure for learning these coordinates when the measures are accessed only by i.i.d. samples. Our consistency results and algorithms exploit entropic regularization of the optimal transport problem, thereby allowing our barycentric modeling approach to scale efficiently. We also consider the problem of learning reference measures given observed data. Our regularized approach to dictionary learning in Wasserstein space addresses core problems of ill-posedness and in practice learns interpretable dictionary elements and coefficients useful for downstream tasks. Applications to image and natural language processing will be shown throughout the talk.
LOCATION:C304 Wells Hall
DTSTART:20230202T193000Z
DTEND:20230202T203000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31553
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31569@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:ZOOM TALK (password: the smallest prime > 100) - Combining network analysis and persistent homology for classifying behavior of time series
DESCRIPTION:Speaker\: Elizabeth Munch, MSU\r\nPersistent homology, the flagship method of topological data analysis, can be used to provide a quantitative summary of the shape of data. One way to pass data to this method is to start with a finite, discrete metric space (whether or not it arises from a Euclidean embedding) and to study the resulting filtration of the Rips complex. In this talk, we will discuss several available methods for turning a time series into a discrete metric space, including the Takens embedding, $k$-nearest neighbor networks, and ordinal partition networks. Combined with persistent homology and machine learning methods, we show how this can be used to classify behavior in time series in both synthetic and experimental data.
LOCATION:C304 Wells Hall
DTSTART:20230209T193000Z
DTEND:20230209T203000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31569
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31570@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:ZOOM TALK (password: the smallest prime > 100) - Learning Individualized Treatment Rules with Many Treatments
DESCRIPTION:Speaker\: Yufeng Liu, University of North Carolina at Chapel Hill\r\nLearning an optimal Individualized Treatment Rule (ITR) is a very important problem in precision medicine. In this talk, we consider the challenge when the number of treatment arms is large, and some groups of treatments in the large treatment space may work similarly for the patients. Motivated by the recent development of supervised clustering, we propose a novel adaptive fusion-based method to cluster the treatments with similar treatment effects together and estimate the optimal ITR simultaneously through a single convex optimization. We establish the theoretical guarantee of recovering the underlying true clustering structure of the treatments for our method. Finally, the superior performance of our method will be demonstrated via both simulations and a real data application on cancer treatment.\r\n\r\nThis is joint work with Haixu Ma and Donglin Zeng at UNC-Chapel Hill.\r\n
LOCATION:C304 Wells Hall
DTSTART:20230223T193000Z
DTEND:20230223T203000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31570
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-29395@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:New approaches in simulation of transition paths
DESCRIPTION:Speaker\: Yuehaw Khoo, U Chicago\r\nTensor methods can be used for compressing high-dimensional functions arising from partial differential equations (PDEs). In this talk, we focus on using these methods for the simulation of transition processes between metastable states in chemistry applications, for example in molecular dynamics. To this end, we also propose a novel generative modeling procedure using tensor networks without the use of any optimization.
LOCATION:C304 Wells Hall
DTSTART:20230224T210000Z
DTEND:20230224T220000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=29395
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31571@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:ZOOM TALK (password: the smallest prime > 100) - The Non-asymptotics of Reinforcement Learning
DESCRIPTION:Speaker\: Yuejie Chi, Carnegie Mellon University\r\nReinforcement learning (RL) has garnered significant interest in recent years due to its success in a wide variety of modern applications. However, a theoretical understanding of the non-asymptotic sample and computational efficiency of RL algorithms remains elusive and is urgently needed to cope with the ever-increasing problem dimensions. In this talk, we discuss our recent progress that sheds light on understanding the efficacy of popular RL algorithms in finding the optimal policy in tabular Markov decision processes.
LOCATION:C304 Wells Hall
DTSTART:20230302T193000Z
DTEND:20230302T203000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31571
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31579@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:ZOOM TALK (password: the smallest prime > 100) - A Matrix-Mimetic Tensor Algebra for Optimal Representations of Multiway Data
DESCRIPTION:Speaker\: Elizabeth Newman, Emory University\r\nBig data has revolutionized the landscape of computational mathematics and has increased the demand for new numerical linear algebra tools to handle the vast amount of data. One crucial task is to efficiently capture inherent structure in data using dimensionality reduction and feature extraction. Tensor-based approaches have gained significant traction in this setting by leveraging multilinear relationships in high-dimensional data. In this talk, we will describe a matrix-mimetic tensor algebra that offers provably optimal compressed representations of multiway data via a family of tensor singular value decompositions (SVDs). Moreover, using the inherited linear algebra properties of this framework, we will prove that these tensor SVDs outperform the equivalent matrix SVD and two closely related tensor decompositions, the Higher-Order SVD and Tensor-Train SVD, in terms of approximation accuracy. Throughout the talk, we will provide numerical examples to support the theory and demonstrate practical efficacy of constructing optimal tensor representations.\r\n\r\nThis presentation will serve as an overview of our PNAS paper "Tensor-tensor algebra for optimal representation and compression of multiway data" (https://www.pnas.org/doi/10.1073/pnas.2015851118).
LOCATION:C304 Wells Hall
DTSTART:20230316T183000Z
DTEND:20230316T193000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31579
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-32610@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:Regulatory Perspective on Artificial Intelligence Integrated Drug Development
DESCRIPTION:Speaker\: Menglun Wang, Food and Drug Administration\r\nThe application of Artificial Intelligence/Machine Learning (AI/ML) in drug development is expanding rapidly. AI/ML have the potential to improve the efficiency of drug development and advance precision medicine. However, there are unique challenges. The presentation will mainly focus on the topic of AI/ML applications in clinical trials, including the following parts:\r\n\r\n1. The increasing number of submissions over the years.\r\n2. Hot therapeutic areas of AI/ML submissions.\r\n3. Types of analyses and objectives in AI/ML submissions.\r\n4. Case examples.\r\n5. Challenges and outlooks.\r\n\r\nAt the end of the presentation, opportunities for the FDA-ORISE fellowship will be introduced to senior PhD students.\r\n
LOCATION:C304 Wells Hall
DTSTART:20230317T190000Z
DTEND:20230317T200000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=32610
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31533@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:An Overview of High-Order Finite Elements for Thermal Radiative Transfer
DESCRIPTION:Speaker\: Terry Haut, Lawrence Livermore National Lab\r\nIn this talk, I will give an overview of numerical methods for thermal radiative transfer (TRT), with an emphasis on the use of high-order finite elements for their solution. The TRT equations constitute a (6+1)-dimensional set of nonlinear PDEs that describe the interaction of a background material and a radiation field, and their solution is critical for modeling Inertial Confinement Fusion and astrophysics applications. Due to their stiff nature, they are typically discretized implicitly in time, and their solution often accounts for up to 90% of the runtime of multi-physics simulations. I will discuss some recently developed linear solvers, physics-informed preconditioners, and methods for preserving positivity that are used to make the solution to the TRT equations efficient and robust.
LOCATION:C304 Wells Hall
DTSTART:20230317T200000Z
DTEND:20230317T210000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31533
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31580@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:ZOOM TALK (password: the smallest prime > 100) - The HRT Conjecture: A call for a numerical approach
DESCRIPTION:Speaker\: Kasso Okoudjou, Tufts University\r\nThe two-scale relation in wavelet analysis dictates that a square-integrable function can be written as a linear combination of scaled and shifted copies of itself. This fact is equivalent to the existence of square-integrable functions whose time-scale shifts are linearly dependent. By contrast, by replacing the scaling operator with a modulation operator one would think that the linear dependency of the resulting time-frequency shifts of a square-integrable function might be easily inferred. However, more than two decades after C. Heil, J. Ramanathan, and P. Topiwala conjectured that any such finite collection of time-frequency shifts of a non-zero square-integrable function on the real line is linearly independent, this problem (the HRT Conjecture) remains unresolved.\r\n \r\nThe talk will give an overview of the HRT conjecture and introduce an inductive approach to investigate it. I will highlight a few methods that have been effective in solving the conjecture in certain special cases. However, despite the origin of the HRT conjecture in Applied and Computational Harmonic Analysis, there is a lack of experimental or numerical methods to resolve it. I will present an attempt to investigate the conjecture numerically.
LOCATION:C304 Wells Hall
DTSTART:20230323T183000Z
DTEND:20230323T193000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31580
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31581@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:Strategic Feature Extraction and Low Dimensional Representation of Games - ZOOM TALK (password: the smallest prime > 100)
DESCRIPTION:Speaker\: Alexander Strang, University of Chicago\r\nGames are widely used to test reinforcement learning paradigms, to study competitive systems in economics and biology, and to model decision tasks. Empirical game theory studies games through observation of populations of interacting agents. We introduce a generic low-dimensional embedding scheme that maps agents into a latent space which enables visualization, interpolation, and strategic feature extraction. The embedding can be used for feature extraction since it represents a generic game as a combination of simpler low-dimensional games. Through examples, we illustrate that these components may correspond to basic strategic trade-offs. We then show that the embedding scheme can represent all games with bounded payoff, or whose payoff has finite variance when two agents are sampled at random. We develop a formal approximation theory for the representation, study the stability of the embedding, provide sufficient sampling guidelines, and suggest regularizers which promote independence in the identified features.
LOCATION:C304 Wells Hall
DTSTART:20230406T183000Z
DTEND:20230406T193000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31581
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31582@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:ZOOM TALK (password: the smallest prime > 100) - TBA
DESCRIPTION:Speaker\: Marina Meila, University of Washington\r\n
LOCATION:C304 Wells Hall
DTSTART:20230413T183000Z
DTEND:20230413T193000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31582
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31534@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:High-order variational Lagrangian schemes for compressible fluids
DESCRIPTION:Speaker\: Guosheng Fu, University of Notre Dame\r\nWe present a class of high-order variational Lagrangian schemes for compressible fluids using the tool of energetic variational approach (EnVarA). This is the first time that the EnVarA framework has been applied to non-isothermal models where temperature effects are non-negligible. We illustrate the main idea using the classical ideal gas model, and construct variational Lagrangian schemes that are conservative and entropy stable using EnVarA. Efficient implicit time stepping is designed so that the time step size is not restricted by the sound speed and the model is robust in the low Mach number case. Ample numerical examples will be presented to show the good performance of the proposed schemes for problems including strong shocks, low Mach number flows and multimaterial flows. This is joint work with Prof. Chun Liu from IIT.
LOCATION:C304 Wells Hall
DTSTART:20230421T200000Z
DTEND:20230421T210000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31534
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31541@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:TBA
DESCRIPTION:Speaker\: Yulong Xing, Ohio State University\r\nTBA
LOCATION:C304 Wells Hall
DTSTART:20230428T200000Z
DTEND:20230428T210000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31541
END:VEVENT
BEGIN:VEVENT
UID:20230322T095258-31583@math.msu.edu
DTSTAMP:20230322T095258Z
SUMMARY:ZOOM TALK (password: the smallest prime > 100) - TBA
DESCRIPTION:Speaker\: Petros Drineas, Purdue University\r\n
LOCATION:C304 Wells Hall
DTSTART:20230504T183000Z
DTEND:20230504T193000Z
URL:https://math.msu.edu/Seminars/TalkView.aspx?talk=31583
END:VEVENT
END:VCALENDAR