Talk_id  Date  Speaker  Title 
29072

Monday 5/3 2:00 PM

Facundo Memoli, The Ohio State University

Some rigidity results for metric spaces via persistent homology
 Facundo Memoli, The Ohio State University
 Some rigidity results for metric spaces via persistent homology
 05/03/2021
 2:00 PM - 3:00 PM
 Online (virtual meeting)
 Shelley Kandola (kandola2@msu.edu)
Persistence barcodes provide computable signatures for datasets. These signatures absorb both geometric and topological information in a stable manner. One question that has not yet received much attention is: how strong are these signatures? A related question is how they relate to other, more classical invariants such as curvature. In this talk I will describe some results on characterizing metric spaces via persistence barcodes arising from Vietoris-Rips filtrations. Of particular interest is a relationship we established linking persistence barcodes to Gromov's filling radius.
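As a concrete, hedged illustration of the kind of signature discussed here, the sketch below computes only the degree-0 part of a Vietoris-Rips barcode: every connected component is born at scale 0 and dies when an edge merges it into another. The function name and setup are mine, not from the talk.

```python
import itertools
import math

def rips_barcode_0(points):
    """Degree-0 persistence barcode of the Vietoris-Rips filtration:
    process edges by increasing length; each merge of two components
    kills one bar born at scale 0 (single-linkage / union-find)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in itertools.combinations(range(n), 2)
    )
    bars = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj              # one component dies at scale d
            bars.append((0.0, d))
    bars.append((0.0, math.inf))         # the last component never dies
    return bars
```

Higher-degree bars (the ones connected to the filling radius) require building the full flag complex and reducing boundary matrices, which dedicated TDA libraries handle.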

29053

Tuesday 5/4 10:00 AM

Amanda Glazer, UC Berkeley

National Mathematics Survey: Addressing the Gender Gap in Undergraduate Mathematics
 Amanda Glazer, UC Berkeley
 National Mathematics Survey: Addressing the Gender Gap in Undergraduate Mathematics
 05/04/2021
 10:00 AM - 11:00 AM
 Online (virtual meeting)
 Tsvetanka Sendova (tsendova@msu.edu)
In the spring of 2016, the National Mathematics Survey was sent to five institutions (Harvard, MIT, Princeton, Yale, and Brown) to investigate undergraduate climate issues in mathematics departments, particularly with regard to gender. The survey covered a wide range of topics, including college academics, advising and research, study habits, mathematics department community, family background, and mentorship. The quantitative data from the survey indicated that the barrier to entry is much higher for women than for men who intend to study mathematics, and the qualitative data illustrated a number of issues, including gender stereotypes, pressure to represent, and disrespect for intelligence. In this talk, I will present the main results from this survey and discuss recommendations and changes made in the Harvard mathematics department to address some of these issues.

29065

Tuesday 5/4 2:50 PM

Yury Belousov, Higher School of Economics (HSE)

On the question of genericity of hyperbolic knots
 Yury Belousov, Higher School of Economics (HSE)
 On the question of genericity of hyperbolic knots
 05/04/2021
 2:50 PM - 3:40 PM
 Online (virtual meeting)
 Matthew Edward Hedden (heddenma@msu.edu)
(Based on joint work with A. Malyutin.)
Thurston’s famous classification theorem of 1978 states that every non-torus, non-satellite knot is hyperbolic, that is, its complement admits a complete hyperbolic metric of finite volume. Until recently there was a conjecture (known as the Adams conjecture) saying that the percentage of hyperbolic knots among all prime knots of n or fewer crossings approaches 100 as n approaches infinity. In 2017 Malyutin showed that this statement contradicts several other plausible conjectures. Finally, in 2019 the Adams conjecture was shown to be false. In this talk we are going to discuss the key ingredients of its disproof.

29074

Wednesday 5/5 4:10 PM

Rami Fakhry, MSU

Canceled due to health issue of the speaker
 Rami Fakhry, MSU
 Canceled due to health issue of the speaker
 05/05/2021
 4:10 PM - 5:00 PM
 Online (virtual meeting)
 Dapeng Zhan (zhan@msu.edu)
For a chordal SLE$_\kappa$ ($\kappa \in (0,8)$) curve in a simply connected domain $D$ with smooth boundary, the $n$-point boundary Green's function evaluated at distinct points $z_1, \dots, z_n \in \partial D$ is defined by
\[ G(z_1,\dots,z_n)= \lim_{r_1,\dots,r_n \to 0^+} \prod_{j=1}^{n} r_j^{-\alpha}\, \mathbb{P} \left[ \operatorname{dist}(\gamma, z_j) \leq r_j,\ 1 \leq j \leq n \right], \] where $\alpha = \frac{8}{\kappa} - 1$ is the boundary exponent of SLE$_\kappa$, provided that the limit converges. In this talk, we will show that such a Green's function exists for any finite number of points. Along the way we provide the rate of convergence and the modulus of continuity for these Green's functions as well. Finally, we give up-to-constant bounds for them.
We use the same Zoom link and passcode as before.

29055

Thursday 5/6 4:30 AM

Michael Kwok-Po Ng, University of Hong Kong

Low Rank Tensor Completion with Poisson Observations
 Michael Kwok-Po Ng, University of Hong Kong
 Low Rank Tensor Completion with Poisson Observations
 05/06/2021
 4:30 AM - 5:30 AM
 Online (virtual meeting)
 Olga Turanova
Poisson observations for videos are important models in video processing and computer vision. In this talk, we study the third-order tensor completion problem with Poisson observations. The main aim is to recover a tensor based on a small number of its Poisson-observed entries. An existing matrix-based method may be applied to this problem via the matricized version of the tensor. However, this method does not leverage the global low-rankness of a tensor and may be substantially suboptimal. We employ a transformed tensor nuclear norm ball constraint and a bound constraint on each entry, where the transformed tensor nuclear norm is used to obtain a tensor of low transformed multi-rank under suitable unitary transformation matrices. We show that the upper bound on the error of the estimator of the proposed model is less than that of the existing matrix-based method. Numerical experiments on synthetic data and real-world datasets are presented to demonstrate the effectiveness of our proposed model compared with existing tensor completion methods.
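To make the matricized baseline the talk argues against concrete, here is a crude numpy sketch (function name and setup are mine, not the estimator from the talk): fill unobserved entries of the Poisson count matrix with the observed mean, then hard-project onto low-rank matrices by truncated SVD.

```python
import numpy as np

def matrix_baseline(obs, mask, rank):
    """Crude matrix-based completion baseline: mean-fill the unobserved
    entries, then truncate the SVD to the requested rank. The talk argues
    such matricized estimators ignore a tensor's global low-rankness and
    can be substantially suboptimal."""
    filled = np.where(mask, obs, obs[mask].mean())
    U, s, Vt = np.linalg.svd(filled, full_matrices=False)
    s[rank:] = 0.0                      # keep only the leading singular values
    return (U * s) @ Vt

rng = np.random.default_rng(0)
X = np.outer(rng.uniform(1, 5, 30), rng.uniform(1, 5, 20))  # rank-1 intensity
mask = rng.random(X.shape) < 0.5                            # ~50% entries seen
obs = np.where(mask, rng.poisson(X), 0).astype(float)       # Poisson counts
est = matrix_baseline(obs, mask, rank=1)
```

The tensor approach replaces the SVD step by a transformed tensor nuclear norm projection across all slices jointly.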

29057

Thursday 5/13 2:30 PM

Simone Brugiapaglia, Concordia University

The curse of dimensionality and the blessings of sparsity and Monte Carlo sampling: From polynomial to deep neural network approximation in high dimensions
 Simone Brugiapaglia, Concordia University
 The curse of dimensionality and the blessings of sparsity and Monte Carlo sampling: From polynomial to deep neural network approximation in high dimensions
 05/13/2021
 2:30 PM - 3:30 PM
 Online (virtual meeting)
 Olga Turanova (turanova@msu.edu)
Approximating multidimensional functions from pointwise samples is a ubiquitous task in data science and scientific computing. This task is made intrinsically difficult by the presence of four contemporary challenges: (i) the target function is usually defined over a high- or infinite-dimensional domain; (ii) generating samples can be very expensive; (iii) samples are corrupted by unknown sources of errors; (iv) the target function might take values in a function space. In this talk, we will show how these challenges can be substantially overcome thanks to the "blessings" of sparsity and Monte Carlo sampling.
First, we will consider the case of sparse polynomial approximation via compressed sensing. Focusing on the case where the target function is smooth (e.g., holomorphic), but possibly highly anisotropic, we will show how to obtain sample complexity bounds only mildly affected by the curse of dimensionality, near-optimal accuracy guarantees, stability to unknown errors corrupting the data, and rigorous convergence rates of algebraic and exponential type.
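The mechanism behind sparse polynomial approximation from Monte Carlo samples can be sketched in a few lines. Here orthogonal matching pursuit stands in for the weighted l1 solvers typically used in this literature, and the target (a 2-sparse Chebyshev expansion) is my own toy example, not one from the talk.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick k columns of A that best
    correlate with the residual, refitting by least squares each time."""
    support, residual = [], y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# Target: f(t) = 2*T_3(t) - T_7(t), sampled at random (Monte Carlo) points.
rng = np.random.default_rng(1)
t = rng.uniform(-1, 1, 40)                      # 40 random sample points
A = np.polynomial.chebyshev.chebvander(t, 19)   # Chebyshev basis, degree <= 19
c_true = np.zeros(20)
c_true[3], c_true[7] = 2.0, -1.0
y = A @ c_true                                  # noiseless samples of f
c_hat = omp(A, y, k=2)                          # sparse recovery
```

In high dimensions the same idea applies with a tensorized (e.g. lower-set) polynomial basis, which is where the mild dependence on dimension comes from.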
Then, we will illustrate how the mathematical toolkit of sparse polynomial approximation via compressed sensing can be employed to obtain a practical existence theorem for Deep Neural Network (DNN) approximation of high-dimensional Hilbert-valued functions. This result shows not only the existence of a DNN with desirable approximation properties, but also how to compute it via a suitable training procedure in order to achieve best-in-class performance guarantees.
We will conclude by discussing open research questions.
The presentation is based on joint work with Ben Adcock, Casie Bao, Nick Dexter, Sebastian Moraga, and Clayton G. Webster.

29075

Monday 5/17 2:00 PM

Ashleigh Thomas, Georgia Tech

The shape of worm behavior
 Ashleigh Thomas, Georgia Tech
 The shape of worm behavior
 05/17/2021
 2:00 PM - 3:00 PM
 Online (virtual meeting)
 Shelley Kandola (kandola2@msu.edu)
We apply sliding window embeddings and persistent homology to video data of the behavior and locomotion of C. elegans, a worm that is a widely studied model organism in biology. The goal is to produce a quantitative and interpretable summary of complex behavior to use as a measurement tool for future biological experiments.
We distinguish and classify videos from various environmental conditions, analyze the tradeoff between accuracy and interpretability, and use representative cycles of persistent homology to produce synthetic videos of stereotypical behaviors. The talk will end with new data that shows the significant variation that can occur in worm behavior and a discussion on adapting TDA methods to data that is sometimes far from periodic.
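For readers unfamiliar with the first step, here is a minimal sketch of a sliding window (time-delay) embedding; the function name and parameters are illustrative, not from the talk. A periodic signal traces a closed loop in the embedded point cloud, which persistent H_1 can then detect; behavior that is "far from periodic" breaks this loop structure.

```python
import numpy as np

def sliding_window(x, dim, tau):
    """Time-delay embedding: map the scalar series x to the point cloud
    {(x[i], x[i+tau], ..., x[i+(dim-1)*tau])} in R^dim."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i : i + n] for i in range(0, dim * tau, tau)], axis=1)

t = np.linspace(0, 4 * np.pi, 400)
cloud = sliding_window(np.sin(t), dim=2, tau=25)
# The periodic signal sin(t) traces a closed loop in R^2; persistent
# homology of `cloud` would report one prominent 1-dimensional bar.
```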

29056

Thursday 5/20 4:30 AM

Philipp Grohs, University of Vienna

A Proof of the Theory to Practice Gap in Deep Learning
 Philipp Grohs, University of Vienna
 A Proof of the Theory to Practice Gap in Deep Learning
 05/20/2021
 4:30 AM - 5:30 AM
 Online (virtual meeting)
 Olga Turanova
It is now well understood that the approximation properties of neural networks surpass those of virtually all other known approximation methods. On the other hand, the numerical realization of these theoretical approximation properties seems quite challenging. Recent empirical results suggest that there might exist a tradeoff between the superior approximation properties and the efficient algorithmic realizability of neural-network-based methods. We prove the existence of this tradeoff by establishing hardness results for the problems of approximation and integration on neural network approximation spaces. These results show, in particular, that high approximation rates cannot be algorithmically realized by commonly used algorithms such as stochastic gradient descent and its variants. (Joint work with Felix Voigtlaender.)

29082

Monday 5/24 7:00 PM

Katharine Turner, Australian National University

Generalisations of the Rips Filtration for quasimetric spaces and asymmetric functions with corresponding stability results
 Katharine Turner, Australian National University
 Generalisations of the Rips Filtration for quasimetric spaces and asymmetric functions with corresponding stability results
 05/24/2021
 7:00 PM - 8:00 PM
 Online (virtual meeting)
 Shelley Kandola (kandola2@msu.edu)
Rips filtrations over a finite metric space and their corresponding persistent homology are prominent methods in Topological Data Analysis to summarize the ``shape'' of data. For a finite metric space $X$ and distance $r$, the traditional Rips complex with parameter $r$ is the flag complex whose vertices are the points in $X$ and whose edges are $\{[x,y]: d(x,y)\leq r\}$. By considering how the homology of these complexes evolves as we increase $r$, we can create persistence modules (and their associated barcodes and persistence diagrams). Crucial to their use is the stability result that says if $X$ and $Y$ are finite metric spaces, then the bottleneck distance between the persistence modules constructed by the Rips filtration is bounded by $2d_{GH}(X,Y)$ (where $d_{GH}$ is the Gromov-Hausdorff distance). Exploiting the asymmetry of quasimetric spaces and asymmetric functions, we construct four different analogues of the persistent homology of the Rips filtration and show that they too are stable with respect to a natural generalisation of the Gromov-Hausdorff distance called the correspondence distortion distance. These constructions involve ordered-tuple homology, symmetric functions of the distance function, strongly connected components, and poset topology.
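As a rough illustration of just one of these ideas, the symmetric-function construction, the sketch below keeps an edge when a chosen symmetrization of the asymmetric distance falls below the scale r; the flag complex on these edges is then filtered as r grows, exactly as for the ordinary Rips complex. The other constructions in the talk (ordered-tuple homology, strongly connected components, poset topology) are more refined than this.

```python
import itertools

def sym_rips_edges(D, r, combine=max):
    """Edges of a Rips-type complex built from an asymmetric distance
    matrix D: keep {i, j} when combine(D[i][j], D[j][i]) <= r. Taking
    combine=max or combine=min gives two symmetric functions of d."""
    n = len(D)
    return [
        (i, j)
        for i, j in itertools.combinations(range(n), 2)
        if combine(D[i][j], D[j][i]) <= r
    ]
```

With `combine=max` an edge appears only when both directed distances are small, so the resulting filtration is coarser than the one from `combine=min`.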

29070

Thursday 5/27 2:30 PM

Akram Aldroubi, Vanderbilt University

Dynamical Sampling: A Sampling Tour
 Akram Aldroubi, Vanderbilt University
 Dynamical Sampling: A Sampling Tour
 05/27/2021
 2:30 PM - 3:30 PM
 Online (virtual meeting)
 Olga Turanova (turanova@msu.edu)
Dynamical sampling is a term describing an emerging set of problems related to recovering signals and evolution operators from space-time samples. For example, one problem is to recover a function $f$ from space-time samples $\{(A_{t_i} f)(x_i)\}$ of its time evolution $f_t(x) = (A_t f)(x)$, where $\{A_t\}_{t \in T}$ is a known evolution operator and $\{(x_i, t_i)\} \subset \mathbb{R}^d \times \mathbb{R}^+$.
Another example is a system identification problem in which both $f$ and the evolution family $\{A_t\}_{t \in T}$ are unknown. Applications of dynamical sampling include inverse problems in sensor networks and source term recovery from physically driven evolutionary systems. Dynamical sampling problems are tightly connected to frame theory as well as to more classical areas of mathematics such as approximation theory and functional analysis. In this talk, I will describe a few problems in dynamical sampling, some results, and open problems.
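In the finite-dimensional case with a known evolution operator, the first problem reduces to linear algebra, as the following toy sketch shows (names and setup are mine, not the speaker's): each space-time sample contributes one linear equation, namely row x of the matrix power A^t, in the unknown f.

```python
import numpy as np

def recover_from_spacetime_samples(A, samples):
    """Recover f from samples {(A^t f)[x]}: stack one row of A^t per sample
    and solve the resulting linear system by least squares.
    `samples` is a list of (x, t, value) triples."""
    rows = np.array([np.linalg.matrix_power(A, t)[x] for x, t, _ in samples])
    vals = np.array([v for _, _, v in samples])
    f, *_ = np.linalg.lstsq(rows, vals, rcond=None)
    return f

rng = np.random.default_rng(2)
A = rng.normal(size=(5, 5)) / 5.0            # known evolution operator
f_true = rng.normal(size=5)
samples = [(x, t, (np.linalg.matrix_power(A, t) @ f_true)[x])
           for x in range(5) for t in range(3)]
f_hat = recover_from_spacetime_samples(A, samples)
```

The interesting questions in the talk concern which sparse sets of (x_i, t_i) still make this system (or its infinite-dimensional analogue) solvable.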

29071

Thursday 6/3 2:30 PM

Gerlind Plonka-Hoch, University of Göttingen

Recovery of sparse signals from their Fourier coefficients
 Gerlind Plonka-Hoch, University of Göttingen
 Recovery of sparse signals from their Fourier coefficients
 06/03/2021
 2:30 PM - 3:30 PM
 Online (virtual meeting)
 Olga Turanova (turanova@msu.edu)
In this talk, we study a new recovery procedure for nonharmonic signals, or more generally for extended exponential sums y(t), which are determined by a finite number of parameters. For the reconstruction we employ a finite set of classical Fourier coefficients of y(t). Our new recovery method is based on the observation that the Fourier coefficients of y(t) possess a special rational structure. We apply the recently proposed AAA algorithm by Nakatsukasa et al. (2018) to recover this rational structure in a stable way. If a sufficiently large set of Fourier coefficients of y(t) is available, then our recovery method automatically detects the correct number of terms M of the exponential sums y(t) and reconstructs all unknown parameters of the signal model. Our method provides a new stable alternative to the known numerical approaches for the recovery of exponential sums that are based on Prony's method.
These results have been obtained jointly with Markus Petz and Nadiia Derevianko.
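For context, here is a compact sketch of the classical Prony method against which the AAA-based approach is positioned. From 2M samples y(k) = sum_j c_j z_j^k it recovers the nodes z_j as roots of an annihilating polynomial and then the coefficients by least squares; its known numerical fragility is what motivates stable alternatives.

```python
import numpy as np

def prony(y, M):
    """Classical Prony: given 2M samples y[k] = sum_j c_j * z_j**k,
    recover the M nodes z_j and coefficients c_j."""
    y = np.asarray(y, dtype=complex)
    # The monic annihilating polynomial p satisfies
    # sum_{m=0}^{M-1} p_m y[k+m] = -y[k+M] for k = 0..M-1 (Hankel system).
    H = np.array([y[i : i + M] for i in range(M)])
    p = np.linalg.solve(H, -y[M : 2 * M])
    z = np.roots(np.concatenate(([1.0], p[::-1])))     # nodes = roots
    V = np.vander(z, N=2 * M, increasing=True).T       # V[k, j] = z_j**k
    c, *_ = np.linalg.lstsq(V, y, rcond=None)          # coefficients
    return z, c
```

The rational-structure approach in the talk instead works from classical Fourier coefficients of y(t) and detects the number of terms M automatically.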

29076

Thursday 6/10 2:30 PM

Piotr Indyk, MIT

Learning-Based Sampling and Streaming
 Piotr Indyk, MIT
 Learning-Based Sampling and Streaming
 06/10/2021
 2:30 PM - 3:30 PM
 Online (virtual meeting)
 Olga Turanova (turanova@msu.edu)
Classical algorithms typically provide "one size fits all" performance, and do not leverage properties or patterns in their inputs. A recent line of work aims to address this issue by developing algorithms that use machine learning predictions to improve their performance. In this talk I will present two examples of this type, in the context of streaming and sampling algorithms. In particular, I will show how to use machine learning predictions to improve the performance of (a) low-memory streaming algorithms for frequency estimation (ICLR’19), and (b) sampling algorithms for estimating the support size of a distribution (ICLR’21). Both algorithms use an ML-based predictor that, given a data item, estimates the number of times the item occurs in the input data set.
The talk will cover material from papers co-authored with T. Eden, C.-Y. Hsu, D. Katabi, S. Narayanan, R. Rubinfeld, S. Silwal, T. Wagner, and A. Vakilian.
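A minimal sketch of the learning-augmented idea in (a), under the simplifying assumption (mine, for illustration) that the predictor is a boolean oracle: items flagged as heavy hitters get exact counters, while the remaining light items share a small count-min sketch and absorb its overestimation error.

```python
import random
from collections import Counter

class LearnedCountMin:
    """Learning-augmented frequency estimation (toy sketch): route
    predicted heavy hitters to exact counters; hash everything else
    into a count-min sketch (which only ever overestimates)."""
    def __init__(self, width, depth, is_heavy):
        self.heavy = Counter()
        self.is_heavy = is_heavy        # the ML predictor (here: an oracle)
        self.width = width
        self.tables = [[0] * width for _ in range(depth)]
        self.seeds = [random.randrange(1 << 30) for _ in range(depth)]

    def add(self, item):
        if self.is_heavy(item):
            self.heavy[item] += 1
        else:
            for row, seed in zip(self.tables, self.seeds):
                row[hash((seed, item)) % self.width] += 1

    def estimate(self, item):
        if self.is_heavy(item):
            return self.heavy[item]     # exact count for heavy hitters
        return min(row[hash((seed, item)) % self.width]
                   for row, seed in zip(self.tables, self.seeds))
```

The actual predictor in the papers outputs an estimated count rather than a boolean, but the routing principle is the same.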

29084

Monday 6/14 2:00 PM

Rui Wang, Michigan State University

Persistent Spectral Graphs
 Rui Wang, Michigan State University
 Persistent Spectral Graphs
 06/14/2021
 2:00 PM - 3:00 PM
 Online (virtual meeting)
 Shelley Kandola (kandola2@msu.edu)
Persistent homology (PH) is one of the most popular tools in topological data analysis (TDA), while graph theory has had a significant impact on data science. This work introduces persistent spectral graph (PSG) theory to create a unified low-dimensional multiscale paradigm for revealing topological persistence and extracting geometric shapes from high-dimensional datasets. In PSG theory, families of persistent Laplacian matrices (PLMs) corresponding to various topological dimensions are constructed via a filtration to sample a given dataset at multiple scales. The harmonic spectra from the null spaces of PLMs offer the same topological invariants, namely persistent Betti numbers, at various dimensions as those provided by PH, while the non-harmonic spectra of PLMs give rise to additional geometric analysis of the shape of the data. We also develop an open-source software package, highly efficient robust multidimensional evolutionary spectra (HERMES), to enable broad applications of PSGs in biology, engineering, and technology.
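At dimension 0 the harmonic-spectrum statement can be seen in a few lines: the nullity of the ordinary graph Laplacian equals the number of connected components, so tracking it along a filtration reproduces the Betti-0 curve. This is a toy sketch of that one fact, not the HERMES implementation; the non-harmonic (positive) eigenvalues are what carry the extra geometric information.

```python
import numpy as np

def laplacian(n, edges):
    """Combinatorial graph Laplacian L = D - A on vertices 0..n-1."""
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1
        L[j, j] += 1
        L[i, j] -= 1
        L[j, i] -= 1
    return L

def betti0(n, edges, tol=1e-9):
    """Betti-0 = dim of the Laplacian's null space (harmonic spectrum)."""
    eigvals = np.linalg.eigvalsh(laplacian(n, edges))
    return int(np.sum(eigvals < tol))

# A filtration of graphs on 4 vertices: components merge as edges appear.
filtration = [[], [(0, 1)], [(0, 1), (2, 3)], [(0, 1), (2, 3), (1, 2)]]
betti_curve = [betti0(4, e) for e in filtration]   # 4, 3, 2, 1
```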

29077

Thursday 6/17 2:30 PM

Wenjing Liao, Georgia Tech

Regression and doubly robust off-policy learning on low-dimensional manifolds by neural networks
 Wenjing Liao, Georgia Tech
 Regression and doubly robust off-policy learning on low-dimensional manifolds by neural networks
 06/17/2021
 2:30 PM - 3:30 PM
 Online (virtual meeting)
 Olga Turanova (turanova@msu.edu)
Many data in real-world applications lie in a high-dimensional space but exhibit low-dimensional structures. In mathematics, these data can be modeled as random samples on a low-dimensional manifold. Our goal is to estimate a target function or learn an optimal policy using neural networks. This talk is based on an efficient approximation theory of deep ReLU networks for functions supported on a low-dimensional manifold. We further establish the sample complexity for regression and off-policy learning with finite samples of data. When data are sampled on a low-dimensional manifold, the sample complexity crucially depends on the intrinsic dimension of the manifold instead of the ambient dimension of the data. These results demonstrate that deep neural networks are adaptive to low-dimensional geometric structures of data sets. This is a joint work with Minshuo Chen, Haoming Jiang, Liu Hao, and Tuo Zhao at Georgia Institute of Technology.

29085

Monday 6/21 2:00 PM

Veronica Ciocanel, Duke University

"Modeling and topological data analysis for biological ring channels"
 Veronica Ciocanel, Duke University
 "Modeling and topological data analysis for biological ring channels"
 06/21/2021
 2:00 PM - 3:00 PM
 Online (virtual meeting)
 Shelley Kandola (kandola2@msu.edu)
Actin filaments are polymers that interact with motor proteins inside cells and play important roles in cell motility, shape, and development. Depending on its function, this dynamic network of interacting proteins reshapes and organizes into a variety of structures, including bundles, clusters, and contractile rings. Motivated by observations from the reproductive system of the roundworm C. elegans, we use an agent-based modeling framework to simulate interactions between actin filaments and motor proteins inside cells. We also develop tools based on topological data analysis to understand time-series data extracted from these filamentous network interactions. We use these tools to compare the filament organization resulting from motors with different properties. We are currently interested in gaining insights into myosin motor regulation and the resulting actin architectures during cell cycle progression. This work also raises questions about how to assess the significance of topological features in topological summaries such as persistence diagrams.

29078

Thursday 6/24 2:30 PM

Qiang Ye, University of Kentucky

Batch Normalization and Preconditioning for Neural Network Training
 Qiang Ye, University of Kentucky
 Batch Normalization and Preconditioning for Neural Network Training
 06/24/2021
 2:30 PM - 3:30 PM
 Online (virtual meeting)
 Olga Turanova (turanova@msu.edu)
Batch normalization (BN) is a popular and ubiquitous method in deep neural network training that has been shown to decrease training time and improve generalization performance. Despite its success, BN is not theoretically well understood. It is not suitable for use with very small mini-batch sizes or online learning. In this talk, we will review BN and present a preconditioning method called Batch Normalization Preconditioning (BNP) to accelerate neural network training. We will analyze the effects of mini-batch statistics of a hidden variable on the Hessian matrix of a loss function and propose a parameter transformation that is equivalent to normalizing the hidden variables, in order to improve the conditioning of the Hessian. Compared with BN, one benefit of BNP is that it is not constrained by the mini-batch size and works in the online learning setting. We will present several experiments demonstrating the competitiveness of BNP. Furthermore, we will discuss a connection to BN which provides theoretical insights into how BN improves training and how BN is applied to special architectures such as convolutional neural networks.
The talk is based on a joint work with Susanna Lange and Kyle Helfrich.
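For reference, a minimal numpy sketch of the standard BN transformation under discussion (training-mode statistics only; the function name is mine). BNP folds an equivalent normalization into a preconditioner for the parameter update instead of into the forward pass, which is why it is not tied to the mini-batch size.

```python
import numpy as np

def batch_norm(H, gamma, beta, eps=1e-5):
    """Batch normalization of hidden variables H (batch x features):
    standardize each feature over the mini-batch, then rescale by gamma
    and shift by beta (the learnable parameters)."""
    mu = H.mean(axis=0)
    var = H.var(axis=0)
    H_hat = (H - mu) / np.sqrt(var + eps)
    return gamma * H_hat + beta

rng = np.random.default_rng(3)
H = rng.normal(loc=5.0, scale=3.0, size=(128, 4))   # a mini-batch of hiddens
out = batch_norm(H, gamma=np.ones(4), beta=np.zeros(4))
```

With a batch of size 1 the variance estimate collapses, which is exactly the failure mode of BN in the online setting that BNP avoids.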

29086

Monday 6/28 2:00 PM

Lori Ziegelmeier, Macalester College

Capturing Dynamics of Time-Varying Data via Topology
 Lori Ziegelmeier, Macalester College
 Capturing Dynamics of Time-Varying Data via Topology
 06/28/2021
 2:00 PM - 3:00 PM
 Online (virtual meeting)
 Shelley Kandola (kandola2@msu.edu)
One approach to understanding complex data is to study its shape through the lens of algebraic topology. While the early development of topological data analysis focused primarily on static data, in recent years, theoretical and applied studies have turned to data that varies in time. A time-varying collection of metric spaces, as formed, for example, by a moving school of fish or flock of birds, can contain a vast amount of information. There is often a need to simplify or summarize the dynamic behavior. One such method is a crocker plot, a 2-dimensional image that displays the (non-persistent but varying with scale) topological information at all times simultaneously. We use this method to perform exploratory data analysis and investigate parameter recovery via machine learning in the collective motion model of D’Orsogna et al. (2006). Then, we use it to choose between unbiased correlated random walk models of Nilsen et al. (2013) that describe motion tracking experiments on pea aphids. We then introduce a new tool to summarize time-varying metric spaces: a crocker stack. Crocker stacks are convenient for visualization, amenable to machine learning, and satisfy a stability property.
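A toy sketch of the computation underlying a degree-0 crocker plot (illustrative names, standard library only): for each time frame and each scale, record the Betti-0 number, i.e. the number of connected components of the proximity graph at that scale. Plotting the resulting matrix as an image, with time on one axis and scale on the other, gives the crocker plot.

```python
import itertools
import math

def components(points, r):
    """Number of connected components of the graph joining points
    within distance r (depth-first search)."""
    n = len(points)
    adj = {i: [j for j in range(n) if j != i
               and math.dist(points[i], points[j]) <= r] for i in range(n)}
    seen, count = set(), 0
    for s in range(n):
        if s not in seen:
            count += 1
            stack = [s]
            while stack:
                v = stack.pop()
                if v not in seen:
                    seen.add(v)
                    stack.extend(adj[v])
    return count

def crocker(frames, scales):
    """Crocker matrix: Betti-0 of each time frame at each scale."""
    return [[components(f, r) for r in scales] for f in frames]
```

A crocker stack, as introduced in the talk, additionally layers in a smoothing parameter to recover a stability guarantee this plain construction lacks.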

29083

Wednesday 7/14 10:00 AM

Round table conversation

Preparing for Fall - teaching modalities
 Round table conversation
 Preparing for Fall - teaching modalities
 07/14/2021
 10:00 AM - 12:00 PM
 Online (virtual meeting)
 Tsvetanka Sendova (tsendova@msu.edu)
No abstract available.

29079

Thursday 7/15 2:30 PM

Qi (Rose) Yu, University of California, San Diego

University of California, San Diego
 Qi (Rose) Yu, University of California, San Diego
 University of California, San Diego
 07/15/2021
 2:30 PM - 3:30 PM
 Online (virtual meeting)
 Olga Turanova (turanova@msu.edu)
Applications such as climate science and transportation require learning complex dynamics from largescale spatiotemporal data. Existing machine learning frameworks are still insufficient to learn spatiotemporal dynamics as they often fail to exploit the underlying physics principles. Representation theory can be used to describe and exploit the symmetry of the dynamical system. We will show how to design neural networks that are equivariant to various symmetries for learning spatiotemporal dynamics. Our methods demonstrate significant improvement in prediction accuracy, generalization, and sample efficiency in forecasting turbulent flows and predicting realworld trajectories. This is joint work with Robin Walters, Rui Wang, and Jinxi Li.

29080

Thursday 7/22 2:30 PM

Yaniv Plan, University of British Columbia

A family of measurement matrices for generalized compressed sensing
 Yaniv Plan, University of British Columbia
 A family of measurement matrices for generalized compressed sensing
 07/22/2021
 2:30 PM - 3:30 PM
 Online (virtual meeting)
 Olga Turanova (turanova@msu.edu)
We consider the problem of recovering a structured signal x that lies close to a subset of interest T in R^n from its random noisy linear measurements y = BAx + w, i.e., a generalization of the classic compressed sensing problem. Here, B is a fixed matrix and A has independent subgaussian rows. By varying B and the subgaussian distribution of A, this gives a family of measurement matrices which may have heavy tails, dependent rows and columns, and singular values with a large dynamic range. Typically, generalized compressed sensing assumes a random measurement matrix with nearly uniform singular values (with high probability) and asks: how many measurements are needed to recover x? In our setting, this question becomes: what properties of B allow good recovery? We show that the “effective rank” of B may be used as a surrogate for the number of measurements, and if this exceeds the squared Gaussian complexity of T - T, then accurate recovery is guaranteed. We also derive the optimal dependence on the subgaussian norm of the rows of A; to our knowledge, this was not known previously, even in the case when B is the identity. We allow model mismatch and inexact optimization, with the aim of creating an easily accessible theory that specializes well to the case when T is the range of an expansive neural net.

29088

Monday 7/26 2:00 PM

Mengsen Zhang, UNC Chapel Hill

Transitions and their topological signatures in social and brain dynamics
 Mengsen Zhang, UNC Chapel Hill
 Transitions and their topological signatures in social and brain dynamics
 07/26/2021
 2:00 PM - 3:00 PM
 Online (virtual meeting)
 Shelley Kandola (kandola2@msu.edu)
Understanding biological and social interactions in a dynamical system framework amounts to identifying stable patterns of activity and tracking transitions between them. When the system’s dimension is low (few interacting elements), one can often visually inspect the time series and write down the relevant variables and differential equations. Once we venture into higher dimensions, this traditional modeling approach becomes intractable: what constitutes a “stable pattern” or a “transition” in the empirical data depends on the spatiotemporal scales at which we look at the system. We show how persistent homology serves as a natural tool for characterizing such multiscale dynamics in rhythmic social interaction. Furthermore, a key goal of studying transitions is to understand the relations between distinct stable dynamic patterns – they provide access to the global organization of the dynamical system. We show how existing topological data analysis (TDA) tools can be adapted to understand the network of transitions in brain dynamics. The talk focuses on the empirical and dynamical-systems context in which TDA is applied, in the hope of eliciting new conversations across the boundaries of empirical science, dynamical systems modeling, and TDA.

29091

Wednesday 7/28 10:00 AM

Round table conversation

Preparing for Fall - teaching modalities - Part II
 Round table conversation
 Preparing for Fall - teaching modalities - Part II
 07/28/2021
 10:00 AM - 12:00 PM
 Online (virtual meeting)
 Tsvetanka Sendova (tsendova@msu.edu)
No abstract available.

29081

Thursday 7/29 2:30 PM

Wotao Yin, UCLA

Learning to Optimize: Fixed-Point Network
 Wotao Yin, UCLA
 Learning to Optimize: Fixed-Point Network
 07/29/2021
 2:30 PM - 3:30 PM
 Online (virtual meeting)
 Olga Turanova (turanova@msu.edu)
Many applications require repeatedly solving a certain optimization problem, each time with new but similar data. “Learning to optimize,” or L2O, is an approach to developing algorithms that solve these similar problems very efficiently. L2O-generated algorithms have achieved significant success in signal processing and inverse-problem applications. This talk introduces the motivation for L2O and gives a quick overview of different types of L2O approaches for continuous optimization. Then, we will introduce Fixed-Point Networks (FPNs), which incorporate fixed-point iterations into deep neural networks and provide abilities such as physics-based inversion, data-driven regularization, encoding hard constraints, and infinite depth. FPNs are easy to train with a new Jacobian-free backpropagation (JFB) scheme.
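A toy numpy sketch of the fixed-point idea (the actual FPN architecture and JFB training loop are more involved than this): the "infinite-depth" forward pass iterates a contractive layer to its fixed point, and JFB then backpropagates through only the final iteration instead of unrolling all of them.

```python
import numpy as np

def fpn_forward(W, U, x, tol=1e-10, max_iter=500):
    """'Infinite-depth' forward pass: iterate z <- tanh(W z + U x) until it
    stops changing. If ||W|| < 1 the update is a contraction, so the
    iteration converges to a unique fixed point z*."""
    z = np.zeros(W.shape[0])
    for _ in range(max_iter):
        z_next = np.tanh(W @ z + U @ x)
        if np.linalg.norm(z_next - z) < tol:
            break
        z = z_next
    return z_next

rng = np.random.default_rng(4)
W = 0.2 * rng.normal(size=(8, 8)) / np.sqrt(8)  # small norm -> contraction
U = rng.normal(size=(8, 3))
x = rng.normal(size=3)
z_star = fpn_forward(W, U, x)                   # z* = tanh(W z* + U x)
```

Keeping the contraction property during training (e.g. by constraining the norm of W) is what makes the fixed point, and hence the network output, well defined at any "depth".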

29087

Thursday 8/12 2:30 PM

Andrea Bertozzi, UCLA

TBA
