epshteyn (at) math.utah.edu)
January 7 (Tuesday). Joint Statistics/Stochastics/Applied Math
Seminar. Time 11am - 12pm. Room JWB 335.
Speaker: Anna Little, Department of Computational Mathematics, Science and Engineering, Michigan State University
Title: Wavelet Invariants for Statistically Robust Multi-Reference Alignment
Abstract: Motivated by applications such as cryo-electron microscopy, this talk discusses a generalization of the 1-dimensional multi-reference alignment problem. The goal is to recover a hidden signal from many noisy observations, each of which includes a random translation, a random dilation, and high additive noise. We propose using a translation-invariant, wavelet-based representation, and analyze the statistical properties of this representation given a large number of independent corruptions of the hidden signal. The wavelet-based representation uniquely defines the power spectrum and is used to apply a data-driven, nonlinear unbiasing procedure, so that the estimate of the hidden signal is robust to high frequency perturbations. After unbiasing the representation, the power spectrum is recovered by solving a convex optimization problem, and thus the target signal is obtained up to an unknown phase. Extensive numerical experiments demonstrate the statistical robustness of this approximation procedure.
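The talk's wavelet invariants generalize the classical Fourier power spectrum. As a minimal numpy sketch of the underlying idea (not the talk's wavelet construction), the following shows that the power spectrum is translation invariant, and that averaging it over noisy observations leaves an additive bias that a simple "unbiasing" step removes; random dilations, the hard part addressed in the talk, are omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, sigma = 64, 2000, 0.3
x = np.exp(-0.5 * ((np.arange(N) - N / 2) / 3.0) ** 2)   # hidden signal

def power_spectrum(v):
    return np.abs(np.fft.fft(v)) ** 2

# The power spectrum is invariant to circular translation.
assert np.allclose(power_spectrum(x), power_spectrum(np.roll(x, 17)))

# Observations: random translation plus additive Gaussian noise.
obs = np.stack([np.roll(x, rng.integers(N)) + sigma * rng.standard_normal(N)
                for _ in range(M)])

# Averaging power spectra leaves an additive bias of N * sigma^2 per
# frequency; subtracting it is the simplest form of unbiasing.
est = power_spectrum(obs).mean(axis=0) - N * sigma ** 2
```

With enough observations, `est` approaches the clean power spectrum of `x` even though the individual translations are unknown.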
January 10 (Friday). Joint Statistics/Stochastics/Applied Math
Seminar. Time 3pm - 4pm. Room LCB 225.
Speaker: Tingran Gao, Department of Statistics, University of Chicago
Title: Multi-Representation Manifold Learning on Fibre Bundles
Abstract: Fibre bundles serve as a natural geometric setting for many learning problems involving non-scalar pairwise interactions. Modeled on a fixed principal bundle, different irreducible representations of the structure group induce many associated vector bundles, encoding rich geometric information for the fibre bundle as well as the underlying base manifold. An intuitive example for such a learning paradigm is phase synchronization---the problem of recovering angles from noisy pairwise relative phase measurements---which is prototypical for a large class of imaging problems in cryogenic electron microscopy (cryo-EM) image analysis. We propose a novel nonconvex optimization formulation for this problem, and develop a simple yet efficient two-stage algorithm that, for the first time, guarantees strong recovery for the phases with high probability. We demonstrate applications of this multi-representation methodology that improve denoising and clustering results for cryo-EM images. This algorithmic framework also extends naturally to general synchronization problems over other compact Lie groups, with a wide spectrum of potential applications.
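The phase synchronization problem mentioned above has a classical spectral relaxation that is often used as a baseline or as an initialization for nonconvex methods like the speaker's (this sketch is that baseline, not the talk's two-stage algorithm): the leading eigenvector of the noisy pairwise-phase matrix estimates the hidden angles up to a global phase.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma = 200, 0.5
theta = rng.uniform(0, 2 * np.pi, n)
z = np.exp(1j * theta)                 # hidden phases

# Noisy pairwise relative phases: H_ij ~ exp(i(theta_i - theta_j)) + noise.
W = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
W = (W + W.conj().T) / 2               # Hermitian noise
H = np.outer(z, z.conj()) + sigma * W

# Spectral relaxation: the top eigenvector of H estimates z up to
# a single unknown global phase.
vals, vecs = np.linalg.eigh(H)
v = vecs[:, -1] * np.sqrt(n)           # eigh returns eigenvalues ascending
corr = np.abs(np.vdot(v, z)) / n       # 1 means perfect alignment
```

At moderate noise levels the correlation `corr` stays close to 1, which is why spectral initialization is so effective for this family of problems.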
January 17 (Friday). Joint Stochastics/Applied Math
Seminar. Time 3pm - 4pm. Room LCB 225.
Speaker: Wenpin Tang, Department of Mathematics, UCLA
Title: Functional Inequalities of the Infinite Swapping Algorithm: Theory and Applications
Abstract: Sampling Gibbs measures at low temperature is an important but computationally very challenging task. Numerical evidence suggests that the infinite-swapping algorithm (ISA) is a promising method. The ISA can be seen as an improvement of the popular replica-exchange methods. We rigorously analyze the ergodic properties of the ISA in the low-temperature regime, deducing Eyring-Kramers formulas for the spectral gap (or Poincaré constant) and the log-Sobolev constant. Our main result shows that the effective energy barrier can be reduced drastically using the ISA compared to the classical over-damped Langevin dynamics. As a corollary, we derive a deviation inequality showing that sampling is also improved by an exponential factor. Furthermore, we analyze simulated annealing for the ISA and show that the ISA is again superior to the over-damped Langevin dynamics. This is joint work with Georg Menz, André Schlichting and Tianqi Wu.
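The infinite-swapping limit itself is delicate; as a point of reference, here is a minimal sketch of the finite replica-exchange (parallel tempering) scheme it improves on, with the potential, temperatures, and swap schedule chosen purely for illustration: two over-damped Langevin replicas sample a double-well Gibbs measure, and occasional Metropolis swap moves let the cold chain cross the energy barrier via the hot chain.

```python
import numpy as np

rng = np.random.default_rng(2)
U  = lambda x: (x * x - 1.0) ** 2          # double-well potential
dU = lambda x: 4.0 * x * (x * x - 1.0)
T1, T2, h, steps = 0.1, 1.0, 0.01, 20000   # cold/hot temperatures, step size

x1 = x2 = -1.0                             # both replicas start in the left well
cold = np.empty(steps)
for k in range(steps):
    # over-damped Langevin step for each replica at its own temperature
    x1 += -dU(x1) * h + np.sqrt(2 * T1 * h) * rng.standard_normal()
    x2 += -dU(x2) * h + np.sqrt(2 * T2 * h) * rng.standard_normal()
    # Metropolis swap attempt every 10 steps preserves both Gibbs measures
    if k % 10 == 0:
        a = np.exp(min(0.0, (1 / T1 - 1 / T2) * (U(x1) - U(x2))))
        if rng.random() < a:
            x1, x2 = x2, x1
    cold[k] = x1
```

At T1 = 0.1 the cold chain alone would essentially never leave its starting well on this time scale; with swaps it visits both wells while still concentrating near the minima at ±1.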
January 22 (Wednesday). Joint Stochastics/Applied Math
Seminar. Time 4pm - 5pm. Room LCB 222.
Speaker: Wuchen Li, Department of Mathematics, UCLA
Title: Accelerated Information Gradient flow
Abstract: We present a systematic framework for Nesterov's accelerated gradient flows in probability spaces endowed with information metrics. Two metrics are considered: the Fisher-Rao metric and the Wasserstein-2 metric. For the Wasserstein-2 metric, we prove convergence properties of the accelerated gradient flows and introduce their formulations in Gaussian families. Furthermore, we propose a practical discrete-time algorithm in particle implementations with an adaptive restart technique. We formulate a novel bandwidth selection method, which learns the Wasserstein-2 gradient direction from Brownian-motion samples. Experimental results, including Bayesian inference, show the strength of the current method compared with the state of the art. Further connections with inverse problems and data-related optimization techniques will be discussed.
January 24 (Friday). Joint Statistics/Stochastics/Applied Math
Seminar. Time 3pm - 4pm. Room LCB 225.
Speaker: Sui Tang, Department of Mathematics, Johns Hopkins University
Title: Learning interaction laws in agent based systems
Abstract: Interacting agent systems are ubiquitous in science, from the modeling of particles in physics, to predator-prey systems in biology, to opinion dynamics in the social sciences. A fundamental yet challenging problem across these areas is to infer the interaction rule that yields the collective motion of the agents. We consider an inverse problem: given abundantly observed trajectories of the system, we are interested in estimating the interaction laws between the agents. We show that, at least in the particular setting where the interaction is governed by an (unknown) function of pairwise distances, under a suitable coercivity condition that guarantees the well-posedness of the problem of recovering the interaction kernel, statistically and computationally efficient, nonparametric, suitably regularized least-squares estimators exist. Our estimators achieve the optimal learning rate for one-dimensional (the variable being pairwise distance) regression problems with noisy observations. We demonstrate the performance of our estimators on various simple examples. In particular, they produce accurate predictions of collective dynamics over relatively large time intervals, even when learned from data collected over short time intervals.
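A toy 1D instance of this inverse problem can make the least-squares idea concrete. In this sketch the kernel phi(r) = exp(-r), the polynomial basis, and the use of exact model velocities (rather than finite differences of noisy trajectories) are all illustrative simplifications, not the speaker's setup.

```python
import numpy as np

rng = np.random.default_rng(3)
N, dt, steps = 10, 0.05, 100
phi = lambda r: np.exp(-r)                 # true (to-be-recovered) kernel

def rhs(x, kernel):
    # dx_i/dt = (1/N) sum_j kernel(|x_j - x_i|) (x_j - x_i)
    diff = x[None, :] - x[:, None]
    return (kernel(np.abs(diff)) * diff).mean(axis=1)

# simulate trajectories (forward Euler) and record positions and velocities
xs, vs = [], []
x = rng.uniform(0.0, 2.0, N)
for _ in range(steps):
    v = rhs(x, phi)
    xs.append(x.copy()); vs.append(v)
    x = x + dt * v

# least-squares fit of phi in the polynomial basis {1, r, r^2, r^3}:
# each basis element contributes one feature column per agent and time
rows, targets = [], []
for xt, vt in zip(xs, vs):
    diff = xt[None, :] - xt[:, None]
    r = np.abs(diff)
    feats = [(r ** k * diff).mean(axis=1) for k in range(4)]
    rows.append(np.stack(feats, axis=1)); targets.append(vt)
A, b = np.vstack(rows), np.concatenate(targets)
c, *_ = np.linalg.lstsq(A, b, rcond=None)
phi_hat = lambda r: sum(ck * r ** k for k, ck in enumerate(c))
```

The fitted `phi_hat` tracks the true kernel over the range of pairwise distances actually visited by the trajectories, which is exactly the regime where the coercivity condition in the abstract matters.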
February 3 (Cancelled due to Snow Storm)
Speaker: Katy Craig, Department of Mathematics, UCSB
Title: Aggregation diffusion to constrained interaction: minimizers and gradient flows in the slow diffusion limit
Abstract: Nonlocal interactions arise throughout the natural world, from collective dynamics in biological swarms to vortex motion in superconductors. Over the past fifteen years, there has been significant interest in aggregation diffusion equations, which consider the competing effects of nonlocal interactions and local diffusion. More recently, interest has also emerged in constrained aggregation equations, which consider the competition between nonlocal interactions and a hard height constraint on the density. In joint work with Ihsan Topaloglu, we prove that aggregation diffusion equations converge to the constrained aggregation equation in the slow diffusion limit. As an application of this theoretical result, we adapt Carrillo, Craig, and Patacchini's blob method for diffusion to develop a numerical method for constrained aggregation equations, which we use to explore open conjectures in geometric shape optimization.
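A bare-bones particle discretization of the aggregation term (not the blob method of the talk, which additionally regularizes diffusion and handles the height constraint) illustrates the gradient-flow structure; the quadratic attraction kernel W(x) = x^2/2 is an assumption chosen so the dynamics are easy to verify: the particle cloud contracts toward its center of mass, which is conserved.

```python
import numpy as np

rng = np.random.default_rng(4)
N, dt, steps = 200, 0.01, 100
x = rng.standard_normal(N)                 # particle positions in 1D

m0, v0 = x.mean(), x.var()
for _ in range(steps):
    # velocity field -grad W * mu of the interaction energy
    # E = (1/2N^2) sum_ij W(x_i - x_j); for W(x) = x^2/2 this
    # reduces to attraction toward the empirical mean
    x = x - dt * (x - x.mean())
```

The empirical variance decays geometrically while the mean stays fixed, mirroring the continuum fact that the quadratic interaction energy drives the density toward a point mass at its (conserved) barycenter.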
February 12 (Wednesday). Note time is 4pm - 5pm and Room is LCB 222.
Speaker: Mikhail Zaslavsky, Schlumberger-Doll Research
Title: Data-Driven Reduced Order Models as Deep Prior Neural Networks in Inverse Scattering
Abstract: In this talk we discuss the framework of a data-driven model order reduction for solving the inverse scattering problem when only a significantly limited training set (even composed of just a single data element) is available. We formulate our reduced order model (ROM) as a deep prior neural network with a specific physics-based architecture. It is inspired by the 1950s work of M. Krein, who developed a mechanical interpretation of synthesized networks and constructed the data embedding into the state space. This allows networks to learn efficiently the underlying PDE system directly from the measured data, hence the data-driven designation. Unlike conventional optimization-based training of deep networks, our network hyperparameters can be computed directly from the data using existing powerful tools of data-driven ROMs. To show the advantages of using such networks, we consider time-domain acoustic and electromagnetic inverse problems in multiple-scattering environments. We focus on two such applications: 1) a transform that linearizes the dependency of the data on the medium, thereby suppressing nonlinear artifacts, such as multiple reflections in acoustic and elastic full waveform data; it can be used as a preprocessing step in conventional linearized (Born) inversion algorithms; 2) inversion of high-frequency radar and low-frequency diffusive electromagnetic data preconditioned via a ROM-based autoencoder. Numerical examples will be presented to verify the performance of the framework.
February 21 (Friday). Joint Applied Math/Stochastics/SCI
Seminar. Note Time is 3pm - 4pm and Room is LCB 225.
Speaker: Bao Wang, Department of Mathematics, UCLA
Title: Momentum and PDE Techniques for Deep Learning
Abstract: Accelerating stochastic gradient-based optimization and sampling algorithms are of crucial importance for modern machine learning, in particular, deep learning. Developing mathematically principled architectures for deep neural nets (DNNs) is another important direction to advance deep learning. In this talk, I will present some recent progress in leveraging Nesterov Accelerated Gradient (NAG) with scheduled restarting to accelerate SGD and to improve training DNNs. Moreover, I will talk about some recent results on PDE based techniques for accelerating stochastic gradient descent and Langevin dynamics. Finally, I will discuss a momentum-based formulation for DNN design.
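The benefit of NAG with restarting is easiest to see on a deterministic ill-conditioned quadratic. The following sketch uses a function-value adaptive restart (momentum is dropped whenever the objective increases); the scheduled restart studied in the talk may differ, and the test problem is an assumption chosen for illustration.

```python
import numpy as np

d  = np.array([1.0, 100.0])          # eigenvalues -> condition number 100
f  = lambda x: 0.5 * np.sum(d * x * x)
gf = lambda x: d * x
L  = d.max()                         # smoothness constant
x0 = np.ones(2)

def gd(iters, lr=1 / L):
    # plain gradient descent baseline
    x = x0.copy()
    for _ in range(iters):
        x -= lr * gf(x)
    return f(x)

def nag_restart(iters, lr=1 / L):
    # Nesterov accelerated gradient with function-value restart
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = y - lr * gf(y)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        if f(x_new) > f(x):          # restart: drop momentum when f increases
            y, t_new = x_new, 1.0
        x, t = x_new, t_new
    return f(x)

gd100, nag100 = gd(100), nag_restart(100)
```

After 100 iterations the restarted accelerated method reaches a far smaller objective value than plain gradient descent, which is the deterministic core of the speedups described for SGD and DNN training.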
February 27 (Thursday). Joint Applied Math/SCI
Seminar. Note Time is 3pm - 4pm and Room is LCB 215.
Speaker: Lise-Marie Imbert-Gerard, Department of Mathematics, University of Maryland, College Park
Title: Wave propagation in inhomogeneous media: An introduction to Generalized Plane Waves
Abstract: Trefftz methods rely, in broad terms, on the idea of approximating solutions to Partial Differential Equations (PDEs) using basis functions which are exact solutions of the PDE, making explicit use of information about the ambient medium. But wave propagation problems in inhomogeneous media are modeled by PDEs with variable coefficients, and in general no exact solutions are available. Generalized Plane Waves (GPWs) are functions that have been introduced, in the case of the Helmholtz equation with variable coefficients, to address this problem: they are not exact solutions to the PDE but are instead constructed locally as high order approximate solutions. We will discuss the origin, the construction, and the properties of GPWs. The construction process introduces a consistency error, requiring a specific analysis.
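A small worked example of the local construction idea in 1D, under assumptions chosen here for illustration: for u'' + kappa(x) u = 0 with kappa(x) = 1 + x, seek a GPW-style ansatz u = exp(Q(x)) with polynomial Q, and cancel the Taylor coefficients of the residual at x = 0 order by order.

```python
# The ansatz u = exp(Q) turns u'' + kappa u = 0 into the Riccati-type
# condition Q'' + (Q')^2 + kappa = 0, solved order by order at x = 0.
# With kappa(x) = 1 + x and Q = a1 x + a2 x^2 + a3 x^3:
#   order 0:  2 a2 + a1^2 + 1 = 0
#   order 1:  6 a3 + 4 a1 a2 + 1 = 0
a1 = 1j                      # plane-wave behaviour exp(ix), since kappa(0) = 1
a2 = -(1 + a1 ** 2) / 2      # = 0
a3 = -(1 + 4 * a1 * a2) / 6  # = -1/6

def residual(x):
    # (u'' + kappa u) / u, computed exactly from the ansatz
    Qp  = a1 + 2 * a2 * x + 3 * a3 * x ** 2
    Qpp = 2 * a2 + 6 * a3 * x
    return Qpp + Qp ** 2 + (1 + x)
```

Because orders 0 and 1 are cancelled, the local residual is O(x^2): halving the distance to the construction point divides it by roughly four. This is exactly the consistency error mentioned in the abstract, and higher-order GPWs cancel more terms.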
Speaker: Eric Stachura, Department of Mathematics, Kennesaw State University
Title: Refraction problems in negative refractive index materials
Abstract: In this talk, I will give an overview of some optical refraction problems that can be solved with PDE techniques. I will focus on materials which possess a negative refractive index. Along the way, we will see how a fully nonlinear PDE of Monge-Ampère type arises in such problems. Much of the work to be discussed is joint with Cristian Gutierrez.
March 16 (Cancelled!)
Speaker: Daniel Spirn, Department of Mathematics, University of Minnesota
March 23 (Cancelled!)
Speaker: Aaron Barrett, Department of Mathematics, University of Utah
April 6 (Cancelled!)
Speaker: Shilpa Khatri, Department of Applied Mathematics, University of California, Merced
Past lectures: Fall 2019, Spring 2019, Fall 2018, Spring 2018, Fall 2017, Spring 2017, Fall 2016, Spring 2016, Fall 2015, Spring 2015, Fall 2014, Spring 2014, Fall 2013, Spring 2013, Fall 2012, Spring 2012, Fall 2011, Spring 2011, Fall 2010, Spring 2010, Fall 2009, Spring 2009, Fall 2008, Spring 2008, Fall 2007, Spring 2007, Fall 2006, Spring 2006, Fall 2005, Spring 2005, Fall 2004, Spring 2004, Fall 2003, Spring 2003, Fall 2002, Spring 2002, Fall 2001, Spring 2001, Fall 2000, Spring 2000, Fall 1999, Spring 1999, Fall 1998, Spring 1998, Winter 1998, Fall 1997, Spring 1997, Winter 1997, Fall 1996, Spring 1996, Winter 1996, Fall 1995.