Department of Mathematics
Applied Mathematics Seminar, Spring 2022

Mondays 4:00 PM - 5:00 PM MT (unless otherwise noted). Hybrid format: In-Person in LCB 219 (unless otherwise noted) and Online (Zoom information will be provided before each seminar).

January 4 (Tuesday). In-Person, LCB 215
Speaker: Michael Lindsey, Courant Institute of Mathematical Sciences, New York University
Title: Optimization and variational methods for quantum many-body systems
Abstract: A fundamental quantity of interest in the study of a quantum many-body system is the ground-state energy, which is the lowest eigenvalue of an exponentially high-dimensional Hermitian operator. Some approximation frameworks for determining this energy are variational in that they furnish a guaranteed upper or lower bound on the energy. In this talk we discuss recent work on approaches that are variational in this sense. First we discuss new semidefinite relaxations for the ground-state problem, which yield variational lower bounds and can be interpreted via convex duality as quantum embedding methods. We also introduce a scalable solver for these relaxations based on the embedding interpretation. Second we discuss a new optimization approach in variational Monte Carlo (VMC), which yields variational upper bounds by direct optimization over a parametric class of wavefunctions. Of particular interest are recently introduced neural-network-based parametrizations. Time permitting, we also discuss a new optimization approach for computing excited states in VMC.
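
For context (added here for reference, not part of the abstract), the variational upper bounds in VMC rest on the Rayleigh-Ritz principle: the ground-state energy is the minimum of the energy expectation over all trial states, so any parametrized trial wavefunction $\psi_\theta$ (for example, a neural-network ansatz) yields an upper bound:

    E_0 = \min_{\psi \neq 0} \frac{\langle \psi, H \psi \rangle}{\langle \psi, \psi \rangle}
    \qquad \Longrightarrow \qquad
    E(\theta) := \frac{\langle \psi_\theta, H \psi_\theta \rangle}{\langle \psi_\theta, \psi_\theta \rangle} \;\ge\; E_0 .

Minimizing $E(\theta)$ over the parameters $\theta$ is the optimization problem referred to above; the semidefinite relaxations complement this with guaranteed lower bounds.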

January 7 (Friday). In-Person, LCB 219
Speaker: Jingni Xiao, Department of Mathematics, Rutgers University
Title: The Anisotropic Calderón Problem for Nonlocal Operators
Abstract: The Calderón problem asks whether we can determine the conductivity of a medium by making voltage and current measurements at the boundary. The information is encoded in the Dirichlet-to-Neumann map that sends the Dirichlet data to the Neumann data of solutions of the conductivity equation. We consider in this talk a nonlocal analog of this problem. More precisely we consider a zeroth order perturbation of a fractional operator associated to a general second order elliptic equation and show that we can recover the perturbation from the corresponding Dirichlet-to-Neumann map. An example is the fractional Laplacian associated to an anisotropic conductivity.
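
As a brief reminder (illustrative notation, not taken from the abstract), in the classical local problem the Dirichlet-to-Neumann map of a conductivity $\gamma$ on a domain $\Omega$ is

    \Lambda_\gamma : f \longmapsto \gamma \, \partial_\nu u \big|_{\partial\Omega},
    \qquad \text{where } \nabla \cdot (\gamma \nabla u) = 0 \text{ in } \Omega, \quad u = f \text{ on } \partial\Omega,

and the Calderón problem asks whether $\Lambda_\gamma$ determines $\gamma$. The nonlocal analog in the talk replaces the local elliptic operator with a fractional one, with exterior data typically playing the role of boundary data.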

January 11 (Tuesday). In-Person, LCB 219
Speaker: Junshan Lin, Department of Mathematics and Statistics, Auburn University
Title: A Mathematical Perspective on Resonant Diffraction and Super-resolution Imaging by Subwavelength Holes
Abstract: Since the discovery of the extraordinary optical transmission through nanohole arrays by Thomas Ebbesen, a wealth of experimental and theoretical research has been sparked on anomalous diffraction by subwavelength structures. In this talk, I will present rigorous mathematical theories for several anomalous diffraction phenomena in metallic gratings with subwavelength slit holes. The diffraction problem will be investigated in three configurations. Based upon the layer potential technique, Gohberg-Sigal type theory, asymptotic analysis, and homogenization theory, a quantitative analysis of the diffraction anomaly in each configuration will be presented. I will also present a new imaging modality with illumination patterns generated from an array of resonant slit holes. At resonant frequencies, the patterned illuminations through the subwavelength holes encompass both low-frequency and highly oscillatory waves, which allow for probing both the low and high spatial frequency components of the imaging sample to achieve super-resolution. The imaging setup, the underlying mathematical framework, and the computational results will be discussed.

January 14 (Friday), 2:30pm. In-Person, LCB 215
Speaker: Alice Nadeau, Department of Mathematics, Cornell University
Title: Rate-induced Tipping Points: Bifurcations and Heteroclinic Connections in Time
Abstract: Qualitatively, a tipping point in a dynamical system is when a small change in system inputs causes the system to move to a drastically different state. The discussion of tipping points in climate and related fields has become increasingly urgent as scientists are concerned that different aspects of Earth's climate could tip to a qualitatively different state without sufficient warning (e.g., loss of summer sea ice in the Arctic). Often a system that has tipped is difficult---or even impossible---to return to its original state, making the study of predicting and preventing tipping phenomena extremely important. In this talk, I will briefly overview three main mathematical causes for tipping in dynamical systems as well as examples of each in different climate models. I will discuss recent endeavors to put one of these causes, rate-induced tipping, on firm mathematical footing. I will show how viewing this type of tipping from a nonautonomous framework allows us to extend the types of systems we can analyze for tipping phenomena.
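
A commonly used prototype of rate-induced tipping (shown here for illustration only; the models in the talk may differ) is a saddle-node normal form whose parameter is ramped in time at rate $r$:

    \frac{dx}{dt} = \big( x + \lambda(rt) \big)^2 - 1 .

For slow ramps the solution tracks the moving stable equilibrium $x = -\lambda(rt) - 1$, but once $r$ exceeds a critical rate the solution fails to track it and escapes, even though no bifurcation occurs at any frozen value of $\lambda$.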

January 26 (Wednesday), 11am. Zoom
Speaker: Jody Reimer, Leeds University
Title: Stochasticity and uncertainty in ecological models
Abstract: It is far easier to come up with examples of apparent randomness in biology than of reliably deterministic processes. Seeds are dispersed by the wind, taking root only if they happen to land in favorable soil. Unpredictable combinations of temperature and precipitation create fluctuations in animal survival. Diffusion and collision of ice floes form the highly heterogeneous environment at the center of polar marine ecology. I will discuss several examples of the various ways that I have represented this uncertainty in mathematical models, including applications of the Central Limit Theorem for scaling from individuals to populations, Galton-Watson branching processes, stochastic differential equations, and uncertainty quantification approaches for parametric uncertainty. Embracing stochasticity and uncertainty is necessary if we are to fully exploit the power of mathematics for understanding the natural world by connecting models to data.
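
As a small illustrative sketch (not code from the talk; the function name and the Poisson offspring law are assumptions chosen for illustration), a Galton-Watson branching process can be simulated by drawing each individual's offspring count independently from a fixed distribution:

    import numpy as np

    def galton_watson(generations, mean_offspring=0.9, n0=10, rng=None):
        """Population sizes of a Galton-Watson process with Poisson offspring."""
        rng = np.random.default_rng() if rng is None else rng
        sizes = [n0]
        for _ in range(generations):
            n = sizes[-1]
            # Each of the n current individuals leaves a Poisson(mean_offspring) number of offspring.
            sizes.append(int(rng.poisson(mean_offspring, size=n).sum()) if n > 0 else 0)
        return sizes

    # Subcritical case (mean offspring < 1): extinction occurs with probability one.
    print(galton_watson(generations=20))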

February 28. In-Person, LCB 219
Speaker: Kamala Liu, Department of Mathematics, University of Utah
Title: Local Well-Posedness of a Nonlinear Fokker-Planck Model
Abstract: In this talk I present a nonlinear Fokker-Planck model which emerges from our study of grain growth dynamics of polycrystalline materials. In particular, the model accounts for an inhomogeneous absolute temperature and obeys a special energy law. The main part of this presentation concerns our results on local existence and uniqueness of the solution. I'll also present some preliminary numerical simulations of this model, in particular its long-time asymptotic behavior. This is joint work with Yekaterina Epshteyn, Chun Liu and Masashi Mizuno.
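
For orientation only (the classical linear case; the nonlinear, temperature-inhomogeneous model of the talk is not reproduced here), a Fokker-Planck equation for a probability density $f(x,t)$ with constant diffusion coefficient $D$ and potential $V$ reads

    \partial_t f = \nabla \cdot \big( D \nabla f + f \nabla V \big),

and it dissipates the free energy $\int \big( D f \log f + f V \big)\,dx$, the kind of energy law alluded to in the abstract.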

March 14. In-Person
Speaker: Wesley Hamilton, Department of Mathematics, University of Utah
Title: Some Spectral Partitioning Problems
Abstract: Spectral partitioning generally refers to clustering data using spectral tools, such as a graph Laplacian built on the data. In this talk I'll present a number of other data partitioning problems I work on, all of which have distinct spectral aspects, including some discrete-to-continuum shape optimization problems. I'll also present recent work on computational redistricting that incorporates these tools.
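
A minimal sketch of the basic idea (illustrative only, not the methods from the talk): build a graph Laplacian from pairwise similarities and split the data by the sign of the Fiedler vector, the eigenvector of the second-smallest eigenvalue.

    import numpy as np

    def spectral_bipartition(X, sigma=1.0):
        """Split points X (n x d) into two clusters via the Fiedler vector
        of the unnormalized graph Laplacian of a Gaussian similarity graph."""
        # Pairwise squared distances and Gaussian similarity weights.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / (2 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        L = np.diag(W.sum(1)) - W       # unnormalized Laplacian L = D - W
        vals, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
        fiedler = vecs[:, 1]            # eigenvector of the 2nd-smallest eigenvalue
        return fiedler >= 0             # boolean cluster labels

    # Two well-separated Gaussian blobs should split cleanly.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
    print(spectral_bipartition(X).astype(int))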

March 21. Online
Speaker: Grey Ballard, Department of Computer Science, Wake Forest University
Title: Efficient Algorithms for Computing Tensor Decompositions
Abstract: Multidimensional data, coming from applications such as brain imaging, combustion simulation, and parameter-dependent PDEs, can often overwhelm the memory or computational resources of a single workstation. In this talk, I'll describe three tensor decompositions used to represent and approximate multidimensional data: CP, Tucker, and Tensor Train. Algorithms for computing these decompositions are based upon solving fundamental matrix problems such as linear least squares and the matrix SVD. I'll talk about how to solve these problems using parallel algorithms for the bottleneck computations. The algorithms are scalable, able to process terabyte-sized tensors while maintaining high computational efficiency on hundreds to thousands of processing nodes. The open-source software libraries we have developed are designed for clusters of computers and have been employed on various supercomputers.
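
As a hedged illustration (not the parallel algorithms from the talk), a Tucker approximation can be computed with a truncated higher-order SVD: take the leading left singular vectors of each mode unfolding as factor matrices, then contract the tensor against them to form the core.

    import numpy as np

    def hosvd(T, ranks):
        """Truncated higher-order SVD: return (core, factors) so that
        T is approximately core x_1 U1 x_2 U2 x_3 U3 (Tucker format)."""
        factors = []
        for mode, r in enumerate(ranks):
            # Mode-n unfolding: move axis `mode` to the front and flatten the rest.
            unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
            U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
            factors.append(U[:, :r])
        core = T
        for mode, U in enumerate(factors):
            # Tensor-times-matrix product along `mode` with U^T.
            core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
        return core, factors

    T = np.random.default_rng(1).standard_normal((8, 9, 10))
    core, factors = hosvd(T, ranks=(4, 4, 4))
    print(core.shape)  # (4, 4, 4)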

April 4. In-Person
Speaker: Emily Evans, Department of Mathematics, Brigham Young University
Title: Force-based models of cell-extracellular interaction
Abstract: To predict, alter, and control wound healing and pathological conditions, biologists need a better understanding of cell-cell and cell-extracellular interactions. We model these interactions as a system of stochastic differential equations representing the locations of cell centers and extracellular matrix attachment sites. Numerical simulations and analysis show that when the duration of attachment to the extracellular matrix is a memoryless and force-independent random process, the speed of the cell is independent of the force these attachment sites exert on the cell. Furthermore, understanding the dynamics of attachment and detachment to the extracellular matrix is key to predicting cell speed. To better understand the relationship between attachment dynamics and cell speed, we consider the problem in the context of two related (but simpler) models of cell motion. In this talk, we will also present full results showing that the expected average cell speed is independent of force and dependent on attachment dynamics.
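
A minimal sketch of the kind of simulation involved (the drift and noise terms below are hypothetical, not the model from the talk): an Euler-Maruyama step for an SDE dX = b(X) dt + sigma dW.

    import numpy as np

    def euler_maruyama(b, sigma, x0, dt, n_steps, rng=None):
        """Simulate dX = b(X) dt + sigma dW with the Euler-Maruyama scheme."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.array(x0, dtype=float)
        path = [x.copy()]
        for _ in range(n_steps):
            dW = rng.normal(scale=np.sqrt(dt), size=x.shape)
            x = x + b(x) * dt + sigma * dW
            path.append(x.copy())
        return np.array(path)

    # Hypothetical linear restoring force toward an attachment site at the origin.
    path = euler_maruyama(b=lambda x: -x, sigma=0.2, x0=[1.0, 0.0], dt=0.01, n_steps=500)
    print(path[-1])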

April 11. In-Person
Speaker: Mahdi Soltanolkotabi, Departments of Electrical and Computer Engineering and Computer Science, University of Southern California
Title: Towards Stronger Foundations for AI and its Applications to the Sciences
Abstract: Despite wide empirical success, many of the most commonly used learning approaches lack a clear mathematical foundation and often rely on poorly understood heuristics. Even when theoretical guarantees do exist, they are often too crude and/or pessimistic to explain the success of these methods in practical regimes of operation or to serve as a guiding principle for practitioners. Furthermore, in many scenarios, such as those arising in scientific applications, they require significant resources (compute, data, etc.) to work reliably.

The first part of the talk takes a step towards building a stronger theoretical foundation for such nonconvex learning. In particular, I will focus on demystifying the generalization and feature-learning capability of modern overparameterized learning, where the parameters of the learning model (e.g., a neural network) exceed the size of the training data. Our result is based on an intriguing spectral bias phenomenon for gradient descent that puts the iterations on a particular trajectory towards solutions that are not only globally optimal but also generalize well. Notably, this analysis overcomes a major theoretical bottleneck in the existing literature and goes beyond the "lazy" training regime, which requires unrealistic hyperparameter choices (e.g., very small step sizes, large initialization, or wide models). In the second part of the talk I will discuss the challenges and opportunities of using AI for scientific applications, and medical image reconstruction in particular. I will discuss our work on designing new architectures that lead to state-of-the-art performance and report on techniques to significantly reduce the data required for training.

April 18. In-Person
Speaker: Mark Allen, Department of Mathematics, Brigham Young University
Title: Sharp quantitative Faber-Krahn inequalities
Abstract: We present new sharp quantitative Faber-Krahn inequalities. We use a notion of distance that accounts for sets with zero measure but with positive capacity. By employing the selection principle, this new notion of distance leads to a critical perturbation of the classical Alt-Caffarelli functional rather than a lower-order perturbation. By utilizing our Faber-Krahn inequality on the sphere, we establish a quantitative form of the Alt-Caffarelli-Monotonicity formula. This is joint work with Dennis Kriventsov and Robin Neumayer.
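
For reference, the classical (non-quantitative) Faber-Krahn inequality states that, among open sets of a given volume, the ball minimizes the first Dirichlet eigenvalue of the Laplacian:

    \lambda_1(\Omega) \;\ge\; \lambda_1(B) \qquad \text{whenever } |\Omega| = |B|,

with equality if and only if $\Omega$ coincides with a ball up to a negligible set. Quantitative versions bound the deficit $\lambda_1(\Omega) - \lambda_1(B)$ from below by a distance from $\Omega$ to the family of balls; the choice of that distance is what the abstract refers to.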

April 25. In-Person
Speaker: N. Benjamin Erichson, Department of Mechanical Engineering and Materials Science, University of Pittsburgh
Title: Continuous Networks for Sequential Predictions
Abstract: Deep learning is playing a growing role in many areas of science and engineering for modeling time series. However, deep neural networks are known to be sensitive to various adversarial environments, and thus out-of-the-box models are often not suitable for mission-critical applications. Hence, robustness and trustworthiness are increasingly important aspects of engineering new neural network architectures and models. In this talk, I am going to view neural networks for time series prediction through the lens of dynamical systems. First, I will discuss novel continuous-time recurrent neural networks that are more robust and accurate than traditional recurrent units. I will show that leveraging classical numerical methods, such as the higher-order explicit midpoint time integrator, improves the predictive accuracy of continuous-time recurrent units compared to the simpler one-step forward Euler scheme. Then, I will discuss a connection between recurrent neural networks and stochastic differential equations, as well as extensions such as multiscale ordinary differential equations for learning long-term sequential dependencies.
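
A minimal sketch of the contrast mentioned above (illustrative only; not the architectures from the talk): discretizing a continuous-time recurrent cell h'(t) = f(h, x) with forward Euler versus the explicit midpoint rule.

    import numpy as np

    def f(h, x, W, U, b):
        """Continuous-time recurrent vector field h'(t) = tanh(W h + U x + b)."""
        return np.tanh(W @ h + U @ x + b)

    def euler_step(h, x, dt, params):
        # One-step forward Euler (first-order accurate).
        return h + dt * f(h, x, *params)

    def midpoint_step(h, x, dt, params):
        # Explicit midpoint: evaluate the vector field at a half step (second-order accurate).
        h_half = h + 0.5 * dt * f(h, x, *params)
        return h + dt * f(h_half, x, *params)

    rng = np.random.default_rng(0)
    d, m = 4, 3
    params = (rng.normal(size=(d, d)), rng.normal(size=(d, m)), np.zeros(d))
    h = np.zeros(d)
    for x in rng.normal(size=(10, m)):  # a short input sequence
        h = midpoint_step(h, x, dt=0.1, params=params)
    print(h)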


Seminar organizers: Farhan Abedin (abedinf (at) math.utah.edu), Will Feldman (feldman (at) math.utah.edu), and Akil Narayan (akil (at) sci.utah.edu).

Past lectures: Fall 2021, Spring 2021, Fall 2020, Spring 2020, Fall 2019, Spring 2019, Fall 2018, Spring 2018, Fall 2017, Spring 2017, Fall 2016, Spring 2016, Fall 2015, Spring 2015, Fall 2014, Spring 2014, Fall 2013, Spring 2013, Fall 2012, Spring 2012, Fall 2011, Spring 2011, Fall 2010, Spring 2010, Fall 2009, Spring 2009, Fall 2008, Spring 2008, Fall 2007, Spring 2007, Fall 2006, Spring 2006, Fall 2005, Spring 2005, Fall 2004, Spring 2004, Fall 2003, Spring 2003, Fall 2002, Spring 2002, Fall 2001, Spring 2001, Fall 2000, Spring 2000, Fall 1999, Spring 1999, Fall 1998, Spring 1998, Winter 1998, Fall 1997, Spring 1997, Winter 1997, Fall 1996, Spring 1996, Winter 1996, Fall 1995.

