Spring 2019. MW 9:30-10:45am.
The design of algorithms is traditionally a discrete endeavor. However, many advances have come from a
continuous viewpoint. Typically, a continuous process, deterministic or randomized, is designed (or shown) to have
desirable properties, such as approaching an optimal solution or a desired distribution, and an algorithm is
derived from this by appropriate discretization. In interesting and general settings, the current fastest
methods are a consequence of this perspective. We will discuss several examples of algorithms for
high-dimensional optimization and sampling, and use them to understand the following concepts in detail.
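As a minimal illustration of this discretization idea (the function and step size below are illustrative choices, not from the course): gradient descent is the forward-Euler discretization of the gradient flow ODE dx/dt = -grad f(x).

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=200):
    """Forward-Euler discretization of gradient flow:
    x_{k+1} = x_k - step * grad f(x_k)."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Example: f(x) = ||x - b||^2 / 2, so grad f(x) = x - b and the minimizer is b.
b = np.array([1.0, -2.0])
x_star = gradient_descent(lambda x: x - b, [0.0, 0.0])
# x_star approaches b as the number of iterations grows
```

For this quadratic, the iterates contract toward the minimizer geometrically, a simple instance of the "desirable properties" the continuous process is designed to have.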
The course is also being taught on the West Coast by Yin Tat Lee.
Prerequisite: basic knowledge of algorithms, probability, and linear algebra.
- Geometry I. Ellipsoid; Center-of-Gravity; Simulated Annealing (math: Distribution of mass)
- Homotopy (path following) I. Gradient Descent (math: Ordinary Differential Equations; fixed point theorem)
- Homotopy II. Stochastic Gradient Descent, Sampling by Diffusion (math: Stochastic Differential Equations; isoperimetry)
- Geometry II. Newton method; Interior Point; Hit-and-run/Dikin walk (math: local, non-Euclidean geometry)
- Geometry III. Central path; Hamiltonian Monte Carlo (math: basics of Riemannian manifolds)
- Contraction. Iterative methods; conjugate gradient (math: Taylor expansion; polynomial approximation)
- Continuous vs discrete. Lattices, diophantine approximation, integer programming (math: geometry of numbers).
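To connect the sampling topics above to a concrete method (the potential U and parameters below are illustrative assumptions, not course material): sampling by diffusion can be sketched with the unadjusted Langevin algorithm, a discretization of the SDE dX_t = -grad U(X_t) dt + sqrt(2) dB_t, whose stationary density is proportional to exp(-U).

```python
import numpy as np

def ula_samples(grad_U, n, step=0.01, burn=1000, rng=None):
    """Unadjusted Langevin algorithm:
    x_{k+1} = x_k - step * grad U(x_k) + sqrt(2 * step) * N(0, 1)."""
    rng = rng or np.random.default_rng(0)
    x, out = 0.0, []
    for k in range(burn + n):
        x = x - step * grad_U(x) + np.sqrt(2 * step) * rng.standard_normal()
        if k >= burn:
            out.append(x)
    return np.array(out)

# Example: U(x) = x^2 / 2, so grad U(x) = x and the target is the standard Gaussian.
samples = ula_samples(lambda x: x, 50000)
# empirical mean and variance are close to 0 and 1, up to discretization bias
```

The step size trades off discretization bias against mixing speed; quantifying this trade-off (via isoperimetry) is one of the themes listed above.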
This course is a research seminar. I will give most of the lectures, and class participants will give the rest.
A student taking the class for credit is expected (possibly partnering with another student) to:
(a) Scribe one lecture
(b) Present in class
(c) Study in more depth the topic of (a) or (b) and prepare an expository note and/or work on a related open problem.