Continuous Algorithms

Santosh Vempala

Spring 2019. MW 9:30-10:45am.

The design of algorithms is traditionally a discrete endeavor. However, many advances have come from a continuous viewpoint. Typically, a continuous process, deterministic or randomized, is designed (or shown) to have desirable properties, such as approaching an optimal solution or a desired distribution, and an algorithm is derived from it by an appropriate discretization. In interesting and general settings, the current fastest methods are a consequence of this perspective. We will discuss several examples of algorithms for high-dimensional optimization and sampling, and use them to understand the following concepts in detail.
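As a minimal illustration of this discretization step (a sketch, not taken from the course materials): applying Euler's method with step size h to the gradient flow dx/dt = -grad f(x) yields the familiar gradient-descent update x_{k+1} = x_k - h grad f(x_k). The quadratic objective and step size below are assumptions made only for this toy example.

    import numpy as np

    # Continuous process: gradient flow  dx/dt = -grad f(x), which decreases f along its trajectory.
    # Discretization: an Euler step of size h gives the gradient-descent update.

    def gradient_descent(grad_f, x0, step=0.1, iters=100):
        """Euler discretization of gradient flow: x_{k+1} = x_k - step * grad_f(x_k)."""
        x = np.array(x0, dtype=float)
        for _ in range(iters):
            x = x - step * grad_f(x)
        return x

    # Toy objective (assumed for this sketch): f(x) = ||x||^2 / 2, so grad_f(x) = x.
    print(gradient_descent(lambda x: x, x0=[3.0, -2.0]))  # approaches the minimizer at the origin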

The course is also being taught on the West Coast by Yin Tat Lee.


Prerequisites: basic knowledge of algorithms, probability, and linear algebra.


Topics (tentative):

  1. Geometry I. Ellipsoid; Center-of-Gravity; Simulated Annealing (math: Distribution of mass)
  2. Homotopy (path following) I. Gradient Descent (math: Ordinary Differential Equations; fixed point theorem)
  3. Homotopy II. Stochastic Gradient Descent, Sampling by Diffusion (math: Stochastic Differential Equations; isoperimetry); see the sketch after this list
  4. Geometry II. Newton method; Interior Point; Hit-and-run/Dikin walk (math: local, non-Euclidean geometry)
  5. Geometry III. Central path; Hamiltonian Monte Carlo (math: basics of Riemannian manifolds)
  6. Duality/Reductions.
  7. Contraction. Iterative methods; Conjugate Gradient (math: Taylor expansion; polynomial approximation)
  8. Continuous vs discrete. Lattices, diophantine approximation, integer programming (math: geometry of numbers).
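As an illustration of topic 3 (sampling by diffusion), the sketch below discretizes the Langevin diffusion dX_t = -grad f(X_t) dt + sqrt(2) dB_t with the Euler-Maruyama scheme, giving the unadjusted Langevin algorithm. The standard Gaussian target, step size, and iteration count are assumptions for this example, not course code.

    import numpy as np

    rng = np.random.default_rng(0)

    def unadjusted_langevin(grad_f, x0, step=0.01, iters=10000):
        """Euler-Maruyama discretization of the Langevin diffusion
        dX_t = -grad f(X_t) dt + sqrt(2) dB_t,
        whose stationary density is proportional to exp(-f(x))."""
        x = np.array(x0, dtype=float)
        samples = []
        for _ in range(iters):
            noise = rng.standard_normal(x.shape)
            x = x - step * grad_f(x) + np.sqrt(2.0 * step) * noise
            samples.append(x.copy())
        return np.array(samples)

    # Assumed target for illustration: standard Gaussian, f(x) = ||x||^2 / 2, so grad_f(x) = x.
    samples = unadjusted_langevin(lambda x: x, x0=[5.0])
    print(samples[2000:].mean(), samples[2000:].var())  # roughly 0 and 1 after burn-in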
Format: This course is a research seminar. I will give most of the lectures, and class participants will give the rest.
A student taking the class for credit is expected to (possibly partnering with another student):
(a) Scribe one lecture
(b) Present in class
(c) Study in more depth the topic of (a) or (b) and prepare an expository note and/or work on a related open problem.