Continuous Algorithms: Optimization and Sampling

Spring 2020. TR 9:30-10:45, Arch. (East) 207.

Santosh Vempala, OH: Tue 11-12, Klaus 2222.
TAs: He Jia (OH: Thu 2-4), Arthita Ghosh (OH: Mon 11-1).

The design of algorithms is traditionally a discrete endeavor. However, many advances have come from a continuous viewpoint. Typically, a continuous process, deterministic or randomized, is designed (or shown) to have desirable properties, such as approaching an optimal solution or a desired distribution, and an algorithm is derived by an appropriate discretization of this process. We will use this viewpoint to develop several general techniques and build up to the state of the art in polynomial-time algorithms for optimization and sampling.
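
For example (a minimal sketch in Python, not part of the course materials; the step size and iteration count are illustrative): gradient descent is the discretization of the gradient flow dx/dt = -grad f(x) with a fixed step size.

    import numpy as np

    def gradient_descent(grad, x0, step=0.1, iters=100):
        # Discretize the gradient flow dx/dt = -grad f(x):
        # x_{k+1} = x_k - step * grad f(x_k).
        x = np.array(x0, dtype=float)
        for _ in range(iters):
            x = x - step * grad(x)
        return x

    # Example: minimize f(x) = ||x - b||^2 / 2, whose gradient is x - b;
    # the iterates converge to the minimizer b.
    b = np.array([1.0, -2.0])
    print(gradient_descent(lambda x: x - b, x0=np.zeros(2)))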

The course is offered simultaneously at UW by Yin Tat Lee.

Prerequisites: basic knowledge of algorithms, probability, and linear algebra.


Textbook (in progress; updated May 2023)


Grading:
6550: Biweekly HW, including one longer HW towards the end. You are encouraged to collaborate on HW, but you must write your own solutions. Submit via Gradescope (link on Canvas).
Send comments on the lecture notes each week, either via Dropbox or by email.
8803: HW optional. Comment on the lecture notes each week.
Bonus: suggesting simpler proofs, exercises, or figures.


Schedule (tentative):
  1. Introduction
    Jan 7. Course overview.
    Jan 9. Gradient descent. Notes.
    Jan 14. Gradient descent (contd.). Notes.
  2. Elimination
    Jan 16. Cutting Plane method; Ellipsoid. Notes (and notes from a while ago)
    Jan 21. Center-of-Gravity. Notes
    Jan 23. Ball method; lower bounds. Notes
  3. Reduction
    Jan 28. Duality (LP duality, SDP duality, ...). Notes
    Jan 30, Feb 4. Equivalences (Optimization, Membership, Separation; Gradient, Evaluation). Notes
  4. Geometrization I
    Feb 4. Mirror Descent. Notes
    Feb 6. Frank-Wolfe. Notes
    Feb 11. Newton's method.
    Feb 13, 18. Interior Point Method for LP. Notes
    Feb 20, 25. Self-concordance and IPM for convex optimization.
  5. Sparsification
    Mar 3. Regression and subspace embeddings. Notes
    Mar 5. Matrix approximation.
  6. Acceleration
    Mar 10. Linear Regression and Chebyshev Polynomials. Notes
    Mar 12. Conjugate Gradient. Notes
    Mar 17, 19 (Spring break; no class)
  7. Sampling
    Mar 24. Langevin Dynamics and SDEs.
    Mar 26. Conductance and Convergence.
  8. Geometrization II
    Apr 7. Mixing of the ball walk; isotropy. (See the ball walk sketch after the schedule.)
    Apr 9. Isoperimetry and the KLS conjecture.
    Apr 14. Volume Computation
    Apr 16. Gaussian Cooling
    Additional topics: (Riemannian) HMC, ODEs, complex analysis.
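
A small taste of the sampling topics above (a toy sketch in Python, not course code; the body, step radius delta, and step count are illustrative): the ball walk samples approximately uniformly from a convex body given only a membership oracle.

    import numpy as np

    def ball_walk(in_body, x0, delta=0.1, steps=10000, rng=None):
        # From x, propose y uniform in the ball of radius delta around x;
        # move to y if it lies in the body, else stay put. The stationary
        # distribution of this walk is uniform over the body.
        rng = np.random.default_rng() if rng is None else rng
        x = np.array(x0, dtype=float)
        n = x.size
        for _ in range(steps):
            u = rng.standard_normal(n)
            # Uniform point in a delta-ball: random direction, radius delta*U^(1/n).
            u *= delta * rng.uniform() ** (1.0 / n) / np.linalg.norm(u)
            y = x + u
            if in_body(y):
                x = y
        return x

    # Example: approximate uniform sample from the unit ball in R^3.
    print(ball_walk(lambda y: np.linalg.norm(y) <= 1.0, x0=np.zeros(3)))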