The design of algorithms is traditionally a discrete endeavor. However, many advances have come from a continuous viewpoint. Typically, a continuous process, deterministic or randomized, is designed (or shown) to have desirable properties, such as approaching an optimal solution or a desired distribution, and an algorithm is derived from it by an appropriate discretization. In several interesting and general settings, the current fastest methods are a consequence of this perspective. We will discuss examples of algorithms for high-dimensional optimization and sampling, and use them to understand the following concepts in detail.
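As a minimal illustration of the discretize-a-continuous-process idea (not part of any particular lecture): the gradient flow ODE x'(t) = -∇f(x(t)) converges to the minimizer of a strongly convex f, and applying forward Euler with step size h yields the gradient descent update x_{k+1} = x_k - h ∇f(x_k). The function names and the quadratic example below are illustrative only.

```python
# Gradient flow x'(t) = -grad f(x(t)) drives x toward the minimizer of a
# strongly convex f. Discretizing by forward Euler with step size h gives
# gradient descent: x_{k+1} = x_k - h * grad f(x_k).
# Toy example: f(x) = (x - 3)^2 / 2, so grad f(x) = x - 3 and x* = 3.

def grad_f(x):
    return x - 3.0  # gradient of f(x) = (x - 3)^2 / 2

def gradient_descent(x0, step=0.1, iters=200):
    x = x0
    for _ in range(iters):
        x = x - step * grad_f(x)  # one forward-Euler step of the gradient flow
    return x

print(gradient_descent(10.0))  # approaches the minimizer x* = 3
```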

- Reduction
- Elimination
- Conditioning
- Geometrization
- Expansion
- Sparsification
- Acceleration
- Decomposition
- Discretization

**Prerequisite**: basic knowledge of algorithms, probability, and linear algebra.

**Schedule** (tentative):

- Introduction

Jan 7. Course overview. Gradient descent

Jan 9. Langevin dynamics - Elimination

Jan 14. Ellipsoid/CoG/Cutting Plane for Optimization

Jan 16. ... for Volume computation

Jan 21. MLK holiday (no class)

Jan 23. Balls and Parabolas - Reduction

Jan 28, 30. Duality (LP duality, SDP duality, ...) and Equivalences (Optimization, Membership, Separation; OPT, Integration->Sampling; Maxflow, Bipartite Matching) - Geometrization I: Optimization

Feb 4. Newton method

Feb 6. IPM for LP

Feb 18. Andre Wibisono: Higher-Order Gradient Methods

- Geometrization II: Sampling

Feb 11. ARC 12.

Feb 13. Conductance, Ball walk

Feb 25. Mixing of ball walk, Isoperimetry, Isotropy.

Feb 27. Hit-and-Run

Mar 4. Dikin walk

Mar 6. (Riemannian) HMC

- Multiplication

Mar 11. Simulated Annealing.

Mar 13. Volume, Integration, Rounding.

Mar 18, 20. Spring break (no class)

- Expansion

Mar 25. ODE. Collocation Method.

Mar 27. SDE. Ito's lemma. Localization.

- Sparsification

Apr 1. Regression and subspace embeddings.

- Acceleration

Apr 3. Chebyshev polynomials. Accelerated Gradient Descent.

- Student presentations

Apr 8. Shengding (Representing the PSD cone). Cyrus+Majid (Escaping Saddle Points).

Apr 10. Sam (Hard Sphere Model), Zongchen (Convergence rate of Hamiltonian Monte Carlo).

Apr 15. Aditi+He (Star-convex Functions), Rohan (Applications of HMC).

Apr 17. Jiaming (Nonconvex optimization with SGD), Gabriel+James (Stochastic Billiards).

A student taking the class for credit is expected to (possibly partnering with another student):

(a) Each week: Proofread lecture notes and suggest improvements OR do one HW problem.

(b) Present in class

(c) Study in more depth the topic of (a) or (b) and prepare an expository note and/or work on a related open problem.