CSE 535: Theory of Optimization and Continuous Algorithms

The design of algorithms is traditionally a discrete endeavor. However, many advances have come from a continuous viewpoint. Typically, a continuous process, deterministic or randomized, is designed (or shown) to have desirable properties, such as approaching an optimal solution or a desired distribution, and an algorithm is derived from it by an appropriate discretization. In many interesting and general settings, the current fastest methods are a consequence of this perspective. We will discuss several examples of algorithms for high-dimensional optimization and use them to understand the following concepts in detail:

  • Elimination
  • Reduction
  • Geometrization
  • Sparsification
  • Acceleration
  • Decomposition
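As a minimal sketch of the continuous-to-discrete viewpoint described above: the gradient flow dx/dt = -∇f(x) drives x toward a minimizer of f, and discretizing it with forward Euler and step size h yields gradient descent. The quadratic f and the step size below are illustrative choices, not part of the course material.

```python
import numpy as np

def gradient_descent(grad, x0, h=0.2, steps=500):
    """Forward-Euler discretization of gradient flow: x_{k+1} = x_k - h * grad(x_k)."""
    x = x0
    for _ in range(steps):
        x = x - h * grad(x)
    return x

# Hypothetical test problem: f(x) = 0.5 x^T A x - b^T x, whose minimizer solves Ax = b.
A = np.array([[2.0, 0.0], [0.0, 4.0]])
b = np.array([2.0, 4.0])
grad = lambda x: A @ x - b

x_star = gradient_descent(grad, np.zeros(2))
# x_star ≈ A^{-1} b = [1, 1]
```

The step size must respect the curvature of f (here h·λ_max = 0.8 < 2, so the iteration is a contraction); choosing and adapting such parameters is exactly where the analysis of the continuous process informs the discrete algorithm.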

Administrative Information:

  • Lectures: Tue, Thu 10:00-11:20 on Zoom
  • Textbook (in progress): https://www.dropbox.com/s/9ck2p07mibvwrmg/main.pdf
  • Please do not comment directly on the merged version; instead, comment on the split versions below. Dropbox cannot handle large documents efficiently.
  • Prerequisite: basic knowledge of algorithms, probability, and linear algebra.

Contacts:

  • Instructor: Yin Tat Lee
  • Office Hours: By appointment; email me at yintat@ignoreme-uw.edu
  • TA: Swati Padmanabhan (pswati@ignoreme-cs.washington.edu)
  • TA Office Hours: Thu 11:30 am - 12:20 pm and Thu 9:30 pm - 10:20 pm, Zoom link on Edstem.
  • Mailing List: Here
  • Edstem: Here

Assignments

  • Course evaluation: 5 assignments (100%) or 1 project (100%)
  • Assignments posted and submitted via Gradescope.

Class Policy

  • You may discuss assignments with others, but you must write down the solutions by yourself.
  • Assignments as well as regrade requests are to be submitted only via Gradescope.
  • In general, late submissions are not accepted (the submission server closes at the deadline). Gradescope lets you overwrite previous submissions, so it is recommended to submit early and update as needed rather than risk missing the deadline altogether.
  • We follow the standard UW CSE policy for academic integrity.
  • The office hours are for general questions about material from the lecture and homework problems.
  • Please refer to university policies regarding disability accommodations or religious accommodations.

Tentative Schedule:

Introduction

Basic Methods

  • Apr 06: Gradient Descent Sec 2 (Added Sec 2.6 on 7 Apr)
  • Apr 06: (HW 1 due)
  • Apr 08: Gradient Descent (cont) and Proximal Methods (See Sec 2 above)
  • Apr 13: Cutting Plane Methods Sec 3
  • Apr 15: Cutting Plane Methods (cont)

Geometrization

  • Apr 20: Mirror Descent (HW 2 due)
  • Apr 22: Newton's Method & L-BFGS

Homotopy Method

  • Apr 27: Interior Point Method
  • Apr 29: Robust Interior Point Method

Sparsification

  • May 04: Leverage Score Sampling (HW 3 due or project proposal due)
  • May 06: Stochastic Gradient Descent & Variance Reduction

Combinatorial Methods

  • May 11: Cholesky Decomposition
  • May 13: Laplacian Solver

Acceleration

  • May 18: Conjugate Gradient & Chebyshev Expansion (HW 4 due)
  • May 20: Accelerated Gradient Descent

Student Requests

  • May 25: Adagrad
  • May 27: Bayesian methods

Project Presentations (Depending on Number of Projects)

  • Jun 01: Project (HW 5 due)
  • Jun 03: Project

  • Project report due on Jun 08

Other Lecture Notes