## Best Papers

- Yin Tat Lee, Aaron Sidford.

Path Finding Methods for Linear Programming: Solving Linear Programs in Õ(√rank) Iterations and Faster Algorithms for Maximum Flow. FOCS 2014.

**First theoretical improvement on the running time of linear programming since 1986.**

**Best Paper and Best Student Paper in FOCS 2014. Selected as a notable article in computing in 2014 by Computing Reviews.**

- Jonathan A. Kelner, Yin Tat Lee, Lorenzo Orecchia, Aaron Sidford.

An Almost-Linear-Time Algorithm for Approximate Max Flow in Undirected Graphs, and its Multicommodity Generalizations. SODA 2014.

**First almost-linear-time method for undirected maxflow. (Improved to nearly-linear time!)**

**Best Paper in SODA 2014.**

- Yin Tat Lee, Aaron Sidford, Sam Chiu-wai Wong.

A Faster Cutting Plane Method and its Implications for Combinatorial and Convex Optimization. FOCS 2015.

**First nearly-cubic-time method for convex problems with first-order information.**

**Best Student Paper in FOCS 2015.**

## Selected Publications

- Yin Tat Lee, Santosh S. Vempala

Eldan’s Stochastic Localization and the KLS Hyperplane Conjecture: An Improved Lower Bound for Expansion. FOCS 2017.

**Proving that any 1-Lipschitz function of an isotropic logconcave distribution concentrates around its median with O(n^{1/4}) error.** (refined!)
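In concentration form, the result can be sketched roughly as follows (here m_f denotes a median of f(X) and c a universal constant; logarithmic factors from the refined versions are suppressed):

```latex
% For X drawn from an isotropic logconcave distribution on R^n
% and any 1-Lipschitz function f : R^n -> R:
\Pr\left[\, |f(X) - m_f| \ge c \, t \, n^{1/4} \,\right] \le e^{-t}
\qquad \text{for all } t \ge 0.
```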


- Sébastien Bubeck, Ronen Eldan, Yin Tat Lee

Kernel-based methods for bandit convex optimization. STOC 2017.

**First polynomial-time method with poly(n) sqrt(T)-regret for bandit convex optimization.**

- Yin Tat Lee, Santosh S. Vempala

Convergence Rate of Riemannian Hamiltonian Monte Carlo and Faster Polytope Volume Computation. STOC 2018.

**First(?) efficient method for sampling in polytopes.**

- Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee, Aleksander Madry

k-server via multiscale entropic regularization. STOC 2018.

**First o(k)-competitive algorithm for k-server on HSTs.**

- Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Shayan Oveis Gharan, Luca Trevisan.

Improved Cheeger’s inequality: analysis of spectral partitioning algorithms through higher order spectral gap. STOC 2013.

**Proving that Cheeger’s inequality is tight for graphs with spectral gaps.**
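The statement can be sketched via the improved Cheeger inequality from the paper. Writing λ_2 ≤ λ_3 ≤ … for the eigenvalues of the normalized Laplacian of G and φ(G) for its conductance, roughly:

```latex
% Classical Cheeger lower bound plus the improved upper bound:
\frac{\lambda_2}{2} \;\le\; \phi(G) \;\le\; O(k) \, \frac{\lambda_2}{\sqrt{\lambda_k}}
\quad \text{for every } k \ge 2,
% so a higher-order spectral gap (\lambda_k = \Omega(1) for some small k)
% forces \phi(G) = \Theta(\lambda_2), i.e., Cheeger's inequality is tight.
```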

## Other Publications

- Yin Tat Lee, Santosh S. Vempala

Stochastic Localization + Stieltjes Barrier = Tight Bound for Log-Sobolev. STOC 2018.

**A tight estimate of the log-Sobolev constant for isotropic logconcave distributions.**

- Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Akshay Ramachandran

The Paulsen Problem, Continuous Operator Scaling, and Smoothed Analysis. STOC 2018.

**Resolving the Paulsen problem, a major open problem in frame theory.**

- Sébastien Bubeck, Michael B. Cohen, Yin Tat Lee, Yuanzhi Li

An homotopy method for ℓp regression provably beyond self-concordance and in input-sparsity time. STOC 2018.

**First input-sparsity-time algorithm for ℓ_p regression for 1 < p < ∞.**

- Ankit Garg, Yin Tat Lee, Zhao Song, Nikhil Srivastava

A Matrix Expander Chernoff Bound. STOC 2018.

- Naman Agarwal, Sham Kakade, Rahul Kidambi, Yin Tat Lee, Praneeth Netrapalli, Aaron Sidford

Leverage Score Sampling for Faster Accelerated Regression and ERM. 2017.

- Yin Tat Lee, Santosh S. Vempala

Geodesic Walks on Polytopes. STOC 2017.

**First subquadratic-iterations method for sampling in polytopes.**

- Yin Tat Lee, He Sun.

An SDP-Based Algorithm for Linear-Sized Spectral Sparsification. STOC 2017.

**First nearly-linear-time method for linear-sized spectral sparsification.**

- Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sam Chiu-wai Wong

Subquadratic Submodular Function Minimization. STOC 2017.

- Kevin Scaman, Francis Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié

Optimal algorithms for smooth and strongly convex distributed optimization in networks. 2017.

- Sébastien Bubeck, Yin Tat Lee.

Black-box optimization with a politician. ICML 2016.

- Michael B. Cohen, Yin Tat Lee, Gary L. Miller, Jakub W. Pachocki, Aaron Sidford.

Geometric Median in Nearly Linear Time. STOC 2016.

**First example of a problem for which an interior point method provably runs in nearly linear time.**

- Rasmus Kyng, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Daniel A. Spielman

Sparsified Cholesky and Multigrid Solvers for Connection Laplacians. STOC 2016. (A very simple sequential version!)

**A variant of multigrid and Cholesky decomposition that runs in nearly linear time for any SDD matrix.**

- Zeyuan Allen Zhu, Yin Tat Lee, Lorenzo Orecchia.

Using Optimization to Obtain a Width-Independent, Parallel, Simpler, and Faster Positive SDP Solver. SODA 2016.

- Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee.

Improved Cheeger’s Inequality and Analysis of Local Graph Partitioning using Vertex Expansion and Expansion Profile. SODA 2016.

- Yin Tat Lee, He Sun.

Constructing Linear-Sized Spectral Sparsification in Almost-Linear Time. FOCS 2015. (Improved to nearly-linear-time!)

- Sébastien Bubeck, Yin Tat Lee, Mohit Singh.

A geometric alternative to Nesterov’s accelerated gradient descent. 2015.

**A variant of the ellipsoid method that works really well.**

- Yin Tat Lee, Aaron Sidford.

Efficient Inverse Maintenance and Faster Algorithms for Linear Programming. FOCS 2015.

**Each iteration of the interior point method takes O(nnz(A) + rank^2) time.**

- Michael B. Cohen, Yin Tat Lee, Cameron Musco, Christopher Musco, Richard Peng, Aaron Sidford.

Uniform Sampling for Matrix Approximation. ITCS 2015.

**Solving a linear system in a tall matrix is as easy as in a square matrix.**

- Michael Kapralov, Yin Tat Lee, Cameron Musco, Christopher Musco, Aaron Sidford.

Single Pass Spectral Sparsification in Dynamic Streams. FOCS 2014.

- Yin Tat Lee, Aaron Sidford.

Efficient Accelerated Coordinate Descent Methods and Faster Algorithms for Solving Linear Systems. FOCS 2013. (Generalized to proximal and parallel settings!)

- Yin Tat Lee, Satish Rao, Nikhil Srivastava.

A new approach to computing maximum flows using electrical flows. STOC 2013. (Improved to almost-linear-time!)

## Thesis

- Faster Algorithms for Convex and Combinatorial Optimization. 2016.

Advised by Professor Jonathan A. Kelner; I received my PhD in Mathematics from MIT.

**Sprowls Award from MIT.**