CS264: Beyond Worst-Case Analysis
Instructor: Tim Roughgarden (Office hours: Tuesdays 2-3 PM, Gates 474.)
Teaching Assistants:

Josh Wang
(Office hours: Wednesdays 3-5 PM, Gates 460.
Email: joshua.wang [at] cs [dot] stanford [dot] edu)
Time/location:
10:30-11:50 AM on Tuesdays and Thursdays in Hewlett 102.
Piazza site:
here
Course description: In the worst-case analysis of algorithms,
the overall performance of an algorithm is summarized by its worst
performance on any input. This approach has countless success
stories, but there are also important computational problems, like
linear programming, clustering, and online caching, where the
worst-case analysis framework does not provide any helpful advice on
how to solve the problem. This course covers a number of modeling
methods for going beyond worst-case analysis and articulating which
inputs are the most relevant.
List of topics: instance optimality;
perturbation and approximate stability;
smoothed analysis;
parameterized analysis and condition numbers;
models of data (pseudorandomness,
locality, diffuse adversaries, etc.);
robust analogs of
average-case analysis; resource augmentation; planted and
semi-random graph models; sparse recovery (like compressive sensing).
Motivating problems drawn from online
algorithms, machine learning (topic models, clustering),
computational geometry, graph
partitioning, scheduling, linear programming,
local search heuristics, social networks,
hashing,
signal processing,
and empirical algorithmics.
Prerequisites: undergraduate algorithms (CS161, or equivalent).
Prior exposure to linear programming is recommended but not required
(review materials will be posted as needed).
Course requirements:
Weekly exercise sets. Students have the option to substitute a
paper-reading project for 3 of the exercise sets.
No late assignments accepted, although we will drop the lowest
of your 10 scores.
Previous offering (in 2014):
here. Includes lecture videos and lecture notes.
Note: the overlap between this and the previous offering will be roughly 50%.
Primer:
for a two-hour overview of the entire course, see
 Beyond Worst-Case Analysis,
Part 1 and
Part 2,
Algorithms and Uncertainty Boot Camp, Simons Institute (2016).
Slides
Lecture notes
Additional background material:
Homework:

Submission instructions:
We are using Gradescope for the homework submissions (you should create an account if you don't already have one).
Use the course code
MBY5R9 to register for CS264. One submission per team (but remember to add the names of all team members).
 A LaTeX template that you can use to
type up solutions. Here
and here are good
introductions to LaTeX.
 Homework #1 (Out Thu 1/12, due midnight Thu 1/19.)
 Homework #2 (Out Thu 1/19, due midnight Thu 1/26.)
 Homework #3 (Out Thu 1/26, due midnight Thu 2/2.)
 Homework #4 (Out Thu 2/2, due midnight Thu 2/9.)
 Homework #5 (Out Thu 2/9, due midnight Thu 2/16.)
 Homework #6 (Out Thu 2/16, due midnight Thu 2/23.)
 Homework #7 (Out Thu 2/23, due midnight Thu 3/2.)
 Homework #8 (Out Thu 3/2, due midnight Thu 3/9.)
 Homework #9 (Out Thu 3/9, due midnight Thu 3/16.)
 Homework #10 (Out Thu 3/16, due midnight Thu 3/23.)
Related workshops: Since this course debuted in 2009, there
have been several workshops on the topic:
Detailed schedule and references (under construction):
 Lecture 1 (Tue Jan 10): Introduction.
Three motivating examples (caching, linear programming, clustering).
Course philosophy. Instance optimality. Further reading:
 Lecture 2 (Thu Jan 12):
Instance optimality in computational geometry.
References:
 Lecture 3 (Tue Jan 17):
Online paging. Resource augmentation.
Loose competitiveness.
References:
 Lecture 4 (Thu Jan 19): Parameterized analysis of online paging.
Parameterizing page sequences by their locality.
Optimal page fault bounds for LRU.
Rigorously separating LRU and FIFO.
References:
Related video (from the 2011 workshop on BWCA):
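To make the object of study concrete, here is a minimal Python sketch of the LRU eviction policy whose page fault bounds this lecture analyzes (the function name and interface are our own illustration, not code from the lecture notes):

```python
from collections import OrderedDict

def lru_faults(requests, k):
    """Count page faults when serving `requests` with an LRU cache of size k.

    On a fault with a full cache, evict the least recently used page.
    Illustrative sketch only.
    """
    cache = OrderedDict()  # keys ordered from least to most recently used
    faults = 0
    for page in requests:
        if page in cache:
            cache.move_to_end(page)        # page becomes most recently used
        else:
            faults += 1
            if len(cache) == k:
                cache.popitem(last=False)  # evict the least recently used page
            cache[page] = True
    return faults
```

Running the same request sequence through caches of different sizes k is exactly the kind of comparison that resource augmentation and parameterized bounds formalize.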
 Lecture 5 (Tue Jan 24):
NP-hard problems and what to do about them.
The knapsack problem and parameterized approximation bounds.
Parameterized algorithms and a polynomial-size kernel for the vertex
cover problem. Color-coding and the longest path problem.
 Parameterized algorithms is a huge field: here is a recent textbook on the topic.
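As a reference point for the knapsack discussion, here is the classic pseudo-polynomial dynamic program for 0/1 knapsack, whose O(n * capacity) running time is parameterized by an input quantity rather than the input size (our own sketch, not code from the course materials):

```python
def knapsack(values, weights, capacity):
    """Max total value of a subset of items with total weight <= capacity.

    best[c] = best value achievable with weight budget c.
    Illustrative sketch only.
    """
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # iterate budgets downward so each item is used at most once
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]
```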
 Lecture 6 (Thu Jan 26):
Perturbation stability: clustering is hard only when it doesn't matter.
 Lecture 7 (Tue Jan 31):
When are linear programming relaxations exact?
Case study: perturbation-stable instances of the minimum multiway cut problem.
References:
 Lecture 8 (Thu Feb 2):
Exact recovery in perturbation-stable instances of the maximum cut problem. Metric embeddings and Bourgain's Theorem. Improvements via semidefinite programming. References:
 Lecture 9 (Tue Feb 7):
Random graphs. Planted models. A spectral algorithm for the planted bisection problem.
 Lecture 10 (Thu Feb 9):
A little matrix perturbation theory.
A spectral algorithm for the planted clique problem.
 Lecture 11 (Tue Feb 14):
Semi-random models and semidefinite programming 1.
Case study: minimum bisection.
SDP duality.
 Lecture 12 (Thu Feb 16):
Semi-random models and semidefinite programming 2.
Finish bisection.
Second case study: maximum clique.
 Lecture 13 (Tue Feb 21):
(Guest lecture by Mary Wootters.)
A taste of compressive sensing.
Finding sparse solutions to underdetermined linear systems.
When does l1-minimization work?
References:
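The l1-minimization (basis pursuit) step can be written as a linear program via the standard reformulation with auxiliary variables t >= |x|; a hedged sketch using scipy's linprog (the function name is ours, and this is an illustration of the standard LP trick, not code from the lecture):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 subject to A x = b via an LP reformulation.

    Introduce t with x - t <= 0 and -x - t <= 0, then minimize sum(t).
    Illustrative sketch only.
    """
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # objective: sum of t
    A_eq = np.hstack([A, np.zeros((m, n))])         # A x = b (t unconstrained here)
    I = np.eye(n)
    A_ub = np.vstack([np.hstack([I, -I]),           #  x - t <= 0
                      np.hstack([-I, -I])])         # -x - t <= 0
    b_ub = np.zeros(2 * n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                  bounds=[(None, None)] * n + [(0, None)] * n)
    return res.x[:n]
```

When the conditions from the lecture hold, the minimizer of this LP coincides with the sparsest solution of the underdetermined system.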
 Lecture 14 (Thu Feb 23):
(Guest lecture by Moses Charikar.)
LP and SDP relaxations for k-median and k-means. Exact recovery in the stochastic ball model.
Primary reference:
 Lecture 15 (Tue Feb 28):
Exact recovery: topic models and nonnegative matrix factorization.
 Lecture 16 (Thu Mar 2):
Random ordering problems. Secretary problems. Case study: online facility location.
 Lecture 17 (Tue Mar 7):
Smoothed analysis of local search heuristics. Case study: the 2-OPT
local search heuristic for TSP in the plane.
Reference for lecture:
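For concreteness, a minimal Python sketch of the 2-OPT heuristic analyzed in this lecture: repeatedly reverse a tour segment (swapping two edges) while that shortens the tour. This is our own illustration, not code from the reference.

```python
import math

def tour_length(points, tour):
    """Total length of the closed tour visiting `points` in order `tour`."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(points, tour):
    """Run 2-OPT to a local optimum: a move replaces edges (a,b),(c,d)
    with (a,c),(b,d), i.e. reverses the segment between b and c."""
    improved = True
    while improved:
        improved = False
        n = len(tour)
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # same pair of edges; no real move
                candidate = tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]
                if tour_length(points, candidate) < tour_length(points, tour) - 1e-12:
                    tour, improved = candidate, True
    return tour
```

Smoothed analysis asks how many improving moves this loop can make when the point coordinates are slightly perturbed.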
 Lecture 18 (Thu Mar 9):
For binary optimization problems,
polynomial smoothed complexity <=> (Las Vegas randomized)
pseudo-polynomial worst-case complexity.
Reference for lecture:
 Lecture 19 (Tue Mar 14):
Self-improving algorithms.
Primary reference:
 Lecture 20 (Thu Mar 16):
Application-specific algorithm selection.
Primary reference: