**Instructor:** Tim
Roughgarden (Office hours: by appointment, Gates 474.)

**Time/location:**
10AM--Noon on Thursdays in Gates 100.

**Course description:**
The best algorithm designers can prove both
possibility and impossibility results --- both upper and lower
bounds. For example, every serious computer scientist knows a
collection of canonical NP-complete problems and how to reduce them
to other problems of interest.
Communication complexity offers a clean theory that is extremely
useful for proving lower bounds for lots of different fundamental
problems. The two biggest goals of the course are:

1. Learn several canonical problems that have proved the most useful
for proving lower bounds (Disjointness, Index, Gap-Hamming, etc.).

2. Learn how to reduce lower bounds for fundamental algorithmic
problems to communication complexity lower bounds.

Along the way, we'll also:

3. Get exposure to lots of cool computational models and some famous
results about them --- data streams and linear sketches, compressive
sensing, space-query time trade-offs in data structures,
sublinear-time algorithms, and the extension complexity of linear
programs.

4. Scratch the surface of techniques for proving communication
complexity lower bounds (fooling sets, discrepancy bounds, corruption
bounds).

**Prerequisites**: undergraduate algorithms (CS161, or equivalent) and
mathematical maturity.

**Course requirements:**
Occasional written exercises. Students taking
the course for a letter grade should also give a 20-minute
presentation on a research paper related to the course.

**Lecture notes**

- Lecture 1 (Streaming: Upper and Lower Bounds) [beta version]
- Lecture 2 (Lower Bounds for 1-Way Communication Complexity) [beta version]
- Lecture 3 (Lower Bounds for Compressive Sensing) [beta version]
- Lecture 4 (Communication Complexity Boot Camp) [beta version]
- Lecture 5 (Extended Formulations) [alpha version]
- Lecture 6 (Data Structures) [beta version]
- Lecture 7 (Algorithmic Game Theory) [beta version]
- Lecture 8 (Property Testing) [beta version]

- Notes on large deviation inequalities (Markov, Chebyshev, and Chernoff-Hoeffding): see Avrim Blum's notes

**Exercises:** We'll be continually adding to the
list here. (Most recent additions on 3/17/15.)

**Detailed schedule and references:**

**Lecture 1:** The streaming model of computation. The AMS algorithm for F_2 estimation. 1-way communication complexity. Some reductions from Disjointness. Supplementary references:

- Alon/Matias/Szegedy, The space complexity of approximating the frequency moments, STOC '96/JCSS '99.
- Lecture notes by Amit Chakrabarti.
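To make the AMS estimator concrete, here is a minimal Python sketch (not from the course notes). Each trial keeps a single counter that adds a random sign per distinct item; the square of the counter is an unbiased estimate of F_2, and averaging trials reduces the variance. For simplicity the sketch uses fully random signs, whereas the actual AMS algorithm needs only 4-wise independent hash functions to keep the space logarithmic; the function name and parameters are illustrative.

```python
import random
from collections import defaultdict

def ams_f2_estimate(stream, num_trials=500, seed=0):
    """Estimate F_2 = sum_i f_i^2, where f_i is item i's frequency.

    Each trial maintains Z = sum_i f_i * s(i) for random signs
    s(i) in {-1, +1}; then E[Z^2] = F_2. We average Z^2 over
    independent trials to reduce the variance.
    """
    rng = random.Random(seed)
    estimates = []
    for _ in range(num_trials):
        # Lazily assign a consistent random sign to each distinct item.
        sign = defaultdict(lambda: rng.choice((-1, 1)))
        z = 0
        for item in stream:
            z += sign[item]
        estimates.append(z * z)
    return sum(estimates) / len(estimates)
```

For the stream [1, 1, 2, 3, 3, 3], the true value is F_2 = 2^2 + 1^2 + 3^2 = 14, and the averaged estimate concentrates around it.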

**Lecture 2:** Lower Bounds on 1-Way Randomized Communication Complexity. Disjointness, Index, and Gap-Hamming. Supplementary references:

- Kremer/Nisan/Ron, On Randomized One-Round Communication Complexity, STOC '95/Computational Complexity '99.
- Woodruff, Optimal space lower bounds for all frequency moments, SODA '04.
- Jayram/Kumar/Sivakumar, The One-Way Communication Complexity of Hamming Distance, Theory of Computing 2008.

**Lecture 3:** Lower Bounds for Compressive Sensing. (Appetizer: Randomized communication complexity of Equality.) References:

- Do Ba/Indyk/Price/Woodruff, Lower Bounds for Sparse Recovery, SODA '11.
- CS264 lecture notes on compressive sensing.
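The Equality appetizer can be sketched in a few lines of Python (an illustration, not the course's own code). Alice interprets her input as the coefficients of a polynomial over a large prime field, evaluates it at a random point, and sends the point and the value; Bob accepts iff his evaluation matches. Two distinct degree-(n-1) polynomials agree at a random point with probability at most (n-1)/P, so O(log n) bits suffice where any deterministic protocol needs n bits. The choice of the Mersenne prime P is an assumption made for this sketch.

```python
import random

P = (1 << 61) - 1  # large prime field size (chosen for this sketch)

def fingerprint(bits, r):
    """Evaluate the polynomial with coefficients `bits` at r, mod P."""
    acc = 0
    for b in bits:
        acc = (acc * r + b) % P
    return acc

def equality_protocol(x, y, rng=random):
    """One-round randomized protocol for Equality on bit vectors.

    Alice sends (r, fingerprint(x, r)), which is O(log n) bits for
    n-bit inputs; Bob accepts iff fingerprint(y, r) matches. Equal
    inputs always accept; unequal inputs falsely accept with
    probability at most (n - 1) / P.
    """
    r = rng.randrange(1, P)
    return fingerprint(x, r) == fingerprint(y, r)
```

With inputs of length 32, the false-positive probability is about 31/2^61, i.e., negligible.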

**Lecture 4:** Boot camp on communication complexity. Deterministic and randomized protocols. Monochromatic rectangles, fooling sets, and covering lower bounds. Distributional complexity and the corruption method. References:

- Most of this material dates back to Yao, Some complexity questions related to distributive computing, STOC '79.
- The canonical reference is the book by
Kushilevitz and Nisan,
*Communication Complexity*. See especially Sections 1.1-1.3, 2.1, and 3.1-3.4.

**Lecture 5:** Extended formulations. Covers and nondeterministic communication complexity. The Face-Vertex problem. Yannakakis's Theorem: the nonnegative rank of the slack matrix characterizes extension complexity. Lower bound on the extension complexity of the correlation polytope. References:

- Yannakakis, Expressing combinatorial optimization problems by Linear Programs, JCSS '91.
- Fiorini/Massar/Pokutta/Tiwary/de Wolf. Exponential Lower Bounds for Polytopes in Combinatorial Optimization, STOC '12.
- Kaibel/Weltge, A Short Proof that the Extension Complexity of the Correlation Polytope Grows Exponentially, Discrete and Computational Geometry (to appear).

**Lecture 6:** The approximate nearest neighbors problem. From communication protocols to dimension reduction in the Hamming cube. Asymmetric communication complexity. The richness method. Data structure lower bounds. Optimality of KOR dimensionality reduction. References:

- Kushilevitz/Ostrovsky/Rabani, Efficient search for approximate nearest neighbor in high dimensional spaces, STOC '98.
- Miltersen/Nisan/Safra/Wigderson, On Data Structures and Asymmetric Communication Complexity, STOC '95.
- Andoni/Indyk/Patrascu, On the Optimality of the Dimensionality Reduction Method, FOCS '06.

**Lecture 7:** Lower bounds in algorithmic game theory. The welfare maximization problem. Multi-party communication complexity (number-in-hand). Lower bounds for multi-party Disjointness. Lower bounds for equilibria. References:

- Further background: CS364A (especially Lectures 2 and 7 (Section 3)) and CS364B (especially Lectures 14-16).
- N. Nisan, The Communication Complexity of Approximate Set Packing and Covering, ICALP 2002.
- Dobzinski/Nisan/Schapira, Approximation Algorithms for Combinatorial Auctions with Complement-Free Bidders, Math of OR 2010.
- Roughgarden, Barriers to Near-Optimal Equilibria, FOCS '14.

**Lecture 8:** Lower bounds in property testing. The edge tester for monotonicity. The case of a large range. Lower bounds from Disjointness. References:

- Goldreich/Goldwasser/Lehman/Ron/Samorodnitsky, Testing Monotonicity, FOCS '98.
- Dodis/Goldreich/Lehman/Raskhodnikova/Ron/Samorodnitsky, Improved Testing Algorithms for Monotonicity, RANDOM '99.
- Blais/Brody/Matulef, Property Testing Lower Bounds via Communication Complexity, CCC '11.

**Related courses:**