Schedule: Algorithms, Geometry and Learning

Summer Quarter 2016, Stanford University

The reading group is over for this quarter. Thank you all for participating.

Summer Quarter 2016

The reading group will take place in Gates 463 on Tuesdays from 1:30pm to 3:30pm.

Tuesday 6/28: Paris Syminelakis on Metric Embeddings and Stochastic Decompositions.

We will start with a short introduction to the field of metric embeddings, mentioning key questions and results. We will then give the proofs of two basic results: Bourgain's Theorem and (refinements of) Bartal's Probabilistic Tree Approximations. The techniques we will introduce are Frechet embeddings and Padded Decompositions. Time permitting, we may also present the proof of Assouad's Theorem.
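The core of a Frechet embedding is simple enough to sketch in a few lines: each coordinate of a point is its distance to a random subset, and Bourgain's construction draws subsets of sizes 2^j for each scale j. A toy sketch (function names and the parameter q, the number of subsets per size, are illustrative, not from any particular source):

```python
import random

def frechet_embedding(points, dist, q=5, seed=0):
    """Toy Bourgain-style Frechet embedding: each coordinate of a point x
    is d(x, S) for a random subset S, with |S| = 2^j and q independent
    subsets drawn for each size that fits in the point set."""
    rng = random.Random(seed)
    n = len(points)
    subsets = []
    j = 1
    while 2 ** j <= n:
        for _ in range(q):
            subsets.append(rng.sample(points, 2 ** j))
        j += 1

    def embed(x):
        # d(x, S) = min over s in S of dist(x, s) is 1-Lipschitz in x,
        # so every coordinate map is non-expansive.
        return [min(dist(x, s) for s in S) for S in subsets]

    return embed
```

Each coordinate is 1-Lipschitz, which gives the non-expansion side of the distortion bound for free; the substance of Bourgain's proof is the lower bound on how much some coordinate must stretch each pair.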

  • Chapter 4 of Jiri Matousek's lecture notes on Metric Embeddings.

  • Lectures 1, 2, and 3 by Harald Räcke.

Tuesday 7/05: Moses Charikar on Approximation Algorithms using Tree Embeddings

There are many problems of interest defined on metric spaces: buy-at-bulk network design, dial-a-ride, min-sum k-clustering, and the k-server problem, to name just a few. For these problems, the best known algorithms often proceed in two steps: (i) construct a tree approximation of the metric space, and (ii) solve the problem (near-)optimally on the tree and map the solution back to the original space. In this talk, we review some of these applications and show how padded decompositions can be used to construct a tree approximation of a metric space.
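For intuition about what a tree approximation looks like, here is a toy version for integer points on the line: a randomly shifted dyadic hierarchy yields a tree metric that dominates the original distances, with low distortion in expectation. This is only an illustration under that one-dimensional assumption; the general construction for arbitrary metrics (e.g., FRT) is what the talk concerns:

```python
import random

def dyadic_tree_metric(points, seed=0):
    """Toy dominating tree metric for integer points on the line, built
    from a randomly shifted dyadic hierarchy (a 2-HST): the cluster at
    level i containing x is the dyadic interval of length 2**i around
    the shifted coordinate x + shift."""
    rng = random.Random(seed)
    span = max(points) - min(points)
    L = max(1, span).bit_length()      # 2**L > span, so one top-level cluster
    shift = rng.uniform(0, 2 ** L)

    def tree_dist(x, y):
        if x == y:
            return 0
        # find the highest level at which x and y fall in different intervals
        for i in range(L, -1, -1):
            if int((x + shift) // 2 ** i) != int((y + shift) // 2 ** i):
                # leaf-to-leaf path through the level-(i+1) ancestor,
                # with edges of weight 2**k at level k
                return 2 * (2 ** (i + 1) - 1)
        raise AssertionError("distinct integers always separate at level 0")

    return tree_dist
```

The tree distance never underestimates the true distance (step (i) above only stretches), so a solution found on the tree maps back to a feasible solution whose cost is controlled by the expected stretch.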

Tuesday 7/12: Michael Kim on epsilon-nets for finite metric spaces

We will survey various notions of epsilon-nets and their use in approximation algorithms. Epsilon-nets refer to a variety of mathematical objects that aim to approximate a set of points – either probabilistically or geometrically – for the sake of analysis or algorithmic efficiency. We will discuss some basic constructions of epsilon-nets and then discuss their applications to problems like clustering and nearest-neighbor search. The session will be technical, but should be self-contained.
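One of the basic constructions the talk alludes to is the greedy epsilon-net for a finite metric space, which can be stated in a few lines (a minimal sketch; the function name is illustrative):

```python
def greedy_eps_net(points, dist, eps):
    """Greedy eps-net: keep a point only if it is farther than eps from
    every net point chosen so far. The result is both an eps-covering
    (every input point is within eps of some net point) and an
    eps-packing (net points are pairwise more than eps apart)."""
    net = []
    for p in points:
        if all(dist(p, c) > eps for c in net):
            net.append(p)
    return net
```

The packing property is what bounds the net's size in doubling spaces, and the covering property is what makes it a faithful proxy for the point set in clustering and nearest-neighbor applications.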

Tuesday 7/19: Ofir Geri on Metric Embeddings through Measured Descent

A basic idea for obtaining metric embeddings is to (i) pick a distance scale and construct coordinates that "witness" distances at that scale, and (ii) do this for all scales. The issue with this approach is that the dimension must depend logarithmically on the aspect ratio, and as a result our bounds on the distortion of the embedding deteriorate. In this talk, we will present a technique (measured descent) that overcomes these hurdles by constructing coordinates that witness distances at multiple scales simultaneously. Using this technique, we show how to embed any metric space into Lp with (worst-case) optimal distortion.

Tuesday 7/26: Nima Anari on Euclidean Distortion of Negative Type Metrics

We will survey progress on the Sparsest Cut problem and its deep connections to metric embeddings. We will first see how Sparsest Cut can be formulated as an L1 embedding problem; as a warm-up, we go over the result of Leighton-Rao, an O(log n)-approximation for Sparsest Cut, and prove it by a simple reduction to Bourgain's theorem. We then discuss the stricter relaxation of Arora-Rao-Vazirani and their structural result about metrics of negative type. Finally, we discuss Generalized Sparsest Cut and sketch Arora-Lee-Naor's theorem on embedding metrics of negative type into L2, which also implies embeddings into L1. At a very high level, ALN use an improved structural result about metrics of negative type and combine it with gluing techniques from measured descent.

Tuesday 8/02: Joshua Alman on Locality Sensitive Filtering

Locality-Sensitive Hashing (LSH) was introduced in a classical work by Indyk and Motwani as a general framework for solving Nearest Neighbor Search (NNS) problems. Recently, generalizations of LSH have been introduced which are better suited to solving NNS in certain spaces. After discussing LSH, we will survey two such frameworks: Asymmetric LSH, with applications to Maximum Inner Product Search and Maximum Containment Search, and Locality-Sensitive Filtering, with applications to NNS on the sphere. The session should be self-contained.
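A concrete instance of the LSH framework, relevant to NNS on the sphere, is hyperplane hashing (SimHash): each hash bit records which side of a random hyperplane a point falls on, and two points collide on a bit with probability 1 - angle/pi. A minimal sketch (function names are illustrative):

```python
import random

def random_hyperplanes(dim, k, seed=0):
    """Draw k random hyperplanes (Gaussian normal vectors) for
    hyperplane LSH on dim-dimensional vectors."""
    rng = random.Random(seed)
    return [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(k)]

def simhash(x, planes):
    """Hash x to k bits: bit j records the side of hyperplane j that x
    falls on, so nearby vectors agree on most bits."""
    return tuple(1 if sum(a * b for a, b in zip(h, x)) >= 0 else 0
                 for h in planes)
```

With enough bits, a vector at a small angle from a query agrees with it on far more bits than a near-orthogonal one, which is exactly the collision-probability gap that LSH-based NNS exploits.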

Tuesday 8/09: Paris Syminelakis on Advances in Metric Embeddings

Traditional notions of metric embeddings concern the maximum distortion incurred on distances. In this talk I will present techniques that allow us to obtain tighter quantitative control of the distortion, leading to embeddings with constant average distortion and logarithmic maximum distortion (both optimal). The talk will be self-contained. In the first part I will give a brief overview of the techniques we have seen so far and their shortcomings. This will provide intuition about the new method I will describe in the second part of the talk. The talk is based on the paper "Advances in Metric Embedding Theory" by Abraham, Bartal, and Neiman, published in Advances in Mathematics, 2011.

Tuesday 8/16: Weihao Kong on Isometric Sketching of any Set via the Restricted Isometry Property

In this talk, we will show that a matrix which achieves low embedding distortion only for sparse vectors can be transformed into one that works for any set, simply by multiplying each of its columns by a random sign. It was previously known that after such a "random sign" transformation, the embedding obeys the JL lemma. Building on that, we apply a generic chaining argument to obtain the result for an arbitrary (and potentially continuous) set.
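The column-sign transformation itself is a one-liner per column. Below is an illustrative sketch: since constructing a certified RIP matrix is beyond a few lines, a random Gaussian matrix stands in for it (an assumption for illustration only), and we check that the sign-randomized, rescaled sketch approximately preserves the norm of a fixed vector:

```python
import random, math

def sign_randomize(A, seed=0):
    """Multiply each column of A by an independent random sign -- the
    transformation discussed in the talk (applied here to a stand-in
    matrix rather than a certified RIP matrix)."""
    rng = random.Random(seed)
    signs = [rng.choice((-1, 1)) for _ in A[0]]
    return [[s * a for s, a in zip(signs, row)] for row in A]

def embed(A, x):
    """Apply the m x n sketch A to x, scaled by 1/sqrt(m) so that
    squared norms are preserved in expectation."""
    m = len(A)
    return [sum(a * xi for a, xi in zip(row, x)) / math.sqrt(m)
            for row in A]
```

Sign flips leave the sparse-vector distortion of A untouched (each column keeps its norm and pairwise column inner products only change sign), which is why the transformation costs nothing on the RIP side while buying the JL-type guarantee.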

Participants

Professors

Moses Charikar, Jacob Fox, Lester Mackey, Greg Valiant, Ryan Williams, Lexing Ying.

Students

Joshua Alman, Amir Ahmadinejad, Nima Anari, Vivek Bagaria, Yair Carmon, Ofir Geri, Jackson Gorham, Albert Gu, Neha Gupta, Arun Jambulapati, Carolyn Kim, Michael Kim, Weihao Kong, Andrea Lincoln, Rahul Makhijani, Bryan McCann, Yan Michalevsky, Victor Minden, Steve Mussmann, Lan H. Nguyen, Ashwin Paranjape, Vatsal Sharan, Nolan Skochdopole, Sahaana Suri, Paris Syminelakis, Pratiksha Thaker, James Thomson, Carrie Wu, Jiyan Yang, Anthony L. Zhang, Hongyang Zhang, Marinka Zitnik.

Similar Seminars in the past.