A tale of two streaming CSPs
Speaker: Noah Singer
Date: 17 Nov 2022 17:30-19:00 EST
Location: SEC Level 3 NW Terrace
Food: Chinese
I’ll discuss recent progress on understanding the space complexity of approximating constraint satisfaction problems (CSPs) in various streaming models through the lens of the two simplest CSPs, Max-CUT and Max-DICUT.
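As a concrete reference point (not part of the abstract itself): Max-CUT asks for a bipartition of the vertices maximizing the number of edges crossing the cut, while Max-DICUT counts only directed edges going from the first side to the second. A brute-force toy computation on a small digraph, just to fix the definitions:

```python
from itertools import product

# Toy digraph on 3 vertices; each edge is a (tail, head) pair.
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
n = 3

best_cut = best_dicut = 0
for assign in product([0, 1], repeat=n):  # assign[v] = side of vertex v
    # Max-CUT: an edge counts if its endpoints land on different sides.
    cut = sum(assign[u] != assign[v] for u, v in edges)
    # Max-DICUT: edge (u, v) counts only when u is on side 1 and v on side 0.
    dicut = sum(assign[u] == 1 and assign[v] == 0 for u, v in edges)
    best_cut = max(best_cut, cut)
    best_dicut = max(best_dicut, dicut)

print(best_cut, best_dicut)  # → 3 2
```

Note how every dicut is also a cut but not conversely; the direction constraint is exactly the "asymmetry" the talk exploits.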
I’ll first walk through several prior results in the setting of o(sqrt n)-space algorithms with worst-case input ordering. First, for Max-CUT, I’ll show how natural streaming-to-communication reductions rule out all nontrivial approximations (Kapralov-Khanna-Sudan’15). On the other hand, for Max-DICUT, I’ll discuss a quantity called “total bias”, which can be measured using well-known “norm sketching” algorithms in O(log n) space, and in turn yields a provably optimal 4/9-approximation algorithm (Chou-Golovnev-Velusamy’20, with subsequent simplifications). This highlights the difference between Max-CUT’s inherent “symmetry”, which makes it amenable to strong hardness results, and Max-DICUT’s “asymmetry”, which leaves room for interesting algorithms.
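To illustrate the algorithmic ingredient (a simplified sketch, not the full Chou-Golovnev-Velusamy algorithm, which also normalizes the bias and combines it with an edge count): each vertex's bias is its out-degree minus its in-degree, and the total bias is the l1 norm of this vector, which classical l1 "stable sketches" in the style of Indyk estimate in small space. The Cauchy-matrix sketch below is one standard such estimator; the toy graph and parameters are illustrative choices.

```python
import numpy as np

# Toy digraph; edge (u, v) adds +1 to u's bias and -1 to v's.
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
n = 3

# Exact bias vector: out-degree minus in-degree per vertex.
bias = np.zeros(n)
for u, v in edges:
    bias[u] += 1
    bias[v] -= 1
total_bias = np.abs(bias).sum()  # the l1 norm ||bias||_1

# Streaming l1 estimate via a Cauchy (1-stable) sketch: maintain
# y = S @ bias incrementally, one edge at a time, never storing the graph.
rng = np.random.default_rng(0)
k = 2000                            # number of sketch rows (illustrative)
S = rng.standard_cauchy((k, n))     # in a real stream, generated pseudorandomly
y = np.zeros(k)
for u, v in edges:                  # single pass over the edge stream
    y += S[:, u] - S[:, v]
# Each y[i] is Cauchy-distributed with scale ||bias||_1, and the median
# of |standard Cauchy| is 1, so the sample median estimates the l1 norm.
estimate = np.median(np.abs(y))

print(total_bias)  # → 2.0
```

Here `estimate` concentrates around `total_bias` up to a 1 + O(1/sqrt(k)) factor; the point is that `y` takes O(k log n) bits, independent of the number of edges.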
Finally, we’ll think about extending hardness results beyond the above setting. I’ll argue that, in comparison to Max-CUT, the proof of the Max-DICUT lower bound is seemingly “brittle”, and I’ll sketch how our recent works confirm this by giving improved approximations for Max-DICUT when either (i) the input stream is randomly ordered or (ii) we’re given slightly more than o(sqrt n) space.
Based on joint works with Joanna Boyland, Michael Hwang, Tarun Prasad, Raghuvansh R. Saxena, Madhu Sudan, and Santhoshini Velusamy.