Abstract
Convex relaxations are a central tool in modern algorithm design, but mathematically analyzing the performance of a convex relaxation has remained difficult.
We develop Fourier analytic methods for the study of convex relaxations,
with special focus on semidefinite programming and the sum-of-squares hierarchy.
The sum-of-squares hierarchy is a meta-algorithm for polynomial optimization that has led to recent breakthroughs in combinatorial optimization and robust statistics.
We are mostly concerned with lower bounds against sum-of-squares: when does it fail to solve a problem?
Barak et al.~\cite{BHKKMP16:PlantedClique} pioneered the use of Fourier analysis to prove lower bounds
against sum-of-squares on average-case problems.
We significantly extend their methods, namely \emph{pseudocalibration} and \emph{graph matrix analysis},
which help us understand when and how the sum-of-squares algorithm succeeds or fails.
We prove concrete lower bounds against sum-of-squares for two problems.

First, we prove that sum-of-squares cannot distinguish between: (1) $m = n^{1.5-\eps}$ independently sampled
Gaussian vectors $v_1, \dots, v_m \in \R^n$, and (2) the same distribution conditioned on the existence of a hidden
vector $h$ such that the $m$ vectors additionally satisfy $\ip{v_i}{h}^2 = 1$.
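To make the two distributions concrete, here is a small sketch (an illustration only, not part of the lower-bound argument) of one natural way to sample them: the planted distribution takes Gaussian vectors and replaces each one's component along a hidden unit vector $h$ with a random sign.

```python
import numpy as np

def null_instance(m, n, rng):
    # the null distribution: m independent standard Gaussian vectors in R^n
    return rng.standard_normal((m, n))

def planted_instance(m, n, rng):
    # the planted distribution: Gaussian vectors conditioned on <v_i, h> = +-1
    # for a hidden unit vector h (one natural sampling procedure)
    h = rng.standard_normal(n)
    h /= np.linalg.norm(h)
    V = rng.standard_normal((m, n))
    signs = rng.choice([-1.0, 1.0], size=m)
    # shift each v_i along h so that its inner product with h becomes +-1
    V += np.outer(signs - V @ h, h)
    return V, h

rng = np.random.default_rng(0)
V, h = planted_instance(200, 50, rng)
# every squared inner product with h equals 1 (up to floating point)
assert np.allclose((V @ h) ** 2, 1.0)
```

The lower bound says that, for $m = n^{1.5-\eps}$, no low-degree sum-of-squares proof can tell these two ensembles apart.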
Via a reduction, this implies a lower bound for certifying ground states of the Sherrington-Kirkpatrick model,
which is a central model in statistical physics. On a technical level, the main challenge
is to handle ``hard constraints'' in the problem; we introduce
several new techniques for studying sum-of-squares in the presence of polynomial equality constraints.
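For orientation, the degree-2 version of this certification task is the familiar spectral bound: for symmetric $W$, $x^\top W x \le n \cdot \lambda_{\max}(W)$ for all $x \in \{\pm 1\}^n$. The following sketch (an illustration with a small Gaussian instance, not the thesis's construction) contrasts the true ground-state energy with that basic certificate:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
n = 12
W = rng.standard_normal((n, n))
W = (W + W.T) / np.sqrt(2)  # symmetric Gaussian (GOE-like) coupling matrix

# exact ground-state energy by brute force over the hypercube {-1,+1}^n
best = max(x @ W @ x for x in (np.array(s) for s in product([-1, 1], repeat=n)))

# the degree-2 (spectral) certificate: x^T W x <= n * lambda_max(W)
spectral = n * np.linalg.eigvalsh(W)[-1]
assert best <= spectral + 1e-9
```

The lower bound shows that higher-degree sum-of-squares certificates cannot substantially improve on this spectral bound for the Sherrington-Kirkpatrick model.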

Second, we prove that sum-of-squares cannot certify the size of the maximum independent set in a sparse random graph.
The main challenge here is sparsity; our work is the first to prove sum-of-squares lower bounds
in the sparse regime. We additionally show how to overcome
the failure of pseudocalibration in this setting by making a \emph{connected
truncation} that gets rid of ``global tests''.
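The gap being certified can be seen already in a quick experiment (a sketch for intuition, not part of the proof): a greedy algorithm finds a large independent set in a sparse random graph, and the lower bound says sum-of-squares cannot certify that the true maximum is much smaller than its trivial certificates suggest.

```python
import numpy as np

def sparse_gnp(n, d, rng):
    # adjacency matrix of the sparse random graph G(n, d/n), average degree ~ d
    A = (rng.random((n, n)) < d / n).astype(int)
    A = np.triu(A, 1)
    return A + A.T

def greedy_independent_set(A):
    # greedy lower bound on the maximum independent set, lowest degree first
    n = A.shape[0]
    alive = set(range(n))
    chosen = []
    for v in sorted(range(n), key=lambda v: A[v].sum()):
        if v in alive:
            chosen.append(v)
            alive -= {v} | set(np.flatnonzero(A[v]))
    return chosen

rng = np.random.default_rng(1)
A = sparse_gnp(1000, 5, rng)
S = greedy_independent_set(A)
# verify independence: no edges inside S
assert A[np.ix_(S, S)].sum() == 0
```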

Moving beyond the setting of the sum-of-squares hierarchy, we introduce a new basis of \emph{inner product polynomials}.
For a collection of random vectors $d_1, \dots, d_m \in \R^n$,
these are an orthogonal basis for functions that are orthogonally invariant,
meaning that the function only depends on the angles (inner products) between the $d_i$.
These polynomials have many beautiful combinatorial properties related to the topology of an underlying graph.
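Orthogonal invariance is easy to verify directly: rotating all the vectors by a common orthogonal matrix leaves every inner product, and hence every orthogonally invariant function, unchanged. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 4, 6
D = rng.standard_normal((m, n))  # rows are the vectors d_1, ..., d_m

# a random orthogonal matrix Q via QR decomposition
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

# the Gram matrix of inner products <d_i, d_j> is unchanged by rotation,
# since (D Q)(D Q)^T = D Q Q^T D^T = D D^T
G_before = D @ D.T
G_after = (D @ Q) @ (D @ Q).T
assert np.allclose(G_before, G_after)
```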

Finally, we investigate convex programming relaxations for a fundamental problem in the theory of error-correcting codes, the largest possible rate of a code with linear distance.
The best upper bound on this quantity was obtained by constructing dual solutions
to Delsarte's linear program.
We generalize Delsarte's linear program to a hierarchy of linear programs (related to but simpler
than the sum-of-squares hierarchy).
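The base level of this hierarchy, Delsarte's linear program itself, is small enough to solve numerically. The sketch below (an illustration assuming scipy is available; the variable names are ours) computes the classical LP upper bound on the size of a binary code of length $n$ and minimum distance $d$, using the Krawtchouk polynomials as the dual basis:

```python
import numpy as np
from math import comb
from scipy.optimize import linprog

def krawtchouk(n, k, x):
    # binary Krawtchouk polynomial K_k(x) for length-n codes
    return sum((-1) ** j * comb(x, j) * comb(n - x, k - j) for j in range(k + 1))

def delsarte_bound(n, d):
    # Delsarte's LP: maximize sum_x a_x subject to
    #   a_0 = 1,  a_x = 0 for 0 < x < d,  a_x >= 0,
    #   and sum_x a_x K_k(x) >= 0 for every k
    c = -np.ones(n + 1)  # linprog minimizes, so negate the objective
    A_ub = -np.array([[krawtchouk(n, k, x) for x in range(n + 1)]
                      for k in range(n + 1)], dtype=float)
    b_ub = np.zeros(n + 1)
    bounds = [(1, 1)] + [(0, 0)] * (d - 1) + [(0, None)] * (n - d + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return -res.fun

# LP upper bound on A(7, 3); the Hamming code of size 16 attains it
print(round(delsarte_bound(7, 3)))
```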
The hierarchy is similar in structure to Delsarte's linear program, thus offering hope that
its value can be theoretically analyzed to prove new upper bounds,
although this remains ongoing work.