Faster projection-free convex optimization over the spectrahedron

Research output: Contribution to journal › Conference article › peer-review

Abstract

Minimizing a convex function over the spectrahedron, i.e., the set of all d × d positive semidefinite matrices with unit trace, is an important optimization task with many applications in optimization, machine learning, and signal processing. It is also notoriously difficult to solve at large scale, since standard techniques require computing expensive matrix decompositions. An alternative is the conditional gradient (CG) method (aka the Frank-Wolfe algorithm), which has regained much interest in recent years, mostly due to its application to this specific setting. The key benefit of the CG method is that it avoids expensive matrix decompositions altogether, and simply requires a single eigenvector computation per iteration, which is much more efficient. On the downside, the CG method, in general, converges with an inferior rate. The error for minimizing a β-smooth function after t iterations scales like β/t. This rate does not improve even if the function is also strongly convex. In this work we present a modification of the CG method tailored for the spectrahedron. The per-iteration complexity of the method is essentially identical to that of the standard CG method: only a single eigenvector computation is required. For minimizing an α-strongly convex and β-smooth function, the expected error of the method after t iterations is: (EQUATION PRESENTED) where rank(X) and λ_min(X) denote the rank of the optimal solution and its smallest nonzero eigenvalue, respectively. Beyond the significant improvement in convergence rate, it also follows that when the optimum is low-rank, our method provides a better accuracy-rank tradeoff than the standard CG method. To the best of our knowledge, this is the first result that attains provably faster convergence rates for a CG variant for optimization over the spectrahedron. We also present encouraging preliminary empirical results.
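To make the setting concrete, below is a minimal sketch of the standard conditional gradient (Frank-Wolfe) iteration over the spectrahedron that the abstract uses as its baseline, not the paper's modified method. The key point it illustrates is that the linear minimization step reduces to a single leading-eigenvector computation rather than a full matrix decomposition. The names grad_f, d, and num_iters, and the use of NumPy/SciPy, are illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy.sparse.linalg import eigsh

def frank_wolfe_spectrahedron(grad_f, d, num_iters=100):
    """Standard conditional gradient (Frank-Wolfe) over the spectrahedron
    {X : X symmetric PSD, trace(X) = 1}. Illustrative baseline only; this is
    the method whose beta/t rate the paper improves upon, not the paper's
    variant. grad_f(X) must return a symmetric d x d gradient matrix.
    """
    X = np.eye(d) / d  # feasible starting point: PSD with unit trace
    for t in range(1, num_iters + 1):
        G = grad_f(X)
        # Linear minimization oracle: the minimizer of <G, V> over the
        # spectrahedron is the rank-one matrix v v^T, where v is a unit
        # eigenvector of G for its smallest eigenvalue. This single
        # eigenvector computation replaces a full decomposition.
        _, v = eigsh(G, k=1, which='SA')
        V = np.outer(v[:, 0], v[:, 0])
        eta = 2.0 / (t + 1)  # standard step size giving the beta/t rate
        X = (1 - eta) * X + eta * V  # convex combination stays feasible
    return X
```

For example, for the (hypothetical) objective f(X) = ||X − M||_F² with a symmetric matrix M, one would pass grad_f = lambda X: 2 * (X - M). Each iterate is a convex combination of at most t rank-one matrices, which is why the accuracy-rank tradeoff mentioned in the abstract is relevant for this family of methods.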

Original language: English
Pages (from-to): 874-882
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
State: Published - 2016
Externally published: Yes
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: 5 Dec 2016 - 10 Dec 2016

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
