On the Efficient Implementation of the Matrix Exponentiated Gradient Algorithm for Low-Rank Matrix Optimization

Dan Garber, Atara Kaplan

Research output: Contribution to journal › Article › peer-review


Convex optimization over the spectrahedron, that is, the set of all real n × n positive semidefinite matrices with unit trace, has important applications in machine learning, signal processing, and statistics, mainly as a convex relaxation for optimization problems with low-rank matrices. It is also one of the most prominent examples in the theory of first-order methods for convex optimization in which non-Euclidean methods can be significantly preferable to their Euclidean counterparts. In particular, a desirable choice is the matrix exponentiated gradient (MEG) method, which is based on the Bregman distance induced by the (negative) von Neumann entropy. Unfortunately, implementing MEG requires a full singular value decomposition (SVD) computation on each iteration, which is not scalable to high-dimensional problems. In this work, we propose efficient implementations of MEG, with both deterministic and stochastic gradients, which are tailored for optimization with low-rank matrices and use only a single low-rank SVD computation on each iteration. We also provide efficiently computable certificates for the correct convergence of our methods. Chiefly, we prove that, under a strict complementarity condition, the suggested methods converge from a warm-start initialization with rates similar to those of their full-SVD-based counterparts. Finally, we present empirical experiments that both support our theoretical findings and demonstrate the practical appeal of our methods.
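To make the per-iteration cost concrete, the following is a minimal NumPy sketch of the classical MEG update over the spectrahedron that the abstract refers to, i.e., X⁺ = exp(log X − η∇f(X)) / tr(exp(log X − η∇f(X))). The function name, the eigenvalue clipping, and the stability shift are illustrative choices, not the paper's efficient low-rank implementation; note that this version performs a full eigendecomposition on each call, which is exactly the cost the paper aims to avoid.

```python
import numpy as np

def meg_step(X, grad, eta):
    """One full-decomposition matrix exponentiated gradient (MEG) step.

    Returns the Bregman projection, induced by the negative von Neumann
    entropy, of the updated dual variable back onto the spectrahedron:
    X+ = exp(log(X) - eta * grad) / tr(exp(log(X) - eta * grad)).
    """
    # Symmetric eigendecomposition of X to form its matrix logarithm.
    w, V = np.linalg.eigh(X)
    w = np.clip(w, 1e-12, None)      # guard against zero eigenvalues
    log_X = (V * np.log(w)) @ V.T
    # Gradient step in the dual (log) domain.
    M = log_X - eta * grad
    # Matrix exponential via eigendecomposition, with a spectral shift
    # for numerical stability (cancels in the trace normalization).
    mw, mV = np.linalg.eigh(M)
    mw = mw - mw.max()
    E = (mV * np.exp(mw)) @ mV.T
    # Renormalize to unit trace so the iterate stays on the spectrahedron.
    return E / np.trace(E)
```

Each call costs a full n × n eigendecomposition (O(n³)); the paper's contribution is to replace this with a single low-rank SVD per iteration while, under strict complementarity, retaining comparable convergence rates from a warm start.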

Original language: English
Pages (from-to): 2094-2128
Number of pages: 35
Journal: Mathematics of Operations Research
Issue number: 4
State: Published - 2023


  • first-order methods
  • low-rank matrix recovery
  • low-rank optimization
  • matrix exponentiated gradient

ASJC Scopus subject areas

  • General Mathematics
  • Computer Science Applications
  • Management Science and Operations Research


