Constrained Low-Rank Matrix Approximations: Theoretical and Algorithmic Developments for Practitioners
Type: Doctorate / Post-doctorate
Keywords: Optimization, low-rank matrix approximations, numerical linear algebra, data mining, computational complexity
Description
Low-rank matrix approximation (LRA) techniques such as principal component analysis (PCA) are powerful tools for the representation and analysis of high-dimensional data, and are used in a wide variety of areas such as machine learning, signal and image processing, data mining, and optimization. In recent years, many variants of LRA have been introduced, using different constraints on the factors and different objective functions to assess the quality of the approximation; e.g., sparse PCA, PCA with missing data, independent component analysis, and nonnegative matrix factorization. Although these constrained LRA models have become very popular and standard in some fields, there is still a significant gap between theory and practice. Our goal is to reduce this gap by attacking the problem in an integrated way, making connections between LRA variants, and by using four very different but complementary perspectives: (1) computational complexity issues, (2) provably correct algorithms, (3) heuristics for difficult instances, and (4) application-oriented aspects.
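To make the distinction between unconstrained and constrained LRA concrete, the following minimal sketch compares a rank-k truncated SVD (optimal in the Frobenius norm by the Eckart-Young theorem) with a nonnegative matrix factorization fitted by standard Lee-Seung multiplicative updates. This is only an illustration of the general setting, not an algorithm developed in this project; the data matrix, the rank k, and the iteration count are arbitrary choices made for the example.

```python
# Sketch: rank-k approximation of a nonnegative data matrix X, first unconstrained
# (truncated SVD), then with nonnegativity constraints on the factors (basic NMF).
# All sizes and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = np.abs(rng.standard_normal((50, 30)))  # nonnegative data so NMF is applicable
k = 5

# Unconstrained LRA: keep the k leading singular triplets.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_svd = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Constrained LRA: nonnegative factors W (50 x k) and H (k x 30), fitted with
# Lee-Seung multiplicative updates for the Frobenius-norm objective ||X - WH||_F.
W = np.abs(rng.standard_normal((X.shape[0], k)))
H = np.abs(rng.standard_normal((k, X.shape[1])))
eps = 1e-10  # avoids division by zero in the updates
for _ in range(500):
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

print("relative error, truncated SVD:", np.linalg.norm(X - X_svd, "fro") / np.linalg.norm(X, "fro"))
print("relative error, NMF:         ", np.linalg.norm(X - W @ H, "fro") / np.linalg.norm(X, "fro"))
```

The SVD error is a lower bound for any rank-k factorization, so the gap between the two printed values gives a rough sense of the price paid for imposing nonnegativity on the factors.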