Inference for eigenstructure of high-dimensional covariance matrices

Presented by: 
Jana Jankova
Tuesday 15th May 2018 - 11:00 to 12:00
INI Seminar Room 2
Sparse principal component analysis (PCA) has become one of the most widely used techniques for dimensionality reduction in high-dimensional datasets. The main challenge underlying sparse PCA is to estimate the first vector of loadings of the population covariance matrix under the assumption that only a small number of loadings are non-zero. A vast number of methods have been proposed in the literature for point estimation of the eigenstructure of the covariance matrix. In this work, we study uncertainty quantification and propose methodology for inference and hypothesis testing for individual loadings and for the largest eigenvalue of the covariance matrix. We base our methodology on a Lasso-penalized M-estimator which, despite non-convexity, can be solved by a polynomial-time algorithm such as coordinate or gradient descent. Our results provide theoretical guarantees for asymptotic normality of the new estimators and may be used for valid hypothesis testing and variable selection.
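To illustrate the kind of estimator the abstract describes, the sketch below computes a sparse first loading vector by alternating a power-method step with l1 soft-thresholding on a spiked sample covariance. This is a minimal illustrative scheme, not the speaker's exact Lasso-penalized M-estimator; the function name, the penalty level `lam`, and the synthetic spiked-covariance setup are all assumptions made for the example.

```python
import numpy as np

def sparse_pca_first_loading(S, lam, n_iter=200, tol=1e-8):
    """Sparse first loading of a covariance matrix S via a penalized
    power method: power step, then soft-thresholding, then renormalise.
    (Illustrative l1-penalized scheme, not the seminar's estimator.)"""
    p = S.shape[0]
    v = np.ones(p) / np.sqrt(p)  # generic unit-norm starting vector
    for _ in range(n_iter):
        w = S @ v                                           # power step
        w = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)   # soft-threshold
        norm = np.linalg.norm(w)
        if norm == 0.0:          # lam too large: every loading was killed
            return v
        w /= norm
        if np.linalg.norm(w - v) < tol:
            return w
        v = w
    return v

# Synthetic spiked model: true loading supported on the first 3 coordinates.
rng = np.random.default_rng(0)
p, n = 30, 500
u = np.zeros(p)
u[:3] = 1.0 / np.sqrt(3)
Sigma = np.eye(p) + 4.0 * np.outer(u, u)            # population covariance
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
S = (X.T @ X) / n                                   # sample covariance
v_hat = sparse_pca_first_loading(S, lam=0.1)
print(abs(v_hat @ u))                 # correlation with the true loading
print(int(np.sum(np.abs(v_hat) > 1e-6)))  # number of non-zero loadings
```

The soft-thresholding step is what enforces sparsity: off-support coordinates of `S @ v` are small sampling noise and get shrunk exactly to zero, while the spiked coordinates survive and are amplified by the power iteration.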
