Dimension selection with independent component analysis and its application to prediction
Seminar Room 1, Newton Institute
We consider the problem of selecting the best or most informative dimension for dimension reduction and feature extraction in high-dimensional data. We review current methods and propose a dimension selector based on Independent Component Analysis, which finds the most non-Gaussian lower-dimensional directions in the data. The criterion for choosing the optimal dimension is based on bias-adjusted skewness and kurtosis. We show how this dimension selector can be applied in supervised learning with independent components, in both regression and classification frameworks.
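The core idea of ranking directions by non-Gaussianity can be sketched in a few lines. The following is a minimal illustration, not the talk's actual bias-adjusted estimator: each candidate direction is scored by a Jarque-Bera-style index combining sample skewness and excess kurtosis (the weighting is an assumption for illustration), and the highest-scoring direction is selected.

```python
import numpy as np

def nongaussianity(z):
    """Score a 1-D sample by non-Gaussianity via skewness and kurtosis."""
    z = (z - z.mean()) / z.std()
    skew = np.mean(z**3)          # sample skewness
    kurt = np.mean(z**4) - 3.0    # sample excess kurtosis
    # Jarque-Bera-style combination (illustrative weighting, not the
    # bias-adjusted criterion from the talk)
    return skew**2 + kurt**2 / 4.0

rng = np.random.default_rng(0)
n = 5000
# Toy 3-D data: one clearly non-Gaussian direction (uniform, so
# platykurtic) alongside two Gaussian directions.
X = np.column_stack([
    rng.uniform(-1, 1, n),
    rng.normal(0, 1, n),
    rng.normal(0, 1, n),
])

scores = [nongaussianity(X[:, j]) for j in range(X.shape[1])]
best = int(np.argmax(scores))  # the uniform (non-Gaussian) direction wins
```

In practice one would apply such a score to ICA components rather than raw coordinates, and adjust the skewness and kurtosis estimates for small-sample bias as described in the talk.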