The tensor graphical lasso (Teralasso)

Presented by: 
Alfred Hero (University of Michigan)
Date: 
Tuesday 31st October 2017 - 14:00 to 14:50
Venue: 
INI Seminar Room 1
Abstract: 
Co-authors: Kristjan Greenewald (Harvard University), Shuheng Zhou (University of Michigan), Alfred Hero (University of Michigan)

We propose a new ultrasparse graphical model for representing multiway data, based on a Kronecker sum representation of the process inverse covariance matrix. This statistical model decomposes the inverse covariance into a Kronecker sum of sparse factors, one for each mode of the multiway data.
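
To make this structure concrete, the following is a minimal numerical sketch (our own illustration, not the authors' code; all function and variable names are ours) of a Kronecker-sum precision matrix built from two small sparse factors.

    # For matrix-variate data of size d1 x d2, the precision (inverse covariance)
    # is modeled as Omega = Psi1 (+) Psi2 = kron(Psi1, I_d2) + kron(I_d1, Psi2),
    # so Omega has on the order of d1*d2*(d1 + d2) nonzeros rather than (d1*d2)^2.
    import numpy as np

    def kronecker_sum(psi1, psi2):
        """Kronecker sum of two precision factors."""
        d1, d2 = psi1.shape[0], psi2.shape[0]
        return np.kron(psi1, np.eye(d2)) + np.kron(np.eye(d1), psi2)

    # Small, diagonally dominant (hence positive definite) example factors.
    psi1 = 2.0 * np.eye(4)
    psi1[0, 1] = psi1[1, 0] = -0.5     # one conditional dependency along mode 1
    psi2 = 2.0 * np.eye(3)
    psi2[1, 2] = psi2[2, 1] = -0.4     # one conditional dependency along mode 2
    omega = kronecker_sum(psi1, psi2)  # 12 x 12 sparse precision matrix

Because the identity blocks carry no free parameters, the parameter count grows like d1^2 + d2^2 rather than (d1*d2)^2, which is the sense in which the model is ultrasparse.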

Under the assumption that the multiway observations are matrix-normal, the l1 sparsity-regularized log-likelihood function is convex and admits significantly faster statistical rates of convergence than other sparse matrix-normal methods such as the graphical lasso or the Kronecker graphical lasso.
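
Concretely, the penalized objective can be written in the same form as the graphical lasso objective, but evaluated at a Kronecker-sum precision. The sketch below (our notation, and a possible simplification of the exact formulation used in the talk) spells this out for two factors, with an l1 penalty on the off-diagonal entries of each factor.

    # Sketch of an l1-penalized negative Gaussian log-likelihood evaluated at a
    # Kronecker-sum precision Omega = Psi1 (+) Psi2. S is the sample covariance of
    # the vectorized observations; rho1, rho2 are regularization weights.
    import numpy as np

    def kronecker_sum(a, b):
        return np.kron(a, np.eye(b.shape[0])) + np.kron(np.eye(a.shape[0]), b)

    def penalized_neg_loglik(psi1, psi2, S, rho1, rho2):
        """-log det(Omega) + tr(S @ Omega) + l1 penalties on factor off-diagonals."""
        omega = kronecker_sum(psi1, psi2)
        sign, logdet = np.linalg.slogdet(omega)
        if sign <= 0:
            return np.inf                  # outside the positive definite cone
        off1 = np.abs(psi1 - np.diag(np.diag(psi1))).sum()
        off2 = np.abs(psi2 - np.diag(np.diag(psi2))).sum()
        return -logdet + np.trace(S @ omega) + rho1 * off1 + rho2 * off2

The smooth part is jointly convex in the factors because Omega depends linearly on them, and the l1 terms preserve convexity.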

We specify a scalable composite gradient descent method for minimizing the objective function and analyze both the statistical and the computational convergence rates, showing that the composite gradient descent algorithm is guaranteed to converge at a geometric rate to the global minimizer. We will illustrate the method on several real multiway datasets, showing that we can recover sparse graphical structures in high-dimensional data.
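
As a rough illustration of how a composite (proximal) gradient update works here, the sketch below takes a gradient step on the smooth part of the objective and then soft-thresholds the off-diagonal entries of each factor. It is our own simplification: step-size selection, the identifiability convention for the factor diagonals, and convergence checks are omitted, and all names are ours.

    # One composite gradient update for two factors (illustrative only). The gradient
    # of tr(S Omega) - log det(Omega) with respect to Omega is S - inv(Omega);
    # mapping it back onto each Kronecker factor is a partial trace over the other
    # mode, and the l1 proximal step is soft-thresholding of off-diagonal entries.
    import numpy as np

    def kronecker_sum(a, b):
        return np.kron(a, np.eye(b.shape[0])) + np.kron(np.eye(a.shape[0]), b)

    def soft_threshold_offdiag(psi, tau):
        diag = np.diag(np.diag(psi))
        off = psi - diag
        return diag + np.sign(off) * np.maximum(np.abs(off) - tau, 0.0)

    def composite_gradient_step(psi1, psi2, S, rho1, rho2, step=1e-2):
        d1, d2 = psi1.shape[0], psi2.shape[0]
        grad_full = S - np.linalg.inv(kronecker_sum(psi1, psi2))
        G = grad_full.reshape(d1, d2, d1, d2)
        grad1 = np.einsum('ikjk->ij', G)   # partial trace over mode 2
        grad2 = np.einsum('kikj->ij', G)   # partial trace over mode 1
        psi1_new = soft_threshold_offdiag(psi1 - step * grad1, step * rho1)
        psi2_new = soft_threshold_offdiag(psi2 - step * grad2, step * rho2)
        return psi1_new, psi2_new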
