21 Feb 2017, 12:00 - 14:00
John Crank - Room 128
Exploring and exploiting new structured classes of covariance and inverse covariance matrices
Seminar organiser: Dr Alex Lewin
Speaker: Heather Battey (Department of Mathematics, Imperial College)
Abstract
Estimation of covariance and inverse covariance (precision) matrices is an essential ingredient in virtually every modern statistical procedure. When the dimension, p, of the covariance matrix is large relative to the sample size, the sample covariance matrix is inconsistent in non-trivial matrix norms, and its non-invertibility renders many techniques in multivariate analysis impossible. Structural assumptions are necessary in order to restrain the estimation error, even if this comes at the expense of some approximation error when the structural assumptions fail to hold. I will introduce new structured model classes for estimation of large covariance and precision matrices. These model classes result from imposing sparsity in the domain of the matrix logarithm. After studying the structure induced in the original and inverse domains, I will introduce estimators of both the covariance and precision matrix that exploit this structure. I will derive the convergence rates of these estimators and show that they achieve a new minimax lower bound over classes of covariance and precision matrices whose matrix logarithm is sparse. The implication of this result is that the estimators are efficient and the minimax lower bound is sharp.
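
To make the log-domain construction concrete, the sketch below (Python with NumPy/SciPy) illustrates the general idea under stated assumptions: sparsify the matrix logarithm of the sample covariance, then map back with the matrix exponential to obtain covariance and precision estimates. The function name, the hard-thresholding rule, and the ridge and threshold parameters are illustrative assumptions, not the estimators studied in the talk:

import numpy as np
from scipy.linalg import logm, expm

def log_sparse_estimators(X, threshold=0.1, ridge=1e-3):
    """Illustrative sketch: covariance/precision estimation via
    sparsity in the matrix-log domain (hypothetical helper, not the
    talk's estimator).

    X : (n, p) data matrix with n observations of p variables.
    """
    S = np.cov(X, rowvar=False)
    # Small ridge keeps S positive definite when p is comparable to n,
    # so the matrix logarithm is well defined.
    S = S + ridge * np.eye(S.shape[0])
    L = logm(S).real                      # matrix logarithm of the sample covariance
    # Impose sparsity in the log domain by hard-thresholding off-diagonal entries.
    L_sparse = np.where(np.abs(L) >= threshold, L, 0.0)
    np.fill_diagonal(L_sparse, np.diag(L))
    Sigma_hat = expm(L_sparse)            # covariance estimate: exp of the sparsified log
    Omega_hat = expm(-L_sparse)           # precision estimate: exp(-log Sigma) = Sigma^{-1}
    return Sigma_hat, Omega_hat

# Example usage on synthetic data
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
Sigma_hat, Omega_hat = log_sparse_estimators(X)

Because the covariance and precision matrices share the same log-domain representation up to sign, a single sparsified logarithm yields both estimators at once, which is one appeal of imposing structure in that domain.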