
Theory Seminar

High-dimensional covariance estimation based on Gaussian graphical models

Shuheng Zhou, U-M

Undirected graphs are often used to describe high-dimensional distributions. Under sparsity conditions, the graph can be estimated using ℓ1-penalization methods. This talk presents the following method, which combines a multiple-regression approach with ideas of thresholding and refitting: first, we infer a sparse undirected graphical model structure via thresholding of each among many ℓ1-norm penalized regression functions; we then estimate the covariance matrix and its inverse using the maximum likelihood estimator. Under suitable conditions, this approach yields consistent estimation of the graphical structure and fast convergence rates with respect to the operator and Frobenius norms for the covariance matrix and its inverse. We also derive an explicit bound for the Kullback-Leibler divergence. This is joint work with Philipp Rütimann, Min Xu, and Peter Bühlmann.
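The structure-estimation step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it regresses each variable on all others with an ℓ1 penalty (here scikit-learn's `Lasso`), thresholds the small coefficients, and symmetrizes the result into an edge set. The tuning values `lam` and `tau` and the toy chain-structured Gaussian are illustrative assumptions; the second step (maximum likelihood refitting of the covariance under the estimated graph) is omitted.

```python
import numpy as np
from sklearn.linear_model import Lasso

def estimate_graph(X, lam=0.1, tau=0.05):
    """Nodewise l1-penalized regressions followed by thresholding.

    lam (lasso penalty) and tau (coefficient threshold) are illustrative
    tuning values, not values prescribed by the talk.
    """
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = np.delete(np.arange(p), j)        # regress X_j on all other variables
        beta = Lasso(alpha=lam).fit(X[:, others], X[:, j]).coef_
        adj[j, others] = np.abs(beta) > tau        # drop coefficients below the threshold
    return adj | adj.T                             # keep an edge if either regression selects it

# Toy data: a chain-structured Gaussian, so the true graph is a path.
rng = np.random.default_rng(0)
p = 5
prec = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))  # tridiagonal precision matrix
cov = np.linalg.inv(prec)
X = rng.multivariate_normal(np.zeros(p), cov, size=500)
A = estimate_graph(X)  # boolean adjacency matrix of the inferred graph
```

The symmetrization rule (an edge if either of the two nodewise regressions selects it) is one common convention; the alternative "and" rule, requiring both regressions to agree, is equally standard.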

Sponsored by

EECS