ECE Seminar: Preconditioning For Consistency In Sparse Inference


Friday, February 22, 2013 - 11:45am to 1:00pm


Karl Rohe, Ph.D., Assistant Professor, Department of Statistics, University of Wisconsin-Madison

Preconditioning is a technique from numerical linear algebra that can accelerate algorithms for solving systems of equations. This talk will discuss how preconditioning can also improve statistical estimation performance in sparse high-dimensional regression (aka compressed sensing). Specifically, the talk will demonstrate how preconditioning can circumvent three stringent assumptions for various types of consistency in sparse linear regression. Given X in R^{n x p} and Y in R^n that satisfy the standard linear regression equation Y = X beta + epsilon, this work demonstrates that even if the design matrix X does not satisfy the irrepresentable condition, the restricted eigenvalue condition, or the restricted isometry property, the design matrix FX often does, where F in R^{n x n} is a specific preconditioning matrix that will be defined in the talk. By computing the Lasso on (FX, FY), instead of on (X, Y), the necessary assumptions on X become much less stringent. Crucially, left multiplying the regression equation by F does not change beta, the vector of unknown coefficients. This talk only requires linear algebra (through the SVD) as a prerequisite. This represents joint work with Jinzhu Jia at Peking University.

Bio: Karl Rohe graduated from the Berkeley Statistics department in 2011, where he worked with Bin Yu on topics related to network inference and the Lasso. Since then, he has been an assistant professor at UW-Madison, continuing to work on problems in network inference and the Lasso.
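The preconditioning step described above can be sketched in a few lines. The specific matrix F is defined in the talk; the sketch below assumes an SVD-based choice, F = U D^{-1} U^T built from the thin SVD X = U D V^T, under which FX = U V^T and all singular values of the preconditioned design equal one. The dimensions, data, and noise level are illustrative.

```python
import numpy as np

# Sketch of preconditioned sparse regression (assumed SVD-based F).
rng = np.random.default_rng(0)
n, p = 50, 200                       # high-dimensional: p > n
X = rng.normal(size=(n, p)) * np.exp(rng.normal(size=p))  # badly scaled columns
beta = np.zeros(p)
beta[:5] = 3.0                       # sparse coefficient vector
Y = X @ beta + 0.1 * rng.normal(size=n)

# Thin SVD: X = U diag(d) Vt, with U (n x n), d (n,), Vt (n x p).
U, d, Vt = np.linalg.svd(X, full_matrices=False)
F = U @ np.diag(1.0 / d) @ U.T       # preconditioner, F in R^{n x n}

# Left-multiplying Y = X beta + epsilon by F leaves beta unchanged;
# one would now run the Lasso on (FX, FY) instead of (X, Y).
FX, FY = F @ X, F @ Y

# The preconditioned design is perfectly conditioned:
print(np.linalg.svd(FX, compute_uv=False))  # all singular values are 1
```

The point of the construction is visible in the last line: whatever the conditioning of X, the singular values of FX are identically one, so conditions on the design that the Lasso needs become far easier to satisfy.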