A significance test for LASSO

I list Tibshirani’s paper describing his Least Absolute Shrinkage and Selection Operator (LASSO) method on my Primary Sources page.  Full disclosure: I’ve never had occasion to use it myself, but I’ve seen it used to productive effect.  The LASSO, a sparsity-promoting cousin of ridge regression (it swaps ridge’s L2 penalty for an L1 penalty, which drives many coefficients exactly to zero), is just a really smart and (in many circumstances) useful concept.  Not many practitioners know about it yet, but in 100 years (probably sooner) it will be standard content in stat textbooks.  Because it’s not yet widely known, it’s worth calling attention both to the method and to how it’s implemented.  Tibshirani published his original paper in 1996.  The big deal now is that he and several colleagues have developed a significance test for the predictors the lasso selects.  From Andrew Gelman’s blog:
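To make the sparsity point concrete, here’s a minimal R sketch using the glmnet package (a standard lasso implementation, not the authors’ test code); the simulated data and variable names are mine:

```r
library(glmnet)

set.seed(1)
n <- 100; p <- 20
x <- matrix(rnorm(n * p), n, p)          # 20 candidate predictors (made-up data)
y <- 3 * x[, 1] - 2 * x[, 2] + rnorm(n)  # only the first two actually matter

fit <- glmnet(x, y, alpha = 1)  # alpha = 1 is the lasso (L1 penalty)
cv  <- cv.glmnet(x, y)          # choose the penalty weight by cross-validation

# Unlike ridge, the lasso zeroes out coefficients outright:
coef(fit, s = cv$lambda.min)    # most entries come back exactly 0
```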

 I received the following email from Rob Tibshirani:

Over the past few months I [Tibshirani] have been consumed with a new piece of work [with Richard Lockhart, Jonathan Taylor, and Ryan Tibshirani] that’s finally done. Here’s the paper, the slides (easier to read), and the R package.

I’m very excited about this. We have discovered a test statistic for the lasso that has a very simple Exp(1) asymptotic distribution, accounting for the adaptive fitting. In a sense, it’s the natural analogue of the drop in RSS chi-squared (or F) statistic for adaptive regression!

It also could help bring the lasso into the mainstream. It shows how a basic adaptive (frequentist) inference, difficult to do in standard least squares regression, falls out naturally in the lasso paradigm.
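In practice the test is nearly a one-liner.  If I have the package name right, the R package the email links is covTest, which runs the covariance test on a lars fit of the lasso path.  A minimal sketch as I understand the usage, again with made-up data (the covTest() call is from the package documentation as I recall it):

```r
library(lars)
library(covTest)   # the authors' package for the covariance test (assumed name)

set.seed(43)
x <- matrix(rnorm(40 * 10), ncol = 10)
y <- x[, 1] + rnorm(40)   # simulated data: only predictor 1 is real

fit <- lars(x, y)    # computes the full lasso path
covTest(fit, x, y)   # one test per predictor as it enters the path;
                     # the drop-in-covariance statistic is referred
                     # to an Exp(1) distribution to get p-values
```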

This is a big deal.  If you do regression analyses in your work, then check it out.  (If you’re not familiar with ridge regression, the classic form of regularized regression, then best to check that out first.)
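If ridge regression is new to you, here’s a minimal sketch for comparison, again with glmnet and simulated data of my own; the only change from the lasso fit above is the alpha argument:

```r
library(glmnet)

set.seed(1)
x <- matrix(rnorm(100 * 20), 100, 20)
y <- 3 * x[, 1] - 2 * x[, 2] + rnorm(100)

ridge <- glmnet(x, y, alpha = 0)  # alpha = 0 is ridge (L2 penalty)
lasso <- glmnet(x, y, alpha = 1)  # alpha = 1 is the lasso (L1 penalty)

cv <- cv.glmnet(x, y, alpha = 0)
# Ridge shrinks all 20 coefficients toward zero but keeps them nonzero;
# the lasso sets most of them exactly to zero.
coef(ridge, s = cv$lambda.min)
```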