LARS (homotopy methods)

Active set methods update many variables at a time. Unfortunately, they are complicated, because of the need to identify which variables are constrained to be zero and which are free to be updated. Active set methods typically add or remove only a few variables at a time, so they can take a long time if they are started far from the solution. These algorithms exploit the fact that one can quickly compute \(\hat{w}(\lambda_k)\) from \(\hat{w}(\lambda_{k-1})\) if \(\lambda_k \approx \lambda_{k-1}\); this is known as warm starting. Sometimes it is computationally more efficient to start at \(\lambda_{max}\) (the smallest value of \(\lambda\) for which the solution is entirely zero) and gradually decrease \(\lambda\) using warm starting; this is called a continuation method or homotopy method.
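A minimal numpy sketch of this idea, using coordinate descent for the lasso as the inner solver (a common choice, though the text does not fix one): each value of \(\lambda\) on a decreasing grid is solved starting from the previous solution. The function names and problem sizes here are illustrative, not from the text.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * |.|
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, w0, n_sweeps=200):
    # Coordinate descent for (1/2n)||y - Xw||^2 + lam * ||w||_1,
    # started from w0 (this is where warm starting enters).
    n, d = X.shape
    w = w0.copy()
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_sweeps):
        for j in range(d):
            r_j = y - X @ w + X[:, j] * w[j]        # partial residual
            rho = X[:, j] @ r_j / n
            w[j] = soft_threshold(rho, lam) / col_sq[j]
    return w

rng = np.random.default_rng(0)
n, d = 50, 10
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.5, 1.0]                       # sparse ground truth
y = X @ w_true + 0.1 * rng.standard_normal(n)

# lambda_max: smallest lambda for which the all-zero vector is optimal
lam_max = np.max(np.abs(X.T @ y)) / n

# Continuation: sweep lambda downward, warm starting each solve
w = np.zeros(d)
for lam in np.geomspace(lam_max, 0.01 * lam_max, 10):
    w = lasso_cd(X, y, lam, w)                      # w from previous lambda
```

Because consecutive \(\lambda\) values have nearby solutions, each inner solve needs far fewer sweeps than a cold start would, and the loop yields the whole regularization path as a by-product.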

LARS (least angle regression and shrinkage) starts with a large value of \(\lambda\), such that only the variable most correlated with the response vector \(y\) is chosen. Then \(\lambda\) is decreased until a second variable is found whose correlation (in magnitude) with the current residual equals that of the first variable, where the residual at step \(k\) is defined as \(r_k = y - X_{:, F_k}w_k\), with \(F_k\) the current active set. Remarkably, one can solve for this new value of \(\lambda\) analytically by using a geometric argument (hence the term "least angle"). This allows the algorithm to jump directly to the next point on the regularization path where the active set changes. This repeats until all the variables have been added.
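The first such jump can be sketched analytically. With unit-norm columns and one active variable, moving the fit by a step \(\gamma\) along that variable shrinks its correlation with the residual from \(C\) to \(C - \gamma\), and the standard LARS formulas \(\gamma = \min_j^{+}\{(C - c_j)/(1 - a_j),\ (C + c_j)/(1 + a_j)\}\) give the exact step at which some inactive variable ties it. The following is an illustrative numpy sketch of that first step only, not a full LARS implementation; all variable names are my own.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 100, 5
X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=0)            # unit-norm columns, as LARS assumes
w_true = np.array([3.0, 1.5, 0.0, 0.0, 0.0])
y = X @ w_true + 0.01 * rng.standard_normal(n)

c = X.T @ y                                # correlations with residual (w = 0)
j1 = int(np.argmax(np.abs(c)))             # first variable to enter
C = np.abs(c[j1])                          # the maximal correlation
u = np.sign(c[j1]) * X[:, j1]              # direction of travel (one active var)
a = X.T @ u                                # inner products with that direction

# Smallest positive step at which an inactive variable ties the active one:
gammas = []
for j in range(d):
    if j == j1:
        continue
    for g in ((C - c[j]) / (1 - a[j]), (C + c[j]) / (1 + a[j])):
        if g > 0:
            gammas.append(g)
gamma = min(gammas)

r = y - gamma * u                          # residual after the jump
corr = np.abs(X.T @ r)                     # two entries now tie at C - gamma
```

After the jump, the two tied variables are both added to the active set and the fit proceeds along their equiangular direction; repeating this argument with larger active sets traces out the whole path.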