Anonymous
Lasso and Ridge are both regularization methods that add a penalty term to the loss function in linear regression. Ridge uses the L2 norm (the sum of the squared coefficients) to penalize the coefficients, while Lasso uses the L1 norm (the sum of the absolute values of the coefficients). Lasso can shrink coefficients all the way to zero and so performs feature selection, which Ridge cannot do. Ridge works best when your predictors are collinear, or when some predictors are weak but you still want to keep them all. Lasso is mainly used for feature selection; it works best when you have many predictors and suspect that some of them are irrelevant.
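To make the difference concrete, here is a small sketch using scikit-learn on synthetic data (the alpha values and dataset are arbitrary, just for illustration): Lasso zeroes out the coefficients of the irrelevant predictors, while Ridge keeps all of them with smaller magnitudes.

```python
# Minimal sketch: Ridge shrinks coefficients, Lasso sets some exactly to zero.
# Synthetic data with 10 predictors, only 3 of which are actually informative.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso

X, y = make_regression(n_samples=100, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)

ridge = Ridge(alpha=1.0).fit(X, y)   # L2 penalty: alpha * sum(coef**2)
lasso = Lasso(alpha=1.0).fit(X, y)   # L1 penalty: alpha * sum(|coef|)

print("Ridge coefficients:", np.round(ridge.coef_, 2))   # all nonzero, just shrunk
print("Lasso coefficients:", np.round(lasso.coef_, 2))   # irrelevant ones driven to 0
print("Predictors kept by Lasso:", int(np.sum(lasso.coef_ != 0)))
```

In practice you would pick alpha by cross-validation (e.g. RidgeCV / LassoCV) rather than fixing it at 1.0.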