LARS (least angle regression) is a model-selection algorithm: a useful, less greedy refinement of traditional forward-selection methods.
LASSO is a version of ordinary least squares (OLS) that constrains the sum of the absolute values of the regression coefficients (the L1 norm of the coefficient vector). LASSO is an important sparse learning method: it estimates a coefficient for each input variable and uses them to predict the response variable. LASSO fits the model more judiciously than ordinary least squares: it enters variables sequentially, at each step selecting the variable that has the greatest absolute correlation with the current residuals, and it drives the coefficients of uninformative variables exactly to zero. That is, it performs variable selection.
The LASSO form of fitting is very efficient when you have thousands of input variables. Its time complexity is of the same order as an ordinary least squares fit, and for a fixed number of variables it is linear in the number of rows. In addition, LASSO can work in some situations where ordinary least squares cannot, such as when the input variables are multicollinear.
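To make the variable-selection behavior concrete, here is a minimal sketch that solves the same L1-penalized least-squares problem. It uses cyclic coordinate descent rather than the LARS algorithm, and the function name `lasso_cd`, the penalty value `lam`, and the synthetic data are illustrative assumptions, not part of any particular product. What it demonstrates is the key property described above: coefficients of inputs that do not drive the response are set exactly to zero.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for the LASSO objective:
    minimize (1/2n) * ||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with feature j's own contribution removed.
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            # Soft-thresholding: small correlations yield an exact zero,
            # which is how LASSO performs variable selection.
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return b

# Ten candidate inputs, but only columns 0 and 3 actually drive the response.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + 0.01 * rng.normal(size=200)
coef = lasso_cd(X, y, lam=0.1)
```

With this setup, `coef` keeps nonzero estimates only for the two informative columns (slightly shrunk toward zero by the penalty) and zeros out the other eight, illustrating the sequential selection effect in a single fit.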
For more information about LARS, see Bradley Efron, Trevor Hastie, Iain Johnstone, and Robert Tibshirani, "Least Angle Regression," The Annals of Statistics 32(2), 2004.