First, your question is ill-posed because there exist many algorithms to solve the Lasso.
The most popular right now is coordinate descent. Here is the skeleton of the algorithm (without a stopping criterion). I have used numba's @njit decorator because plain Python for loops can be slow.
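For the objective $\frac{1}{2}\|y - X\beta\|_2^2 + \alpha\|\beta\|_1$ (the scaling used in the code below), each coordinate update has a closed form: with residuals $r = y - X\beta$,
$$\beta_j \leftarrow \mathrm{ST}\!\left(\beta_j + \frac{x_j^\top r}{\|x_j\|_2^2},\ \frac{\alpha}{\|x_j\|_2^2}\right), \qquad \mathrm{ST}(x, u) = \mathrm{sign}(x)\,\max(|x| - u, 0),$$
which is exactly what the inner loop computes.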
import numpy as np
from numba import njit


@njit
def ST(x, u):
    """Soft-thresholding of x at level u."""
    return np.sign(x) * np.maximum(np.abs(x) - u, 0.)


@njit
def cd_solver(X, y, alpha, max_iter):
    n_samples, n_features = X.shape
    beta = np.zeros(n_features)
    R = y.copy()  # residuals y - X @ beta
    lc = (X ** 2).sum(axis=0)  # Lipschitz constants for coordinate descent
    for t in range(max_iter):
        for j in range(n_features):
            old = beta[j]
            beta[j] = ST(old + X[:, j].dot(R) / lc[j], alpha / lc[j])
            # keep residuals up to date
            if old != beta[j]:
                R += (old - beta[j]) * X[:, j]
        # I'll leave it up to you to implement a proper stopping criterion
    return beta
X = np.random.randn(100, 200)
y = np.random.randn(100)

if not np.isfortran(X):
    # Fortran (column-major) order makes the X[:, j] column accesses contiguous
    X = np.asfortranarray(X)

# smallest alpha for which the solution is identically 0: ||X^T y||_inf
alpha_max = np.max(np.abs(X.T.dot(y)))

cd_solver(X, y, alpha_max / 2., 100)
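If you want something concrete for the stopping criterion: a simple (if crude) choice is to end once the largest coordinate move in a full pass falls below a tolerance; a cleaner one is to monitor the duality gap. A minimal sketch of the former, where cd_solver_tol and the tol parameter are names I'm introducing here, not part of the code above:

@njit
def cd_solver_tol(X, y, alpha, max_iter, tol=1e-6):
    n_samples, n_features = X.shape
    beta = np.zeros(n_features)
    R = y.copy()  # residuals y - X @ beta
    lc = (X ** 2).sum(axis=0)
    for t in range(max_iter):
        max_delta = 0.  # largest coordinate move in this pass
        for j in range(n_features):
            old = beta[j]
            beta[j] = ST(old + X[:, j].dot(R) / lc[j], alpha / lc[j])
            if old != beta[j]:
                R += (old - beta[j]) * X[:, j]
                max_delta = max(max_delta, abs(beta[j] - old))
        if max_delta < tol:  # no coordinate moved much: stop early (heuristic)
            break
    return beta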
You can also try proximal gradient/ISTA, but in my experience it's way slower than CD.
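For comparison, here is a minimal ISTA sketch for the same objective (ista_solver is a name I'm making up; it reuses ST from above, and the 1/L step size with L the largest squared singular value of X is the textbook choice):

def ista_solver(X, y, alpha, max_iter):
    # step size 1 / L, where L is the Lipschitz constant of the gradient of
    # 0.5 * ||y - X @ beta||^2, i.e. the largest squared singular value of X
    L = np.linalg.norm(X, ord=2) ** 2
    beta = np.zeros(X.shape[1])
    for t in range(max_iter):
        grad = X.T.dot(X.dot(beta) - y)        # gradient of the smooth part
        beta = ST(beta - grad / L, alpha / L)  # proximal step = soft-thresholding
    return beta

ista_solver(X, y, alpha_max / 2., 1000)

One intuition for why CD tends to win here: every ISTA iteration touches all of X, while CD updates one coordinate at a time and keeps the residuals cheap to maintain.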