I am working with the L-BFGS-B method of scipy.optimize.minimize. I know that when a gradient is supplied (via the jac=... argument), the algorithm computes the step size for each iteration via a line search that enforces the Wolfe conditions.
Does anyone know whether there is a way to extract this step size from the algorithm or its results (e.g. to later store it in a CSV file)?
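What I have tried so far: as far as I can tell, the line-search step length alpha is computed inside the compiled L-BFGS-B routine and is not exposed in the `OptimizeResult`. A workaround I considered is to record the iterates with the `callback` argument and compute the norm of the difference between successive iterates. Note this gives the actual step length ||x_{k+1} - x_k|| (i.e. alpha times the norm of the search direction), not alpha itself. The file name `steps.csv` is just my example:

```python
import csv
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

iterates = []

def record(xk):
    # callback is invoked once per iteration with the current parameter vector
    iterates.append(np.copy(xk))

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, jac=rosen_der, method="L-BFGS-B", callback=record)

# step lengths ||x_{k+1} - x_k|| between successive iterates
steps = [np.linalg.norm(b - a) for a, b in zip(iterates[:-1], iterates[1:])]

with open("steps.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["iteration", "step_norm"])
    for i, s in enumerate(steps, start=1):
        writer.writerow([i, s])
```

Is there a cleaner way to get at the step size alpha directly, without reimplementing the line search?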