I'm looking to do some simple convex optimization in Haskell; nothing too fancy, just minimizing linear functions with some quadratic constraints. I've come across HVX and am just wondering if that is the best package for this purpose or if there is a standard alternative. Thanks!
HVX appears to be an unmaintained student project never uploaded to Hackage. Feel free to use it, but don't expect any kind of support. – dfeuer Jul 09 '20 at 17:08
1 Answer
The ad package has several helpful routines, including a gradient descent operation or two. For example:

$ cabal install --package-env . --lib ad
$ ghci
> import Numeric.AD
> take 1 $ drop 10000 $ gradientDescent (\[x] -> (x-1)^2 + 3) [10]
[[1.0000000000377052]]
> conjugateGradientDescent (\[x] -> (x-1)^2 + 3) [10]
[[10.0],[1.0]]
In the call above we pass the function being minimized, with its inputs as a traversable (a list in this case), and an initial starting point [10]. This is the package that underlies the optimization package, and it should provide a nice start even if it doesn't solve your problem outright.
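Since the question mentions quadratic constraints, one common trick is to fold the constraint into the objective as a penalty term, turning the problem into an unconstrained one that gradientDescent can chew on. Below is a minimal self-contained sketch of that idea (hand-written gradients rather than ad, so it runs with no dependencies); the objective x + y on the unit disc, the penalty weight mu, the step size, and the iteration count are all illustrative choices, not recommendations:

```haskell
-- Penalty-method sketch (my own example, not from the answer above):
-- minimize the linear objective x + y subject to x^2 + y^2 <= 1,
-- by minimizing  f(x,y) = x + y + mu * max 0 (x^2 + y^2 - 1) ^ 2
-- with plain fixed-step gradient descent.

mu, step :: Double
mu   = 10     -- penalty weight (illustrative choice)
step = 0.01   -- fixed step size (illustrative choice)

-- Gradient of the penalized objective, worked out by hand:
-- d/dx [mu * v^2] = 4 * mu * v * x, where v = max 0 (x^2 + y^2 - 1).
grad :: (Double, Double) -> (Double, Double)
grad (x, y) =
  let v = max 0 (x * x + y * y - 1)  -- constraint violation, 0 inside the disc
      p = 4 * mu * v
  in (1 + p * x, 1 + p * y)

-- Run a fixed number of gradient-descent steps from a starting point.
descend :: (Double, Double) -> (Double, Double)
descend = go (10000 :: Int)
  where
    go 0 pt = pt
    go n (x, y) =
      let (gx, gy) = grad (x, y)
      in go (n - 1) (x - step * gx, y - step * gy)

main :: IO ()
main = print (descend (0, 0))
```

The true constrained minimum sits at (-1/sqrt 2, -1/sqrt 2) ≈ (-0.707, -0.707); a quadratic penalty lands slightly outside the feasible set (near (-0.72, -0.72) here), and increasing mu pushes the answer closer to the boundary at the cost of a harder optimization landscape. The same penalized objective can be handed directly to gradientDescent from Numeric.AD, which would derive the gradient for you.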

Thomas M. DuBuisson