As the title suggests: has anybody implemented this kind of constrained optimization method (SLSQP) in (py)Spark, in any repository?
The thing is, the project I am working on has several steps and, while nearly all of them can be done natively in Spark, this one presents a bit of a pickle: even though I can code up the "outer" parts of such a method in (py)Spark, the core routine, namely the "slsqp" function, is apparently written in Fortran, which is well beyond my area of expertise (or anybody's in my company, really).
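For reference, here is a minimal sketch of the kind of scipy call I am trying to replicate (a toy problem I made up for illustration: minimize x² + y² subject to x + y = 1, whose solution is x = y = 0.5):

```python
from scipy.optimize import minimize

# Toy constrained problem: minimize x^2 + y^2 subject to x + y = 1.
res = minimize(
    fun=lambda x: x[0] ** 2 + x[1] ** 2,  # objective
    x0=[1.0, 0.0],                        # initial guess
    method="SLSQP",
    constraints=[{"type": "eq", "fun": lambda x: x[0] + x[1] - 1}],
)
print(res.x)  # roughly [0.5, 0.5]
```

This is exactly the piece I cannot run, since the SLSQP method dispatches into scipy's compiled Fortran code.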
P.S.: I cannot install scipy on the cluster, as the company I am at right now will most likely not allow it (at least not in a timely manner).