
Being new to both Java and Spark, I need to use a Cholesky decomposition in my code, and I found something a bit surprising. Spark MLlib offers a CholeskyDecomposition class, but the only methods it exposes are for inverting and solving, and they work from an already decomposed matrix. Am I supposed to use a third-party library to perform the decomposition itself and then go back to Spark?
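
To make it concrete, here is the kind of thing I mean by the third-party route, using Apache Commons Math purely as an illustration (I have not settled on any library, and the small matrix below is made up just so the example is self-contained):

import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.CholeskyDecomposition;
import org.apache.commons.math3.linear.RealMatrix;

// small symmetric positive-definite matrix, just for the example
double[][] values = {
    {4.0, 2.0},
    {2.0, 3.0}
};
RealMatrix A = new Array2DRowRealMatrix(values);

// Commons Math performs the factorization in the constructor
CholeskyDecomposition chol = new CholeskyDecomposition(A);
RealMatrix L = chol.getL();   // lower-triangular factor, A = L * L^T
RealMatrix LT = chol.getLT(); // its transpose

(I realise this class name clashes with Spark's CholeskyDecomposition, so if both were imported I would have to fully qualify one of them.)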

Or am I understanding it wrong, and should I use something like the following, which I have already tried but without success? (Sorry if this is nonsense; as I said, I am a newbie.)

import org.apache.spark.mllib.linalg.CholeskyDecomposition;
import org.apache.spark.mllib.linalg.DenseMatrix;
import org.apache.spark.mllib.linalg.Matrices;

DenseMatrix M = (DenseMatrix) Matrices.dense(size, size, data); // double[] data defined previously, column-major, length size * size
CholeskyDecomposition f = new CholeskyDecomposition();          // this is the part that does not compile
DenseMatrix U = f(M);                                           // not valid Java, I know; this is the call I am missing

But once again, this does not work; as far as I understand, CholeskyDecomposition only offers a constructor, and I cannot use it as a regular class the way I tried.

Thanks for any tip!

NB: I've already checked the Spark documentation thoroughly in case I was missing something, but this seems to be the only relevant class.

Gauthier