
I have a column "features" which is a Vector. Is there a way to convert this Vector column to an Array column? I am using Spark 2.3 and Java. Actually, the final objective is to split the Vector into individual columns. Thank you.

1 Answer


This can be done with a UserDefinedFunction. You can define one like this:

import org.apache.spark.ml.linalg.Vector;
import org.apache.spark.sql.api.java.UDF1;
import org.apache.spark.sql.expressions.UserDefinedFunction;
import org.apache.spark.sql.types.DataTypes;
import static org.apache.spark.sql.functions.udf;

// In Java the lambda has to be cast to UDF1 so the correct udf overload is selected.
UserDefinedFunction toarray = udf(
  (UDF1<Vector, double[]>) Vector::toArray,
  DataTypes.createArrayType(DataTypes.DoubleType, false)
);

and then apply it on a Column:

import org.apache.spark.sql.Column;

Column featuresArray = toarray.apply(col("features"));

where the result can be used with select or withColumn.

"the final objective is to split the Vector into individual columns."

That's just a matter of simple indexing; see Spark Scala: How to convert DataFrame[vector] to DataFrame[f1: Double, ..., fn: Double].
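Putting both steps together, here is a minimal, self-contained sketch. The class name, the toy data, the `features_arr` column, and the `f0`..`f2` output names are all illustrative; it assumes the vector size is known up front (3 here) and uses `SQLDataTypes.VectorType()` to declare the vector column in the schema:

```java
import java.util.Arrays;
import java.util.List;

import org.apache.spark.ml.linalg.SQLDataTypes;
import org.apache.spark.ml.linalg.Vector;
import org.apache.spark.ml.linalg.Vectors;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.api.java.UDF1;
import org.apache.spark.sql.expressions.UserDefinedFunction;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.udf;

public class VectorToColumns {

  // Builds a toy DataFrame with a "features" vector column, converts the
  // vector to an array with a UDF, and splits the array into one double
  // column per dimension ("f0", "f1", "f2").
  public static Dataset<Row> run(SparkSession spark) {
    List<Row> rows = Arrays.asList(
        RowFactory.create(Vectors.dense(1.0, 2.0, 3.0)),
        RowFactory.create(Vectors.dense(4.0, 5.0, 6.0)));
    StructType schema = new StructType(new StructField[]{
        new StructField("features", SQLDataTypes.VectorType(), false, Metadata.empty())});
    Dataset<Row> df = spark.createDataFrame(rows, schema);

    // Vector -> array UDF, as defined in the answer above.
    UserDefinedFunction toarray = udf(
        (UDF1<Vector, double[]>) Vector::toArray,
        DataTypes.createArrayType(DataTypes.DoubleType, false));

    Dataset<Row> withArray = df.withColumn("features_arr", toarray.apply(col("features")));

    // The indexing step: one column per dimension via getItem.
    int size = 3;  // assumed known vector size
    Dataset<Row> split = withArray;
    for (int i = 0; i < size; i++) {
      split = split.withColumn("f" + i, col("features_arr").getItem(i));
    }
    return split.drop("features_arr");
  }

  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .master("local[1]").appName("vector-to-columns").getOrCreate();
    run(spark).show();
    spark.stop();
  }
}
```

If the vector size is not known statically, you can read it from the first row (`((Vector) df.first().getAs("features")).size()`) before building the loop.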
