I want a concat function for Spark SQL, so I have written a UDF as follows:
sqlContext.udf.register("CONCAT", (args: String*) => {
  var out = ""
  for (arg <- args) {
    out += arg
  }
  out
})

sqlContext.sql("select col1, col2, CONCAT(col1, col2) from testtable")
This UDF does not work, however, and the query throws an exception. If I register the UDF with a fixed number of parameters instead, it works fine. I am using Spark 1.3.1 and Scala 2.10.5.
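For comparison, this is roughly what I mean by the fixed-parameter version (a two-argument variant I am calling CONCAT2 here just for illustration); this one registers and runs without any error:

sqlContext.udf.register("CONCAT2", (a: String, b: String) => a + b)  // fixed arity: works
sqlContext.sql("select col1, col2, CONCAT2(col1, col2) from testtable")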
Has anyone faced this issue, or does anyone know a solution for it?