I am running the gapply function on a SparkR DataFrame, which looks like below:
df <- gapply(sp_Stack, function(key, e) {
  # Force a consistent collation locale on each executor
  Sys.setlocale('LC_COLLATE', 'C')
  # Load the packages that calcDecsOnly needs on the workers
  suppressPackageStartupMessages({
    library(Rcpp)
    library(Matrix)
    library(reshape)
    require(parallel)
    require(lubridate)
    library(plyr)
    library(reticulate)
    library(stringr)
    library(data.table)
  })
  calcDecsOnly(e, RequestNumber = RequestNumber,
               ...)
}, cols = "udim", schema = schema3)
The above code runs without any errors when spark.sql.execution.arrow.sparkr.enabled is set to "false", but if I set spark.sql.execution.arrow.sparkr.enabled to "true" the Spark job fails with the error below:
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:392)
at org.apache.spark.sql.execution.r.ArrowRRunner$$anon$2.read(ArrowRRunner.scala:154)
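For completeness, this is roughly how the session is started and the Arrow flag is toggled between the two runs (a minimal sketch; the appName and the session call itself are assumptions, only the Arrow property value is what I actually change):

library(SparkR)

# Hypothetical session setup; only the Arrow property is relevant to the failure.
sparkR.session(
  appName = "gapply-arrow-test",  # assumed name, not from the real job
  sparkConfig = list(
    # With "false" the gapply job above succeeds; with "true" it fails
    # with the java.io.EOFException shown in the stack trace.
    "spark.sql.execution.arrow.sparkr.enabled" = "true"
  )
)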
Environment: Google Cloud Dataproc
Spark version: 3.1.1
Dataproc version: custom image built on 2.0.9-debian10
Any help here is appreciated, thanks in advance.