I am very new to Google DataProc.
We want to run a set of commands via spark-shell or sparkR for testing purposes. Is it possible to connect to the Spark cluster and execute commands interactively in spark-shell or sparkR on Google DataProc?
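For context, this is the kind of interactive exploration we have in mind. The snippet below is only a rough sketch of what we would type into spark-shell (the Scala REPL), assuming we could get a shell session on the cluster; `sc` is the SparkContext that spark-shell normally provides, and `gs://my-bucket/test-data.txt` is a made-up path used purely for illustration.

```scala
// Hypothetical exploratory commands we would like to run interactively in spark-shell.
// The GCS path is a placeholder, not a real bucket.
val lines = sc.textFile("gs://my-bucket/test-data.txt")

// Quick sanity checks on the data -- the kind of back-and-forth iteration
// that is awkward to do through repeated spark-submit jobs.
lines.count()
lines.take(5).foreach(println)

// A small word count to confirm the cluster executes transformations as expected.
val counts = lines.flatMap(_.split("\\s+")).map((_, 1)).reduceByKey(_ + _)
counts.take(10).foreach(println)
```

We would want to run a similar ad-hoc workflow from sparkR as well.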
I checked the documentation, and it seems we can submit jobs using spark-submit, but I can't find any information about running spark-shell or sparkR.