
I'm looking to read and execute a text file on GCP using Spark Shell (Scala).

Using GCP (Google Cloud Platform), I've done the following:

  1. Created a Dataproc cluster and named it gcp-cluster-091122.

  2. Created a Cloud Storage bucket and named it gcp-bucket-091122.

  3. Created a simple text file called 1.txt and uploaded it into the recently created GCP bucket, gcp-bucket-091122.

  4. Logged onto the VM instance via SSH-in-browser and entered the command spark-shell to reach the scala> prompt.

From here, how does one read or execute a particular file uploaded to a GCP bucket? I've researched this topic, but I've been unsuccessful.
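For reference, here is a minimal sketch of what I would expect to work at the scala> prompt, assuming the bucket and file names above (Dataproc clusters come with the GCS connector preinstalled, so gs:// paths should resolve directly):

```scala
// At the scala> prompt inside spark-shell on the Dataproc cluster.
// The GCS connector lets Spark treat gs:// like any other filesystem.
val lines = spark.read.textFile("gs://gcp-bucket-091122/1.txt")

lines.show()           // preview the file's contents
println(lines.count()) // number of lines in 1.txt
```

Is this the right approach, or is extra configuration needed before gs:// paths are visible from spark-shell?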


I've also used Cloud Storage FUSE (gcsfuse) to successfully mount the GCP bucket, gcp-bucket-091122, onto a local directory in the SSH session called lfs-directory-091122.

So an additional question would be how to execute a file located in the local file directory using Spark Shell?
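For the local-mount case, this is a hedged sketch of what I would try, assuming the mount point is under my home directory (the actual path is whatever was passed to gcsfuse, and the username below is hypothetical):

```scala
// Assuming gcsfuse mounted gcp-bucket-091122 at ~/lfs-directory-091122.
// file:// reads use the driver's local filesystem, so on a multi-node
// cluster the mount would have to exist on every worker; gs:// avoids that.
val local = spark.read.textFile("file:///home/username/lfs-directory-091122/1.txt")
local.show()
```

And if the file contained Scala code rather than plain text, would spark-shell's :load command (e.g. :load /home/username/lfs-directory-091122/1.txt) be the right way to execute it from the mounted directory?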
