
I am trying to run PySpark in yarn-client mode. I am not sure what the reason for the failure might be, and I can't interpret the logs correctly.

import sys
from pyspark.sql import SparkSession
from pyspark import SparkContext, SparkConf
conf = SparkConf()
conf.setMaster('yarn-client')
conf.setAppName('SPARK APP')
sc = SparkContext(conf=conf)
# sc= SparkContext.getOrCreate()
# sc.stop()

def mod(x):
    import numpy as np
    return (x, np.mod(x, 2))

rdd = sc.parallelize(range(1000)).map(mod).take(10)  # take() returns a plain Python list
print(rdd)
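
As a side note, the 'yarn-client' master string is deprecated on Spark 2.x; a minimal sketch of the equivalent setup through the session API, assuming client deploy mode and the same app name, would be:

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master('yarn')
         .config('spark.submit.deployMode', 'client')
         .appName('SPARK APP')
         .getOrCreate())
sc = spark.sparkContext  # same SparkContext the snippet above builds by hand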

My code keeps throwing the exception below:

Diagnostics: File file:/home/sw/.sparkStaging/application_1549971830990_0008/__spark_libs__3625483651625656288.zip does not exist
java.io.FileNotFoundException: File file:/home/sw/.sparkStaging/application_1549971830990_0008/__spark_libs__3625483651625656288.zip does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:598)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:811)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:588)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:432)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:251)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:61)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:364)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:362)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:60)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

1 Answer


The issue was solved once I gave the user ownership of, and permissions on, that path:

chown sw /home/sw/.sparkStaging/
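
To double-check that the ownership change took effect, here is a quick sketch in Python (path and user taken from the trace above; the group and exact mode will vary on your machine):

import os
import pwd
import stat

st = os.stat('/home/sw/.sparkStaging')
print(pwd.getpwuid(st.st_uid).pw_name)  # expected: 'sw' after the chown
print(stat.filemode(st.st_mode))        # directory mode, e.g. 'drwxr-xr-x'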