
When I start PySpark from the command line with the pyspark shell, everything works as expected. However, when I go through Livy, it does not.

I set up the connection using Postman. First, I POST this body to the sessions endpoint:

{
  "kind": "pyspark",
  "proxyUser": "spark"
}
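
For reference, the same request can be made from Python with the requests library; this is only a minimal sketch, assuming Livy listens on its default port 8998 on the master node (adjust the URL for your setup):

import json
import requests

# Assumed Livy address: host taken from the cluster logs, 8998 is Livy's default port.
LIVY_URL = "http://master1.lama.nuc:8998"

# Same body as the Postman request above: start an interactive PySpark session as user "spark".
payload = {"kind": "pyspark", "proxyUser": "spark"}
headers = {"Content-Type": "application/json"}

resp = requests.post(LIVY_URL + "/sessions", data=json.dumps(payload), headers=headers)
resp.raise_for_status()
session = resp.json()
print(session["id"], session["state"])  # e.g. 16 "starting"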

The session spins up, and I can see Spark starting on YARN. However, I get this error in my container log:

18/09/12 15:53:00 ERROR repl.PythonInterpreter: Process has died with 1
18/09/12 15:53:00 ERROR repl.PythonInterpreter: Traceback (most recent call last):
  File "/yarn/nm/usercache/livy/appcache/application_1535188013308_0051/container_1535188013308_0051_01_000001/tmp/3015653701235928503", line 643, in <module>
    sys.exit(main())
  File "/yarn/nm/usercache/livy/appcache/application_1535188013308_0051/container_1535188013308_0051_01_000001/tmp/3015653701235928503", line 533, in main
    exec('from pyspark.shell import sc', global_dict)
  File "<string>", line 1, in <module>
  File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2/python/lib/pyspark.zip/pyspark/shell.py", line 38, in <module>
  File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2/python/lib/pyspark.zip/pyspark/context.py", line 292, in _ensure_initialized
  File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2/python/lib/pyspark.zip/pyspark/java_gateway.py", line 47, in launch_gateway
  File "/usr/lib64/python2.7/UserDict.py", line 23, in __getitem__
    raise KeyError(key)
KeyError: 'PYSPARK_GATEWAY_SECRET'

The output from sessions/XYZ/log is:

{
    "id": 16,
    "from": 0,
    "total": 46,
    "log": [
        "stdout: ",
        "\nstderr: ",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/rsc/livy-api-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/rsc/livy-rsc-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/rsc/netty-all-4.0.29.Final.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/repl/commons-codec-1.9.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/repl/livy-core_2.11-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/repl/livy-repl_2.11-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/lama/lama.main-assembly-0.9.0-spark2.3.0-hadoop2.6.5-SNAPSHOT.jar.",
        "18/09/12 15:52:50 INFO client.RMProxy: Connecting to ResourceManager at master1.lama.nuc/192.168.42.100:8032",
        "18/09/12 15:52:51 INFO yarn.Client: Requesting a new application from cluster with 6 NodeManagers",
        "18/09/12 15:52:51 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (12288 MB per container)",
        "18/09/12 15:52:51 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead",
        "18/09/12 15:52:51 INFO yarn.Client: Setting up container launch context for our AM",
        "18/09/12 15:52:51 INFO yarn.Client: Setting up the launch environment for our AM container",
        "18/09/12 15:52:51 INFO yarn.Client: Preparing resources for our AM container",
        "18/09/12 15:52:51 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/rsc/livy-api-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/rsc/livy-rsc-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/rsc/netty-all-4.0.29.Final.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/repl/commons-codec-1.9.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/repl/livy-core_2.11-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/repl/livy-repl_2.11-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/lama/lama.main-assembly-0.9.0-spark2.3.0-hadoop2.6.5-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Uploading resource file:/tmp/spark-37413ebc-9427-44d8-8a01-c4222eb899f8/__spark_conf__7516701035111969209.zip -> hdfs://master1.lama.nuc:8020/user/livy/.sparkStaging/application_1535188013308_0051/__spark_conf__.zip",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing view acls to: livy",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing modify acls to: livy",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing view acls groups to: ",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing modify acls groups to: ",
        "18/09/12 15:52:53 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(livy); groups with view permissions: Set(); users  with modify permissions: Set(livy); groups with modify permissions: Set()",
        "18/09/12 15:52:57 INFO yarn.Client: Submitting application application_1535188013308_0051 to ResourceManager",
        "18/09/12 15:52:57 INFO impl.YarnClientImpl: Submitted application application_1535188013308_0051",
        "18/09/12 15:52:57 INFO yarn.Client: Application report for application_1535188013308_0051 (state: ACCEPTED)",
        "18/09/12 15:52:57 INFO yarn.Client: ",
        "\t client token: N/A",
        "\t diagnostics: N/A",
        "\t ApplicationMaster host: N/A",
        "\t ApplicationMaster RPC port: -1",
        "\t queue: root.users.livy",
        "\t start time: 1536760377659",
        "\t final status: UNDEFINED",
        "\t tracking URL: http://master1.lama.nuc:8088/proxy/application_1535188013308_0051/",
        "\t user: livy",
        "18/09/12 15:52:57 INFO util.ShutdownHookManager: Shutdown hook called",
        "18/09/12 15:52:57 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-795d9b05-a5ad-4930-ad8b-77034022bc17",
        "18/09/12 15:52:57 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-37413ebc-9427-44d8-8a01-c4222eb899f8",
        "\nYARN Diagnostics: "
    ]
}
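
That output was fetched with a plain GET against the session's log endpoint; a minimal sketch of the call, reusing the assumed Livy URL from above and the session id 16 from the response:

import requests

LIVY_URL = "http://master1.lama.nuc:8998"  # assumed Livy address, as above

# "from" and "size" page through the log lines of session 16.
resp = requests.get(LIVY_URL + "/sessions/16/log", params={"from": 0, "size": 100})
resp.raise_for_status()
for line in resp.json()["log"]:
    print(line)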

What is wrong here? I am using CDH 5.15.0 with Parcels and Spark2. Scala sessions work without any problems.

Follow-up

I changed the deploy mode from cluster to client. The KeyError goes away, but when I try to run even a simple sc.version, I just get "Interpreter died", with no traceback or error whatsoever.
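
For context, sc.version was submitted through the session's statements endpoint; a minimal sketch, again assuming the Livy URL and session id used above:

import json
import requests

LIVY_URL = "http://master1.lama.nuc:8998"  # assumed Livy address, as above
headers = {"Content-Type": "application/json"}

# Submit the snippet as a statement against the interactive session...
resp = requests.post(LIVY_URL + "/sessions/16/statements",
                     data=json.dumps({"code": "sc.version"}), headers=headers)
resp.raise_for_status()
statement_id = resp.json()["id"]

# ...then poll it; in my case the interpreter dies instead of returning a result.
result = requests.get(LIVY_URL + "/sessions/16/statements/%d" % statement_id).json()
print(result["state"], result.get("output"))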

1 Answer


I faced the same issue and solved it by upgrading to Livy 0.5.0.

Apparently CDH 5.15.0 includes a fix for a security vulnerability (CVE-2018-1334): the patched PySpark gateway expects a shared secret in the PYSPARK_GATEWAY_SECRET environment variable, which Livy versions earlier than 0.5.0 do not pass to the Python process, hence the KeyError. Credit goes to Marcelo Vanzin for posting this on the livy-user mailing list.
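
If you want to confirm which version the server is actually running after the upgrade, you can query it directly; a minimal sketch, assuming the same Livy URL as in the question and that your Livy build serves a GET /version endpoint:

import requests

LIVY_URL = "http://master1.lama.nuc:8998"  # assumed Livy address, as in the question

# Newer Livy releases expose a version endpoint; older builds may simply return 404.
resp = requests.get(LIVY_URL + "/version")
resp.raise_for_status()
print(resp.json())  # e.g. {"version": "0.5.0-incubating", ...}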