I am setting up a Spark standalone cluster and creating a SparkSession. The following Scala code is used to create the session:
import org.apache.spark.sql.SparkSession

val session = SparkSession.builder()
  .master("spark://master_ip:7077")
  .getOrCreate()
I have also edited spark-env.sh to set SPARK_MASTER_HOST on both the master and the worker machines. But the code does not run and throws the following error/stack trace:
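For reference, the spark-env.sh change is just the single line below, applied on both machines (master_ip is a placeholder for the master's actual address, and the file lives under conf/ in the Spark installation):

```shell
# conf/spark-env.sh on both the master and the worker machines.
# master_ip is a placeholder for the master's real hostname/IP; it should
# match the host part of the spark:// URL used in SparkSession.builder().
export SPARK_MASTER_HOST=master_ip
```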
19/11/23 12:46:53 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://master_ip:7077...
19/11/23 12:46:53 INFO TransportClientFactory: Successfully created connection to /master_ip:7077 after 15 ms (0 ms spent in bootstraps)
19/11/23 12:47:13 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://master_ip:7077...
19/11/23 12:47:33 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://master_ip:7077...
19/11/23 12:47:53 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
19/11/23 12:47:53 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
19/11/23 12:47:53 INFO SparkUI: Stopped Spark web UI at http://localhost:4040
19/11/23 12:47:53 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46455.
19/11/23 12:47:53 INFO NettyBlockTransferService: Server created on localhost:46455
19/11/23 12:47:53 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/11/23 12:47:53 INFO StandaloneSchedulerBackend: Shutting down all executors
19/11/23 12:47:53 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
19/11/23 12:47:53 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
19/11/23 12:47:53 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/11/23 12:47:53 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, localhost, 46455, None)
19/11/23 12:47:53 INFO MemoryStore: MemoryStore cleared
19/11/23 12:47:53 INFO BlockManager: BlockManager stopped
19/11/23 12:47:53 INFO BlockManagerMasterEndpoint: Registering block manager localhost:46455 with 1929.9 MB RAM, BlockManagerId(driver, localhost, 46455, None)
19/11/23 12:47:53 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, localhost, 46455, None)
19/11/23 12:47:53 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, localhost, 46455, None)
19/11/23 12:47:53 INFO BlockManagerMaster: BlockManagerMaster stopped
19/11/23 12:47:53 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/11/23 12:47:53 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
at scala.Predef$.require(Predef.scala:281)
at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:935)
at scala.Option.getOrElse(Option.scala:138)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
at query.rewrite.QueryRewriteDemo1$.main(QueryRewriteDemo1.scala:12)
at query.rewrite.QueryRewriteDemo1.main(QueryRewriteDemo1.scala)
19/11/23 12:47:53 INFO SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
at scala.Predef$.require(Predef.scala:281)
at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:935)
at scala.Option.getOrElse(Option.scala:138)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
at query.rewrite.QueryRewriteDemo1$.main(QueryRewriteDemo1.scala:12)
at query.rewrite.QueryRewriteDemo1.main(QueryRewriteDemo1.scala)
I even checked this solution here, but that doesn't seem to be the problem, as the versions at both ends are the same for me (2.4.4). How can I identify the issue here?
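For completeness, the kind of connectivity check I can run from the driver machine looks like this (a sketch assuming nc and curl are available; master_ip is again a placeholder):

```shell
# Confirm the master's RPC port is reachable from the driver machine.
nc -vz master_ip 7077

# The master's web UI (port 8080 by default) reports the exact URL that
# drivers must use, e.g. "URL: spark://<hostname>:7077". Spark standalone
# matches this string literally, so the driver's master URL has to match
# it exactly, not merely resolve to the same machine.
curl -s http://master_ip:8080 | grep -o 'spark://[^"<]*'
```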