I'm having issues starting Apache Spark via the start-master.sh script. I've searched around and can't seem to find anyone who has run into this issue before.
Here's the full error I'm getting at startup (the logs show the same error, nothing additional):
failed to launch org.apache.spark.deploy.master.Master:
/usr/local/spark/bin/spark-class: line 87: exec: -X: invalid option
exec: usage: exec [-cl] [-a name] [command [arguments ...]] [redirection ...]
Looking at /usr/local/spark/bin/spark-class, specifically lines 80-87, I see the following:
# The launcher library will print arguments separated by a NULL character, to allow arguments with
# characters that would be otherwise interpreted by the shell. Read that in a while loop, populating
# an array that will be used to exec the final command.
CMD=()
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < <("$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@")
exec "${CMD[@]}"
I'm not very familiar with "exec", but I definitely don't see any "-X" flag being passed to it as an option here.
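One way I thought of to debug this (a sketch, not Spark's actual code; the launcher output is simulated below with `printf '%s\0'`, and the java path and classpath are placeholders) is to build `CMD` the same way spark-class does, but print it instead of exec'ing it:

```shell
#!/usr/bin/env bash
# Debugging sketch: populate CMD exactly the way spark-class does, but print
# the array instead of exec'ing it. The NULL-separated input is simulated here
# with printf; in the real script it comes from org.apache.spark.launcher.Main.
# "/usr/bin/java" and "/some/classpath" are placeholder values.
CMD=()
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < <(printf '%s\0' /usr/bin/java -cp /some/classpath org.apache.spark.deploy.master.Master)

# Print one element per line, so a stray "-X..." first element is obvious:
printf '%s\n' "${CMD[@]}"
```

In the real script, temporarily swapping `exec "${CMD[@]}"` for `printf '%s\n' "${CMD[@]}"` would show exactly which token starts with `-X`.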
I found an answer here on StackOverflow that details what the Spark start-up process entails, so it looks like my issue is in the hand-off of control from bash to java. Again, I don't know enough about "exec" to fully understand what's happening here or how to even begin fixing this.
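For the error message itself, there is a minimal bash-only way to reproduce it: if the first word handed to `exec` begins with a dash, bash parses it as an option to the `exec` builtin rather than as the command to run:

```shell
# If the first word after exec starts with "-", bash treats it as an option
# to the exec builtin; "-X" is not a valid one, hence "exec: -X: invalid option".
# Run it in a subshell so the failed exec can't affect the current shell.
( exec -X ) 2>&1 | grep -o 'invalid option'   # → invalid option
```

So the error means the first NULL-separated token the launcher printed started with "-X" (e.g. a JVM flag like "-Xmx..." landing first in the array), which would point at the launcher invocation producing unexpected output, not at `exec` itself.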
For what it's worth, I did check to ensure that my JAVA_HOME was set correctly, which I believe it is.
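Roughly, the check I did looks like this (the helper name `check_java_home` and the fallback path are just for illustration, not anything from Spark). Since spark-class builds its `$RUNNER` as `"$JAVA_HOME/bin/java"` when JAVA_HOME is set, the file at that path should exist and be executable:

```shell
#!/usr/bin/env bash
# Sanity-check sketch: verify a candidate JAVA_HOME actually contains an
# executable java, since spark-class uses "$JAVA_HOME/bin/java" as $RUNNER.
# The helper name and fallback path below are illustrative, not Spark's.
check_java_home() {
  if [ -x "$1/bin/java" ]; then
    echo "ok: found executable $1/bin/java"
    "$1/bin/java" -version 2>&1 | head -n 1
  else
    echo "bad: no executable java under '$1'"
  fi
}

check_java_home "${JAVA_HOME:-/usr/lib/jvm/default}"
```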
Has anyone seen this issue before, or know how to start fixing it?