I am trying to generate an uber-jar using the sbt compile and sbt package commands in order to run my application on our remote server, where Spark is installed in standalone mode. I used the deeplearning4j framework to build an LSTM neural network, and I intend to train the model through Spark. However, I ran into an issue when running the spark-submit command:
spark-submit --class "lstm.SparkLSTM" --master local[*] \
  stock_prediction_scala_2.11-0.1.jar \
  --packages org.deeplearning4j:deeplearning4j-core:0.9.1 \
  "/home/hadoop/ScalaWorkspace/Stock_Prediction_Scala/target/lstm_train/prices-split-adjusted.csv" "WLTW"
The problem is that spark-submit seemingly has no effect in my case. It terminates right after I run it, without throwing any errors, and I never see any training progress in the output:
[hadoop@abc lstm_train]$ spark-submit --class "lstm.SparkLSTM" --master local[*] stock_prediction_scala_2.11-0.1.jar --packages org.deeplearning4j:deeplearning4j-core:0.9.1 "/home/hadoop/ScalaWorkspace/Stock_Prediction_Scala/target/lstm_train/prices-split-adjusted.csv" "WLTW"
2018-04-25 17:06:50 WARN Utils:66 - Your hostname, gaion34 resolves to a loopback address: 127.0.0.1; using 192.168.0.173 instead (on interface eno1)
2018-04-25 17:06:50 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-04-25 17:06:51 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-04-25 17:06:51 INFO ShutdownHookManager:54 - Shutdown hook called
2018-04-25 17:06:51 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-c4aee15e-d23b-4c03-95a7-12d9d39f714a
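One thing I am unsure about is the argument order: according to the spark-submit documentation, everything placed after the application jar is passed as an argument to the main class, so my --packages flag would reach lstm.SparkLSTM as a plain string instead of being interpreted by spark-submit. If that is the cause, the command would presumably need to be reordered like this (same jar and arguments, only the option moved before the jar):

```shell
# Hypothetical fix: all spark-submit options (--class, --master, --packages)
# must come BEFORE the application jar; everything after the jar is handed
# to the main class as program arguments.
spark-submit \
  --class "lstm.SparkLSTM" \
  --master local[*] \
  --packages org.deeplearning4j:deeplearning4j-core:0.9.1 \
  stock_prediction_scala_2.11-0.1.jar \
  "/home/hadoop/ScalaWorkspace/Stock_Prediction_Scala/target/lstm_train/prices-split-adjusted.csv" "WLTW"
```
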
My main class: https://gist.github.com/rickyhai11/627d0da8bc93615785382b249618f43b
How can I see the logs generated by the spark-submit command? I tried using --verbose, but it did not help.
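From what I understand, console log verbosity is controlled by conf/log4j.properties in the Spark installation directory (copied from conf/log4j.properties.template); lowering the root category from WARN to INFO or DEBUG should surface more output, e.g. (a sketch, assuming the default template):

```properties
# $SPARK_HOME/conf/log4j.properties
# (cp conf/log4j.properties.template conf/log4j.properties first)
log4j.rootCategory=INFO, console
```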
Has anyone experienced this issue before? Please advise. Thanks.