I'm testing Spark with Java and ran into a problem when running my program from Eclipse.
The test code is the following:
package projet1;

import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class projet1 {

    public static void main(String[] args) {
        System.out.println("Hello world");
        System.setProperty("hadoop.home.dir", "/home/user1/Spark_Projects");
        Logger.getLogger("org.apache").setLevel(Level.WARN);

        // Run Spark locally, using all available cores
        SparkConf conf = new SparkConf().setAppName("SiravPg").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Load the text file as an RDD of lines
        JavaRDD<String> myRDD = sc.textFile("src/main/ressources/data.txt");

        sc.close();
    }
}
When I run this code, Eclipse doesn't seem to detect my main() function and instead opens a new window asking me to Select a Java Application.

PS: The "System.out.println("Hello world");" statement runs correctly.