I cannot load a Spark DataFrame into Elasticsearch with the code below (Elasticsearch is running on localhost). What am I missing?
from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession, SQLContext

conf = SparkConf().setAppName("product_recommendation-server") \
    .set('spark.jars', 'path/to/elasticsearch-hadoop-2.1.0.Beta2.jar') \
    .set('spark.driver.memory', '2250m') \
    .set('spark.sql.shuffle.partitions', '2000')
sc = SparkContext(conf=conf)
spark = SparkSession(sc)
sql_sc = SQLContext(sc)

spark_df = spark.read.csv(path, header=True)
spark_df.write.save(format="org.elasticsearch.spark.sql")
Py4JJavaError: An error occurred while calling o77.save. : java.lang.ClassNotFoundException: Failed to find data source: org.elasticsearch.spark.sql. Please find packages at http://spark.apache.org/third-party-projects.html
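For completeness, this is the shape of the write I am attempting, with the connector options spelled out. I am assuming the elasticsearch-hadoop jar path is correct and must be on the classpath before the SparkContext is created (or passed via `spark-submit --jars`); the index name `my_index/my_type` and the port are placeholders for my local setup, not values from the error above.

```python
from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession

# The jar must be registered BEFORE the SparkContext exists; setting
# spark.jars on an already-running context has no effect. The path
# below is the same placeholder as in my config.
conf = (SparkConf()
        .setAppName("product_recommendation-server")
        .set("spark.jars", "path/to/elasticsearch-hadoop-2.1.0.Beta2.jar"))
sc = SparkContext(conf=conf)
spark = SparkSession(sc)

df = spark.read.csv(path, header=True)

# es.resource ("index/type") tells the connector where to write;
# "my_index/my_type" is just an example name.
(df.write
   .format("org.elasticsearch.spark.sql")
   .option("es.nodes", "localhost")
   .option("es.port", "9200")
   .option("es.resource", "my_index/my_type")
   .save())
```

My understanding is that the `ClassNotFoundException` means Spark never loaded the jar at all, so either the path is wrong or the config was applied too late, rather than anything being wrong with the write call itself.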