I'm using Spark 1.4.0 (PySpark). I have a DataFrame loaded from a Hive table with the following query:
from pyspark.sql import HiveContext

sqlContext = HiveContext(sc)
table1_contents = sqlContext.sql("SELECT * FROM my_db.table1")
When I attempt to insert data from table1_contents, after some transformations, into table2 using the DataFrameWriter insertInto method:
sqlContext.createDataFrame(transformed_data_from_table1).write.insertInto('my_db.table2')
I encounter this error:
py4j.protocol.Py4JJavaError: An error occurred while calling o364.insertInto.
: org.apache.spark.sql.AnalysisException: no such table my_db.table2;
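For context, the full pipeline looks roughly like this; the map step is just a placeholder for my actual transformations, and sc is the usual SparkContext:

from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext()
sqlContext = HiveContext(sc)

# load the source table
table1_contents = sqlContext.sql("SELECT * FROM my_db.table1")

# placeholder for the real transformation logic (operates on the underlying RDD of Rows)
transformed_data_from_table1 = table1_contents.rdd.map(lambda row: row)

# this is the call that raises the AnalysisException
sqlContext.createDataFrame(transformed_data_from_table1).write.insertInto('my_db.table2')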
I know the table exists, because when I run:
print sqlContext.tableNames('my_db')
both table1 and table2 are listed. Can anyone help with this issue?