
I am using Hive 1.2.1 and Spark 1.6, and I am unable to run a simple DELETE against a Hive table from the spark-shell. Since Hive has supported ACID since 0.14, I was hoping the statement would be accepted by Spark as well.

 16/01/19 12:44:24 INFO hive.metastore: Connected to metastore.


 scala> hiveContext.sql("delete from testdb.test where id=2");


 16/01/19 12:44:51 INFO parse.ParseDriver: Parsing command: delete from    
 testdb.test where id=2
 16/01/19 12:44:52 INFO parse.ParseDriver: Parse Completed

 org.apache.spark.sql.AnalysisException:
 Unsupported language features in query: delete from testdb.test where id=2
 TOK_DELETE_FROM 1, 0,12, 12
   TOK_TABNAME 1, 4,6, 12
    testdb 1, 4,4, 12
     test 1, 6,6, 19
     ......

 scala.NotImplementedError: No parse rules for TOK_DELETE_FROM:
 TOK_DELETE_FROM 1, 0,12, 12
 TOK_TABNAME 1, 4,6, 12
  testdb 1, 4,4, 12
  ......
sparkDabbler

1 Answer


You could run Hive via the command line from inside Scala.

import scala.sys.process._
val cmd = "hive -e \"delete from testdb.test where id=2\"" // Your command
val output = cmd.!! // Captures the output

See also: Execute external command

maxymoo
  • The command that you have provided is correct and runs from the command line; however, it gives a ParseException when run from the spark-shell: FAILED: ParseException line 1:3 cannot recognize input near '' '' '' in switch database statement java.lang.RuntimeException: Nonzero exit value: 64 – sparkDabbler Jan 20 '16 at 15:06
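
A likely cause of that ParseException: when a plain String is run via scala.sys.process (at least on the Scala 2.10/2.11 versions bundled with Spark 1.6), the command is tokenized on whitespace without shell-style quote handling, so the escaped quotes reach hive as literal arguments. Passing the command as a Seq sidesteps that tokenization; a minimal sketch, using the same table and predicate as the question:

import scala.sys.process._

// Each element of the Seq is passed to the process as a single argument,
// so the SQL reaches `hive -e` intact, without any quote mangling.
val sql = "delete from testdb.test where id=2"
val output = Seq("hive", "-e", sql).!!   // throws RuntimeException if hive exits non-zero
println(output)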