
When I execute:

sqoop import --connect jdbc:mysql://localhost/testdb --table test --hive-table test --hive-import -m 1 

I get the following error message:

13/04/21 16:42:50 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 1
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:364)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:314)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:226)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:415)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:238)

I tried to Google it but found no solution. I have Hadoop set up locally in pseudo-distributed mode. Hive is running fine ... I used the embedded metastore. Any ideas on how to fix this? Thanks, Diddy

Diddy
  • Please share the entire log output generated with the --verbose parameter. – Jarek Jarcec Cecho Apr 22 '13 at 02:52
  • Please find the log here: [pastebin](http://www.pastebin.com/HRMZiygh) – Diddy Apr 23 '13 at 16:25
  • ... some more info: importing from MySQL directly to HDFS works without problems; it's only when I try to import into Hive that I get this error. So my feeling is that this has nothing to do with a missing MySQL driver or insufficient rights on the MySQL table. – Diddy Apr 23 '13 at 16:39

4 Answers


Based on the log, it seems that you're hitting the following exception:

13/04/22 18:34:44 INFO hive.HiveImport: Exception in thread "main" java.lang.NoSuchMethodError: org.apache.thrift.EncodingUtils.setBit(BIZ)B

I've seen this issue before when users were running "incompatible" versions of HBase and Hive. The incompatibility can occur on multiple levels, but this particular one arises when HBase and Hive use different Thrift versions. As Sqoop adds both the HBase and Hive jars to the classpath, only one Thrift version can be active, and thus the "second" tool (usually Hive) does not work properly.

Did you by any chance install both HBase and Hive on the box where you're executing Sqoop? If so, can you check which Thrift version each project is using? Just search for "*thrift*.jar". If the answer to both questions is yes, then you could set HBASE_HOME to something non-existent to force Sqoop not to load HBase's version of Thrift.
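For example, to check which Thrift jars each project ships (paths are illustrative and assume HBASE_HOME and HIVE_HOME are set in your environment):

    find $HBASE_HOME/lib $HIVE_HOME/lib -name "*thrift*.jar"

And a minimal sketch of the workaround, pointing HBASE_HOME at an arbitrary non-existent path just for the one invocation:

    HBASE_HOME=/path/that/does/not/exist sqoop import --connect jdbc:mysql://localhost/testdb --table test --hive-table test --hive-import -m 1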

Jarek Jarcec Cecho
  • Many thanks for your help! Yes, I have both HBase and Hive set up in my local dev environment. My search revealed that hbase-0.94.6.1 uses libthrift-0.8.0.jar whereas hive-0.10.0 uses libthrift-0.9.0.jar. As per your recommendation I set HBASE_HOME to a non-existent path and ... excellent, now everything is working! Thanks a lot again for your massive help! – Diddy Apr 24 '13 at 07:22
  • May I know how to set HBASE_HOME to a non-existent path? – Chandu Feb 27 '19 at 06:22

I agree with Jarek Jarcec Cecho's answer.

Another workaround is to copy the libthrift*.jar files from $HIVE_HOME/lib to $SQOOP_HOME/lib.
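A rough sketch of that copy, assuming both environment variables point at your installations:

    cp $HIVE_HOME/lib/libthrift-*.jar $SQOOP_HOME/lib/

This way Sqoop picks up Hive's (newer) Thrift jar from its own lib directory instead of HBase's.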

Arun_kg

There might be a permission issue on the MySQL table, or you may be missing the MySQL connector jar in Hive's lib directory. Please share the entire output of the command.
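If the connector jar does turn out to be missing, dropping it into Hive's lib directory would look roughly like this (the jar name is illustrative; use whatever version you have):

    cp mysql-connector-java-5.1.25-bin.jar $HIVE_HOME/lib/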

Sourav Gulati
  • Thanks for your reply. Please find the log extract here: http://pastebin.com/HRMZiygh. I can import to HDFS without problems, just not to Hive, so the problem isn't a missing driver or MySQL permissions. – Diddy Apr 22 '13 at 17:46

I faced the same error. It is definitely related to the libthrift-0.8.0.jar and libthrift-0.9.0.jar in HBase and Hive respectively. So I simply pointed my HBASE_HOME to a non-existent location, re-sourced my bash profile, and then it worked fine.

But here is a problem I'm now facing with Hive: the table whose import failed is reported as already existing, even though it actually doesn't. I guess the metastore db files had already been written! So, if you don't want to delete the whole metastore_db folder, the fix is to change the name of the table and redo the command.
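As an alternative to renaming the table, dropping the stale entry from the Hive CLI should avoid touching the metastore_db folder at all (table name taken from the question; verify it really is the leftover one before dropping):

    hive -e "DROP TABLE IF EXISTS test;"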

user3175547