
I have imported a table from MySQL into HDFS using the --as-sequencefile option. Then I created a Hive table with a STORED AS SEQUENCEFILE clause and a LOCATION clause pointing to the HDFS directory containing the Sqoop-imported sequence files.

Sqoop import command:

sqoop import --connect jdbc:mysql://sandbox.hortonworks.com:3306/hirw --username root --password hadoop --table stocks -m 2 --as-sequencefile  --target-dir /user/root/output/hirw/sqoopimport/stocks_seq --delete-target-dir

Hive table creation:

CREATE TABLE stocks_sqoop_seq (id int, symbol string, name string, trade_date date, close_price float, volume int, update_time timestamp)  STORED AS SEQUENCEFILE LOCATION '/user/root/output/hirw/sqoopimport/stocks_seq';

When I now try to query the table, it fails with the exception:

Failed with exception java.io.IOException:java.lang.RuntimeException: java.io.IOException: WritableName can't load class: stocks

Am I missing anything?

Ash R

1 Answer


You will have to declare the input and output formats explicitly instead of using the STORED AS SEQUENCEFILE shorthand (the two clauses cannot be combined). Create the table like this:

CREATE TABLE stocks_sqoop_seq (
  id int, symbol string, 
  name string, trade_date date, 
  close_price float, volume int, 
  update_time timestamp)
STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.SequenceFileInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat'
LOCATION '/user/root/output/hirw/sqoopimport/stocks_seq';
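
As a side note, the "WritableName can't load class: stocks" message comes from the key/value class names that are recorded in every sequence file's header; Sqoop writes its generated record class (here named after the table, stocks) as the value class. If you want to confirm what a given file claims as its classes, the header can be parsed directly. Below is a minimal sketch, not a full implementation: it assumes class names shorter than 128 bytes (Hadoop's variable-length int length prefix is a single byte in that range), and the demo bytes are fabricated for illustration, not taken from a real Sqoop output file.

```python
def sequence_file_classes(header: bytes):
    """Parse the key and value class names from a Hadoop
    SequenceFile header.

    Layout (simplified): 3-byte magic b"SEQ", 1-byte version,
    then two length-prefixed UTF-8 class names. This sketch only
    handles names shorter than 128 bytes, where the Hadoop vint
    length prefix occupies a single byte."""
    if header[:3] != b"SEQ":
        raise ValueError("not a SequenceFile: bad magic")
    version = header[3]
    pos = 4
    names = []
    for _ in range(2):  # key class name, then value class name
        length = header[pos]          # single-byte vint (len < 128)
        pos += 1
        names.append(header[pos:pos + length].decode("utf-8"))
        pos += length
    return version, names[0], names[1]


# Fabricated header imitating what a Sqoop import of table `stocks`
# might write (key/value classes are assumptions for the demo):
key_cls = b"org.apache.hadoop.io.LongWritable"
val_cls = b"stocks"
fake = b"SEQ\x06" + bytes([len(key_cls)]) + key_cls \
       + bytes([len(val_cls)]) + val_cls
print(sequence_file_classes(fake))
# -> (6, 'org.apache.hadoop.io.LongWritable', 'stocks')
```

Seeing a bare class name like 'stocks' as the value class is exactly why Hive fails: that generated class is not on Hive's classpath at query time.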
Aman Mundra