
I have created external tables by following the below steps:

hive> ADD JAR /usr/lib/hive/lib/hive-serdes-1.0-SNAPSHOT.jar;
hive> set hive.exec.compress.output=true;
hive> set mapred.output.compress=true;
hive> set mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;
hive> set io.compression.codecs=org.apache.hadoop.io.compress.GzipCodec;

hive> CREATE EXTERNAL TABLE json (id BIGINT, created_at STRING, source STRING, favorited BOOLEAN)
      ROW FORMAT SERDE 'com.cloudera.hive.serde.JSONSerDe'
      LOCATION '/user/cloudera/jsonGZ';
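
For reference, this table definition expects one JSON object per line, roughly of this shape (a made-up sample that matches the column list, not my actual data):

{"id": 123456789, "created_at": "Mon Jun 13 20:15:00 +0000 2016", "source": "web", "favorited": false}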

I have compressed my JSON file by executing the below command:

hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming-2.6.0-cdh5.5.0.jar \
  -Dmap.output.compress=true \
  -Dmap.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec \
  -Dmapreduce.output.fileoutputformat.compress=true \
  -Dmapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.GzipCodec \
  -input /user/cloudera/json/ -output /user/cloudera/jsonGZ
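
To check what the streaming job actually wrote into /user/cloudera/jsonGZ, something like the following can be used (the part-file name below is only a placeholder; use whatever file the job produced):

hdfs dfs -ls /user/cloudera/jsonGZ
hdfs dfs -text /user/cloudera/jsonGZ/part-00000.gz | head -n 2

hdfs dfs -text decompresses the gzip output before printing it, so it shows the lines the SerDe will later try to parse.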

Then when I run "select * from json;" I get the below error:

OK
Failed with exception java.io.IOException:org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.map.JsonMappingException: Can not deserialize instance of java.util.LinkedHashMap out of VALUE_NUMBER_INT token at

I have also created one more table using "org.apache.hive.hcatalog.data.JsonSerDe":

hive> ADD JAR /usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar;

hive> CREATE EXTERNAL TABLE json1 (id BIGINT, created_at STRING, source STRING, favorited BOOLEAN)
      ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
      LOCATION '/user/cloudera/jsonGZ';

Then when I run "select * from json1;" I get the below error:

Failed with exception java.io.IOException:org.apache.hadoop.hive.serde2.SerDeException: java.io.IOException: Start token not found where expected

This happens after using "org.apache.hive.hcatalog.core" (hive-hcatalog-core-0.13.0.jar).

Am I missing something? How can I resolve these errors?

  • Please edit your question to include the precise characters used in the commands. In several cases, you have "curly quotes", missing quotes, spaces, and double quotes -- SQL strings must (typically) be enclosed in a single quote like this `'` not `"` or `“`. So it doesn't look like some of these queries could work at all. Then, work backwards by starting without compression. – Tom Harrison Jun 15 '16 at 03:00

1 Answer


Just gzip your files and put them as is (*.gz) into the table location.

gzip filename
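
A minimal end-to-end sketch, assuming the raw file is named tweets.json (a placeholder name) and using the table location from the question:

gzip tweets.json
hdfs dfs -put tweets.json.gz /user/cloudera/jsonGZ/

hive> SELECT * FROM json LIMIT 10;

Hive's default TextInputFormat recognizes the .gz extension and decompresses the file transparently when reading, so no extra compression settings are needed on the query side and the streaming job is not required at all.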
