I used Flume to fetch Twitter data, and the data is stored as Avro files in HDFS. I created an Avro schema file, "tweeter.avsc", and saved it in HDFS. But when I try to create an external table with the command below, I get an error.
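For context, the schema file is a standard Avro record schema along these lines (the field names and namespace here are illustrative, not my exact file):

```json
{
  "type": "record",
  "name": "Tweet",
  "namespace": "com.example.flume",
  "fields": [
    {"name": "id", "type": "long"},
    {"name": "user_name", "type": ["null", "string"], "default": null},
    {"name": "text", "type": ["null", "string"], "default": null}
  ]
}
```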
Command:
CREATE EXTERNAL TABLE tweeter
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS
INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION '/user/hive/warehouse/tweets2'
TBLPROPERTIES
('avro.schema.url'='http://localhost:50070/explorer.html#/user/cloudera/tweets.avsc');
The command fails with this error:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException Encountered exception determining schema. Returning signal schema to indicate problem: org.codehaus.jackson.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null') at [Source: sun.net.www.protocol.http.HttpURLConnection$HttpInputStream@7851cf69; line: 1, column: 2])
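I suspect the JsonParseException means Hive received HTML rather than JSON, since a '<' as the first character is exactly what a JSON parser rejects. A minimal illustration of the same failure (using Python's json module rather than Hive's Jackson parser):

```python
import json

# If the URL in avro.schema.url serves an HTML page instead of the raw
# .avsc file, the SerDe sees markup. Any JSON parser rejects input whose
# first character is '<':
try:
    json.loads("<!DOCTYPE html><html>...</html>")
except json.JSONDecodeError as e:
    print(e)  # Expecting value: line 1 column 1 (char 0)
```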
Please help.