
Environment:

Apache Hive (version 1.1.0-cdh5.14.2)

I created a table with the DDL below.

create external table test1 (
  v_src_code string,
  d_extraction_date date
)
partitioned by (d_mis_date date)
row format serde 'org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe'
with serdeproperties ("field.delim"="~|")
stored as textfile
location '/hdfs_path/test1'
tblproperties ("serialization.null.format"="");

Then I altered this table by adding one extra column, as below.

alter table test1 add columns (n_limit_id bigint);

This worked perfectly fine.

But recently our cluster was upgraded. The new environment is:

Apache Hive (version 2.1.1-cdh6.3.4)

The same table was created in this new environment. When I run the same alter table statement, I get the error below.

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Error: type expected at the position 0 of '<derived from deserializer>:bigint' but '<' is found. (state=08S01,code=1)
  • I see similar questions asked earlier, but no solution was given. Please see the links: (https://stackoverflow.com/questions/40919610/add-column-to-hive-external-table-error) (https://stackoverflow.com/questions/44171086/how-do-i-add-columns-to-a-table-created-using-serde-when-creating-a-hive-table) (https://community.cloudera.com/t5/Support-Questions/Alter-external-hive-table-fails/td-p/184054) – Satya Nayak Jun 23 '22 at 09:25
  • Agree with the ^^ -- I think it's a fallout from https://issues.apache.org/jira/browse/HIVE-11985, which was implemented in Hive 2.0. You can't alter a table's schema anymore if that schema is based on/derived from a serde. A follow-up JIRA, https://issues.apache.org/jira/browse/HIVE-17713, should make this clear instead of throwing an obscure error. – mazaneicha Jun 23 '22 at 13:14
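
Following up on the HIVE-11985 pointer in the comments above, here is a minimal sketch of two possible workarounds. It assumes the ALTER is being rejected by the metastore's column-type compatibility check on the serde-derived schema; hive.metastore.disallow.incompatible.col.type.changes is a real Hive setting, but depending on the deployment it may need to be set in the metastore's hive-site.xml rather than per session, and neither option is confirmed against 2.1.1-cdh6.3.4.

-- Option 1: relax the metastore's column-type compatibility check, then retry the ALTER.
set hive.metastore.disallow.incompatible.col.type.changes=false;
alter table test1 add columns (n_limit_id bigint);

-- Option 2: because test1 is an external table, dropping it does not delete the
-- files under /hdfs_path/test1, so it can be recreated with the new column included.
drop table test1;
create external table test1 (
  v_src_code string,
  d_extraction_date date,
  n_limit_id bigint
)
partitioned by (d_mis_date date)
row format serde 'org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe'
with serdeproperties ("field.delim"="~|")
stored as textfile
location '/hdfs_path/test1'
tblproperties ("serialization.null.format"="");
-- Existing partitions have to be re-registered after the table is recreated:
msck repair table test1;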

0 Answers