5

I am trying to create dynamic partitions in Hive using the following code.

SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

create external table if not exists report_ipsummary_hourwise(
ip_address string,imp_date string,imp_hour bigint,geo_country string)
PARTITIONED BY (imp_date_P string,imp_hour_P string,geo_coutry_P string) 
row format delimited 
fields terminated by '\t'
stored as textfile
location 's3://abc';

insert overwrite table report_ipsummary_hourwise PARTITION (imp_date_P,imp_hour_P,geo_country_P)
SELECT ip_address,imp_date,imp_hour,geo_country,
       imp_date as imp_date_P,
       imp_hour as imp_hour_P,
       geo_country as geo_country_P
FROM report_ipsummary_hourwise_Temp;

Here the report_ipsummary_hourwise_Temp table contains the following columns: ip_address, imp_date, imp_hour, and geo_country.

I am getting this error:

SemanticException Partition spec {imp_hour_p=null, imp_date_p=null, geo_country_p=null} contains non-partition columns.

Can anybody suggest why this error occurs?

ArK
rupali

5 Answers

4

Your INSERT SQL uses the column geo_country_P, but the partition column in the target table is named geo_coutry_P. It is missing an "n" in "country".
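A minimal sketch of the fix, assuming the intended spelling is geo_country_P everywhere (all names taken from the question):

```sql
-- Corrected DDL: the partition column is spelled geo_country_P,
-- matching the INSERT's PARTITION clause.
CREATE EXTERNAL TABLE IF NOT EXISTS report_ipsummary_hourwise(
  ip_address string, imp_date string, imp_hour bigint, geo_country string)
PARTITIONED BY (imp_date_P string, imp_hour_P string, geo_country_P string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
LOCATION 's3://abc';
```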

amow
  • this error is something else. even after this name correction same error is coming – rupali Apr 16 '15 at 06:54
  • I encountered a similar issue and the issue was a typo in my `INSERT`'s `PARTITION` clause. The columns listed in the `PARTITION` clause *must* match the names defined in the table create. – Mr. Llama Jan 07 '16 at 21:23
  • I was going to downvote but then my arrogant inner self told me to just do a spell check and I caught the error :) .. – Ali May 03 '17 at 03:55
1

I was facing the same error. It was caused by extra characters present in the file. The best solution is to remove all the blank characters and re-insert the data.

Kishor m n
1

It could also be https://issues.apache.org/jira/browse/HIVE-14032

INSERT OVERWRITE command failed with case sensitive partition key names

There is a bug in Hive which makes partition column names case-sensitive.

For me the fix was to make the column names lower-case both in the table's columns and in the PARTITIONED BY clause of the table definition. (They can both be upper-case too; due to this Hive bug HIVE-14032 the case just has to match.)
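A minimal sketch under that assumption, with the partition column names consistently lower-case in both the DDL and the INSERT (names taken from the question):

```sql
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- Partition columns declared lower-case in the DDL...
CREATE EXTERNAL TABLE IF NOT EXISTS report_ipsummary_hourwise(
  ip_address string, imp_date string, imp_hour bigint, geo_country string)
PARTITIONED BY (imp_date_p string, imp_hour_p string, geo_country_p string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
LOCATION 's3://abc';

-- ...and referenced with the same lower-case names in the INSERT.
INSERT OVERWRITE TABLE report_ipsummary_hourwise
PARTITION (imp_date_p, imp_hour_p, geo_country_p)
SELECT ip_address, imp_date, imp_hour, geo_country,
       imp_date    AS imp_date_p,
       imp_hour    AS imp_hour_p,
       geo_country AS geo_country_p
FROM report_ipsummary_hourwise_Temp;
```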

Tagar
0

It says that while copying the result files to HDFS, the job could not recognize the partition location. What I suspect is that your table is partitioned on (imp_date_P, imp_hour_P, geo_country_P), whereas the job is trying to copy to imp_hour_p=null, imp_date_p=null, geo_country_p=null, which doesn't match. Try checking the HDFS location. The other point I can suggest is not to use the same names for regular columns and partition columns.

Nitin
0

insert overwrite table report_ipsummary_hourwise PARTITION (imp_date_P,imp_hour_P,geo_country_P)
SELECT ip_address,imp_date,imp_hour,geo_country,
       imp_date as imp_date_P,
       imp_hour as imp_hour_P,
       geo_country as geo_country_P
FROM report_ipsummary_hourwise_Temp;

The fields aliased as imp_date_P, imp_hour_P, and geo_country_P should have the same names as the partition columns defined on the report_ipsummary_hourwise table.

Surya
  • As it’s currently written, your answer is unclear. Please [edit] to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers [in the help center](/help/how-to-answer). – Luciana Oliveira Dec 21 '22 at 21:05