at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:563)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:83)
... 17 more
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.ql.io.orc.OrcStruct cannot be cast to org.apache.hadoop.io.BinaryComparable
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.doDeserialize(LazySimpleSerDe.java:166)
at org.apache.hadoop.hive.serde2.AbstractEncodingAwareSerDe.deserialize(AbstractEncodingAwareSerDe.java:71)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.readRow(MapOperator.java:149)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.access$200(MapOperator.java:113)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:554)
... 18 more
2019-09-19 11:50:29,860 [INFO] [TezChild] |task.TezTaskRunner|: Encounted an error while executing task: attempt_1568591126479_21189_1_01_000000_2
java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable {140725, 222117, 11A2, YYYYYYNN , F, R, SeLect Advntg RX OB, N, MATERNITY , I, 0.00, 04, N, N, Y, Y, Y, N, 003, A, B, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , P, N, S, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , 00, 0P7001, N, SB2, SeLectBL ADvntg RX OB , MATERNITY , 20100101, Y, N, N, , , N, 99, N, 00, Y, 12, N, 0.00, 501, , , , , , , , , , , , , , , , , , , , , , , , , , , 020, , , , , , , , , , , , , 01, 02, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , Y, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , Y, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , 20130715, , , , , 001, , , 0.00, N, N, 99, , 00, , 20100101, , I, 900, 900, 900, DOC.00000000.PRIM, 00, 000, 000, , 000, 000, 000, 0101, 0104, 0204, , , , , , , , , , , , 20100101, 11U2, , 00000000, 00000000, DOC.00000000.PRIM, , , , }
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:173)
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:139)
at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:347)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:194)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:185)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:185)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:181)
at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable {140725, 222117, 11A2, YYYYYYNN , F, R, SeLect Advntg RX OB, N, MATERNITY , I, 0.00, 04, N, N, Y, Y, Y, N, 003, A, B, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , P, N, S, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , 00, 0P7001, N, SB2, SeLectBL ADvntg RX OB , MATERNITY , 20100101, Y, N, N, , , N, 99, N, 00, Y, 12, N, 0.00, 501, , , , , , , , , , , , , , , , , , , , , , , , , , , 020, , , , , , , , , , , , , 01, 02, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , Y, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , Y, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , 20130715, , , , , 001, , , 0.00, N, N, 99, , 00, , 20100101, , I, 900, 900, 900, DOC.00000000.PRIM, 00, 000, 000, , 000, 000, 000, 0101, 0104, 0204, , , , , , , , , , , , 20100101, 11U2, , 00000000, 00000000, DOC.00000000.PRIM, , , , }
at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:91)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:68)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:325)
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:150)
... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable {140725, 222117, 11A2, YYYYYYNN , F, R, SeLect Advntg RX OB, N, MATERNITY , I, 0.00, 04, N, N, Y, Y, Y, N, 003, A, B, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , P, N, S, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , 00, 0P7001, N, SB2, SeLectBL ADvntg RX OB , MATERNITY , 20100101, Y, N, N, , , N, 99, N, 00, Y, 12, N, 0.00, 501, , , , , , , , , , , , , , , , , , , , , , , , , , , 020, , , , , , , , , , , , , 01, 02, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , Y, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , Y, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , 20130715, , , , , 001, , , 0.00, N, N, 99, , 00, , 20100101, , I, 900, 900, 900, DOC.00000000.PRIM, 00, 000, 000, , 000, 000, 000, 0101, 0104, 0204, , , , , , , , , , , , 20100101, 11U2, , 00000000, 00000000, DOC.00000000.PRIM, , , , }
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:563)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:83)
... 17 more
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.ql.io.orc.OrcStruct cannot be cast to org.apache.hadoop.io.BinaryComparable
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.doDeserialize(LazySimpleSerDe.java:166)
at org.apache.hadoop.hive.serde2.AbstractEncodingAwareSerDe.deserialize(AbstractEncodingAwareSerDe.java:71)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.readRow(MapOperator.java:149)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.access$200(MapOperator.java:113)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:554)
... 18 more
2019-09-19 11:50:29,872 [INFO] [TezChild] |runtime.LogicalIOProcessorRuntimeTask|: Final Counters for attempt_1568591126479_21189_1_01_000000_2: Counters: 33 [[File System Counters HDFS_BYTES_READ=38594, HDFS_READ_OPS=2, HDFS_OP_OPEN=2][org.apache.tez.common.counters.TaskCounter SPILLED_RECORDS=0, GC_TIME_MILLIS=116, CPU_MILLISECONDS=6310, PHYSICAL_MEMORY_BYTES=3571974144, VIRTUAL_MEMORY_BYTES=10842828800, COMMITTED_HEAP_BYTES=3571974144, INPUT_RECORDS_PROCESSED=1, INPUT_SPLIT_LENGTH_BYTES=44890, OUTPUT_RECORDS=0, OUTPUT_BYTES=0, OUTPUT_BYTES_WITH_OVERHEAD=0, OUTPUT_BYTES_PHYSICAL=0, ADDITIONAL_SPILLS_BYTES_WRITTEN=0, ADDITIONAL_SPILLS_BYTES_READ=0, ADDITIONAL_SPILL_COUNT=0, SHUFFLE_CHUNK_COUNT=0][HIVE DESERIALIZE_ERRORS=1, RECORDS_IN_Map_1=0, RECORDS_OUT_INTERMEDIATE_Map_1=0][TaskCounter_Map_1_INPUT_fmr_disk_file INPUT_RECORDS_PROCESSED=1, INPUT_SPLIT_LENGTH_BYTES=44890][TaskCounter_Map_1_OUTPUT_Reducer_2 ADDITIONAL_SPILLS_BYTES_READ=0, ADDITIONAL_SPILLS_BYTES_WRITTEN=0, ADDITIONAL_SPILL_COUNT=0, OUTPUT_BYTES=0, OUTPUT_BYTES_PHYSICAL=0, OUTPUT_BYTES_WITH_OVERHEAD=0, OUTPUT_RECORDS=0, SHUFFLE_CHUNK_COUNT=0, SPILLED_RECORDS=0]]
2019-09-19 11:50:29,872 [INFO] [TezChild] |runtime.LogicalIOProcessorRuntimeTask|: Joining on EventRouter
2019-09-19 11:50:29,872 [INFO] [TezChild] |runtime.LogicalIOProcessorRuntimeTask|: Closed processor for vertex=Map 1, index=1
2019-09-19 11:50:29,873 [INFO] [TezChild] |runtime.LogicalIOProcessorRuntimeTask|: Closed input for vertex=Map 1, sourceVertex=fmr_disk_file
2019-09-19 11:50:29,873 [INFO] [TezChild] |impl.PipelinedSorter|: Reducer 2: Starting flush of map output
2019-09-19 11:50:29,873 [INFO] [TezChild] |impl.PipelinedSorter|: Reducer 2: done sorting span=0, length=0, time=0

Internal table. Table properties: orc.compress=SNAPPY, transactional=true; OrcSerde, stored as ORC; bucketed table; serialization=1 – ravi R Sep 20 '19 at 20:55
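The ClassCastException in the trace above (OrcStruct cannot be cast to BinaryComparable) indicates that the metastore has a text SerDe (LazySimpleSerDe) recorded for a table whose data files are ORC. A quick way to check what the metastore actually holds (the table name below is hypothetical):

-- Show the SerDe Library, InputFormat and OutputFormat recorded in the
-- metastore; for a healthy ORC table all three should be Orc* classes.
DESCRIBE FORMATTED my_orc_table;

-- Or dump the full DDL exactly as Hive sees it.
SHOW CREATE TABLE my_orc_table;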
2 Answers
This is a Hive table definition problem. When we create a table in Hive during a migration, we often simply copy the table's DDL from the source to the target. When copying the DDL from the source, we need to remove the explicit "STORED AS INPUTFORMAT" and "OUTPUTFORMAT" clauses, which appear as below:
STORED AS INPUTFORMAT
  'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
After removing the lines above, replace them with:
STORED AS ORC;
This resolved the issue for me.
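For illustration, a minimal before/after sketch (table and columns are hypothetical). With only the INPUTFORMAT/OUTPUTFORMAT clauses and no ROW FORMAT SERDE line, Hive falls back to the default LazySimpleSerDe, which is exactly the LazySimpleSerDe-versus-OrcStruct mismatch shown in the stack trace; STORED AS ORC sets the SerDe and both formats together.

-- Problematic: ORC input/output formats copied from the source DDL, but no
-- ROW FORMAT SERDE clause, so Hive defaults to LazySimpleSerDe (a text SerDe).
CREATE TABLE my_orc_table (id INT, name STRING)
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat';

-- Fixed: STORED AS ORC registers OrcSerde, OrcInputFormat and OrcOutputFormat
-- consistently in one clause.
CREATE TABLE my_orc_table (id INT, name STRING)
STORED AS ORC;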

I had a similar issue; interestingly, I was able to query the table in the Hive CLI but not through spark-sql.
I noticed something odd in the table creation statement: it specified a FIELDS TERMINATED BY option. When using the ORC format there is no need to specify any such option, as in the sketch below.
I recreated the table without that clause and it worked.
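A sketch of that fix, assuming the original DDL looked something like the first statement below (table and columns are hypothetical): FIELDS TERMINATED BY implies ROW FORMAT DELIMITED, i.e. LazySimpleSerDe, which cannot deserialize ORC data.

-- Problematic: a text-style delimiter clause combined with explicit ORC
-- formats can leave LazySimpleSerDe registered against ORC files.
CREATE TABLE my_table (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat';

-- Recreated without the delimiter clause:
CREATE TABLE my_table (id INT, name STRING)
STORED AS ORC;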
