
I am trying to write a Spark DataFrame into a Hive ACID table using the Hive Warehouse Connector (HWC), but the job hangs without showing any error.

I am using Spark 2.3 and Hive 3.1 on HDP 3.x. This happens only when I try to write data to an existing Hive table. I also observed an exclusive lock on the same table.

scala> df.write.format(HIVE_WAREHOUSE_CONNECTOR).mode("Overwrite").option("database", "default").option("table", "MyTable").save()
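For context, this is a sketch of the full write path with the HWC session set up explicitly. The builder and constant names are from the Hortonworks HWC API; the JDBC/config properties are assumed to be set cluster-side, and the table name is the one from the question:

```scala
// Sketch of the HWC write path on HDP 3.x (Spark 2.3 + Hive 3.1).
// Assumes the hive-warehouse-connector assembly jar is on the classpath
// and spark.sql.hive.hiveserver2.jdbc.url etc. are configured.
import com.hortonworks.hwc.HiveWarehouseSession
import com.hortonworks.hwc.HiveWarehouseSession.HIVE_WAREHOUSE_CONNECTOR

val hive = HiveWarehouseSession.session(spark).build()
hive.setDatabase("default")

// An overwrite of an ACID table needs an exclusive lock on the target;
// if another transaction already holds a lock, this call blocks,
// which can look like a hang with no error in the Spark logs.
df.write
  .format(HIVE_WAREHOUSE_CONNECTOR)
  .mode("overwrite")
  .option("database", "default")
  .option("table", "MyTable")
  .save()
```

Running `SHOW LOCKS MyTable;` from beeline while the job is stuck shows which transaction is holding the exclusive lock.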

Log output (it hangs during stage 29):

19/10/11 19:28:29 WARN TaskSetManager: Stage 26 contains a task of very large size (474 KB). The maximum recommended task size is 100 KB.
[Stage 29:============>                                          (89 + 2) / 403]19/10/11 19:28:37 WARN TaskSetManager: Stage 29 contains a task of very large size (477 KB). The maximum recommended task size is 100 KB.