
Environment: Apache Spark 2.4.4 with io.delta:delta-core_2.12:0.5.0.

I have created a fully qualified Delta table using the commands below:

```sql
CREATE TABLE DB.table_name
USING DELTA
LOCATION '/tmp/delta/table_path'
```

```python
data_df.write.format("delta").partitionBy('Year', 'Month') \
    .save(table_path)
```

When I execute `SHOW PARTITIONS` via Spark SQL, it fails, saying the table is not partitioned. I have also tried `DESCRIBE TABLE EXTENDED` / `DESCRIBE TABLE FORMATTED` via Spark SQL, but found no information on the partitions. Is there a more direct approach to get partition info from a Delta table (version 0.5.0)?
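Since `SHOW PARTITIONS` is not supported for Delta tables in 0.5.0, one workaround (not from the original post) is to read the `add` actions in the JSON commit files under `_delta_log/` — each one records a `partitionValues` map for the data file it references. A minimal sketch, using a synthesized two-line commit; the paths and values are illustrative only:

```python
import json

# Each _delta_log/<version>.json commit is newline-delimited JSON;
# "add" actions carry a partitionValues map for the file they reference.
# A synthesized two-action commit for illustration:
commit_lines = [
    '{"add": {"path": "Year=2019/Month=12/part-0000.parquet", '
    '"partitionValues": {"Year": "2019", "Month": "12"}}}',
    '{"add": {"path": "Year=2020/Month=01/part-0001.parquet", '
    '"partitionValues": {"Year": "2020", "Month": "01"}}}',
]

def partitions_from_commit(lines):
    """Collect the distinct partitionValues maps from add actions."""
    parts = set()
    for line in lines:
        action = json.loads(line)
        add = action.get("add")
        if add and "partitionValues" in add:
            # Use a sorted tuple of items so the map is hashable
            parts.add(tuple(sorted(add["partitionValues"].items())))
    return parts

print(partitions_from_commit(commit_lines))
```

Against a real table you would iterate over every `*.json` file in `/tmp/delta/table_path/_delta_log/` (and also honor `remove` actions and checkpoints), but the shape of each `add` action is the same.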

From the Delta documentation, there is a way to get the partition info from `_delta_log/` programmatically, but this feature is not supported in lower versions of Delta:

```python
deltaTable = DeltaTable.forPath(spark, pathToTable)
detailDF = deltaTable.detail()
```
Spandana
    You can query the file system and/or parse the transaction log to obtain this information for earlier versions of Delta. Note, `SHOW PARTITIONS` has both an open issue [996](https://github.com/delta-io/delta/issues/996) and WIP PR [1051](https://github.com/delta-io/delta/pull/1051) – Denny Lee Dec 01 '22 at 15:30
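The file-system route mentioned in the comment works because `partitionBy('Year', 'Month')` writes Hive-style `col=value` directories under the table path. A minimal sketch of walking that layout (the directory tree is synthesized in a temp dir here purely for illustration):

```python
import os
import tempfile

# partitionBy('Year', 'Month') lays data out as Year=.../Month=.../part-*.parquet,
# so the partition set can be recovered by walking the directory tree.
# Synthesize a small layout for illustration:
root = tempfile.mkdtemp()
for y, m in [("2019", "12"), ("2020", "01")]:
    os.makedirs(os.path.join(root, f"Year={y}", f"Month={m}"))

def list_partitions(table_path):
    """Return (Year, Month) pairs found as col=value directories."""
    pairs = []
    for year_dir in sorted(os.listdir(table_path)):
        if not year_dir.startswith("Year="):
            continue  # skip _delta_log and other non-partition entries
        for month_dir in sorted(os.listdir(os.path.join(table_path, year_dir))):
            if month_dir.startswith("Month="):
                pairs.append((year_dir.split("=", 1)[1],
                              month_dir.split("=", 1)[1]))
    return pairs

print(list_partitions(root))
```

One caveat: directories for deleted or compacted files can linger on disk until `VACUUM` runs, so the transaction log remains the authoritative source of the live partition set.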

0 Answers