Environment: Apache Spark 2.4.4 with io.delta:delta-core_2.12:0.5.0.
I created a partitioned Delta table using the commands below.
CREATE TABLE DB.table_name
USING DELTA
LOCATION '/tmp/delta/table_path'
table_path = '/tmp/delta/table_path'
data_df.write.format("delta").partitionBy('Year', 'Month') \
    .save(table_path)
When I run SHOW PARTITIONS in Spark SQL, it fails, saying the table is not partitioned. I have also tried DESCRIBE TABLE EXTENDED/FORMATTED in Spark SQL, but found no information on the partitions. Is there a more direct approach to get partition info from a Delta table (version 0.5.0)?
From the Delta documentation, there is a way to get partition info programmatically from the transaction log (_delta_log/), but the following API is not supported in the lower versions of Delta:
deltaTable = DeltaTable.forPath(spark, pathToTable)
detailDF = deltaTable.detail()  # not available in delta-core 0.5.0
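Since detail() is not available in 0.5.0, one possible workaround (a sketch, not an official API) is to read the JSON commit files under _delta_log/ directly and collect the partitionValues recorded on each "add" action, as described in the Delta transaction-log protocol. The function name below is my own; a complete reader would also need to honor "remove" actions and Parquet checkpoint files, which this sketch skips.

```python
import json
import os
from glob import glob


def partitions_from_delta_log(table_path):
    """Collect distinct partition values from 'add' actions in the Delta log.

    Sketch only: scans the JSON commit files under _delta_log/ and ignores
    'remove' actions and checkpoint files, so deleted files still show up.
    """
    partitions = set()
    commits = sorted(glob(os.path.join(table_path, "_delta_log", "*.json")))
    for commit in commits:
        with open(commit) as f:
            for line in f:
                action = json.loads(line)
                add = action.get("add")
                if add and add.get("partitionValues"):
                    # Freeze the dict so it can go into a set for dedup.
                    partitions.add(tuple(sorted(add["partitionValues"].items())))
    return [dict(p) for p in partitions]
```

Calling it with the table path from above, e.g. partitions_from_delta_log('/tmp/delta/table_path'), would return a list of dicts such as {'Year': '2019', 'Month': '12'}, one per partition that has had files added.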