In a Palantir Foundry Code Workbook Spark SQL node (or in the Spark console in SQL mode), this works:
SELECT date_format('2021-01-01',"yyyy-MM")
2021-01
But a pattern that asks for the quarter doesn't:
SELECT date_format('2021-01-01',"yyyy-Q")
java.lang.IllegalArgumentException: Illegal pattern character 'Q'
This is a legal pattern in Spark 3.2.0:
https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
Is there some environment configuration that changes this behavior? There is a setting, spark.sql.legacy.timeParserPolicy=LEGACY, that could perhaps explain it. If this is the culprit, how can it be changed in the Workbook environment?
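For context on why LEGACY would produce exactly this error: with spark.sql.legacy.timeParserPolicy=LEGACY, Spark 3.x falls back to java.text.SimpleDateFormat for datetime patterns, whereas the default (CORRECTED) path uses java.time.DateTimeFormatter. 'Q' is a valid pattern letter only in the latter. A minimal sketch reproducing both behaviors with plain JDK classes (no Spark required):

```java
import java.text.SimpleDateFormat;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class QuarterPattern {
    public static void main(String[] args) {
        // java.time.DateTimeFormatter (Spark 3.x default parser) accepts 'Q'
        String ok = LocalDate.of(2021, 1, 1)
                .format(DateTimeFormatter.ofPattern("yyyy-Q"));
        System.out.println(ok); // quarter formats fine here

        // java.text.SimpleDateFormat (the LEGACY parser) rejects 'Q'
        try {
            new SimpleDateFormat("yyyy-Q");
        } catch (IllegalArgumentException e) {
            // Same message as the Spark SQL node:
            // "Illegal pattern character 'Q'"
            System.out.println(e.getMessage());
        }
    }
}
```

If SimpleDateFormat is indeed what the environment is using, that would explain why yyyy-MM works (both formatters support M) while yyyy-Q fails only there.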