In my Hive scripts, when I want to extract the year from a timestamp, I use this:
year(from_unixtime(cast(payload_fecha/1000 as BIGINT),'yyyy-MM-dd HH:mm:ss.SSS' )) as year
Now I am testing the new DAS snapshot and want to do the same, but I cannot use from_unixtime. How can I achieve this in Spark SQL?
Take into account that I use WSO2 DAS, so I need a solution that works with this tool, not a generic solution for another environment.
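
For reference, outside of DAS I would try something like the following in plain Spark SQL. This is only a sketch, assuming the Spark version bundled with DAS supports the year() function and casting a numeric value to a timestamp (which Spark SQL interprets as seconds since the epoch); my_events is a placeholder table name, and payload_fecha is my column holding epoch milliseconds:

-- payload_fecha is in milliseconds, so divide by 1000 to get seconds
-- before casting to a timestamp, then take the year from it
SELECT year(CAST(payload_fecha / 1000 AS timestamp)) AS year
FROM my_events

But I don't know whether this cast is available in the DAS snapshot, so a DAS-specific answer would be appreciated.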