I am trying to get the ISO year week in Spark with Scala from a date that is in string format.
The following SQL query returns the expected result in hive.
For example, if the date is 1st January 2016, per the ISO standard it falls in the 53rd week of 2015, hence the result 201553.
hive> select from_unixtime(unix_timestamp('20160101', 'yyyyMMdd'), 'Yww');
OK
201553
Time taken: 0.444 seconds, Fetched: 1 row(s)
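For what it's worth, plain java.time agrees with Hive here; this is the small Scala check I ran (in a plain Scala REPL, outside Spark) to convince myself:

import java.time.LocalDate
import java.time.format.DateTimeFormatter
import java.time.temporal.IsoFields

// Parse the same literal the SQL uses
val d = LocalDate.parse("20160101", DateTimeFormatter.ofPattern("yyyyMMdd"))

// ISO week-based year and week number
val isoYear = d.get(IsoFields.WEEK_BASED_YEAR)         // 2015
val isoWeek = d.get(IsoFields.WEEK_OF_WEEK_BASED_YEAR) // 53
println(f"$isoYear$isoWeek%02d")                       // prints 201553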
If I run the same query in Spark via Spark SQL, I get a different result.
scala> spark.sql("""select from_unixtime(unix_timestamp('20160101', 'yyyyMMdd'), 'Yww')""").show
+------------------------------------------------------+
|from_unixtime(unix_timestamp(20160101, yyyyMMdd), Yww)|
+------------------------------------------------------+
| 201601|
+------------------------------------------------------+
The result I need from the Spark program is 201553.
I am using Spark version 2.3.
Can someone explain what's going on?
Please let me know if there is any way to get the ISO year week in Spark.
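So far the only workaround I can think of on 2.3 is a UDF built on java.time, roughly like the sketch below (the name iso_year_week is just something I made up), but I would prefer a built-in function if one exists:

import java.time.LocalDate
import java.time.format.DateTimeFormatter
import java.time.temporal.IsoFields

// Register a UDF that returns ISO week-based year * 100 + ISO week number
spark.udf.register("iso_year_week", (s: String) => {
  val d = LocalDate.parse(s, DateTimeFormatter.ofPattern("yyyyMMdd"))
  d.get(IsoFields.WEEK_BASED_YEAR) * 100 + d.get(IsoFields.WEEK_OF_WEEK_BASED_YEAR)
})

spark.sql("select iso_year_week('20160101')").show()
// expected: 201553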