spark.sql("""
  select case
           when trim(map('OPT OUT',1,'OPT IN',0,'',0)[coalesce(upper(program_1),'')]) == trim(num_fg)
             then trim(from_unixtime(unix_timestamp(upd_dt,'MM/dd/yyyy HH:mm:ss.SSS'), 'yyyy-MM-dd HH:mm:ss.sss'))
           else Now()
         end as upd_dt
  from input
""").show(false)
Input:

val sample = Seq(("OPT OUT","1","07/21/2020 09:09:09.382")).toDF("program_1","num_fg", "upd_dt")
sample.createOrReplaceTempView("input")   // register the temp view the query selects from
In the query above, the milliseconds pattern 'sss' is not returning the value we give as input.
If the input is 07/21/2020 09:09:09.382,
the query returns 2020-07-21 09:09:09.009,
but the expected result is 2020-07-21 09:09:09.382.
[Whatever milliseconds we give in the input should appear in the output.]
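For reference, there appear to be two separate issues here, and the `.009` output hints at the first one. In the Java date pattern syntax that Spark 2.x uses, lowercase 's' means seconds-of-minute, not fractional seconds, so 'sss' simply zero-pads the seconds value (09 becomes 009); milliseconds are uppercase 'SSS'. A minimal sketch of that pattern behavior using plain `SimpleDateFormat` (no Spark required):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class PatternDemo {
    public static void main(String[] args) throws Exception {
        // Parse the sample input with milliseconds ('SSS').
        SimpleDateFormat in = new SimpleDateFormat("MM/dd/yyyy HH:mm:ss.SSS");
        Date d = in.parse("07/21/2020 09:09:09.382");

        // 'sss' = seconds-of-minute zero-padded to 3 digits -> the seconds
        // value 9 is printed as "009", which looks like bogus milliseconds.
        System.out.println(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.sss").format(d));
        // prints 2020-07-21 09:09:09.009

        // 'SSS' = real milliseconds, preserved from the input.
        System.out.println(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS").format(d));
        // prints 2020-07-21 09:09:09.382
    }
}
```

The second issue is that `unix_timestamp` returns whole seconds (a bigint), so the `.382` is already dropped before `from_unixtime` runs, and fixing the pattern alone is not enough. Assuming the goal is only to reformat the string, replacing that expression with something like `date_format(to_timestamp(upd_dt, 'MM/dd/yyyy HH:mm:ss.SSS'), 'yyyy-MM-dd HH:mm:ss.SSS')` should keep the milliseconds end to end, since `to_timestamp` preserves the fractional part.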