I have a dataset like the one below:
| epoch_seconds | eq_time |
|---|---|
| 1636663343887 | 2021-11-12 02:12:23 |
Now I am trying to convert eq_time back to the epoch value in the first column, but I am unable to get them to match. Below is my code:
```python
from pyspark.sql.functions import col, from_unixtime, unix_timestamp

df = spark.sql("select '1636663343887' as epoch_seconds")
# from_unixtime expects seconds, so divide the millisecond value by 1000
df1 = df.withColumn("eq_time", from_unixtime(col("epoch_seconds") / 1000))
# unix_timestamp converts the string back to epoch seconds
df2 = df1.withColumn("epoch_sec", unix_timestamp(df1.eq_time))
df2.show(truncate=False)
```
I get the following output:
| epoch_seconds | eq_time | epoch_sec |
|---|---|---|
| 1636663343887 | 2021-11-12 02:12:23 | 1636663343 |
I tried the approach from this link as well, but it didn't help. My expected output is for the first and third columns to match. As far as I can tell, from_unixtime truncates to whole seconds, so the trailing 887 milliseconds are already lost by the time eq_time is created.
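For what it's worth, Spark 3.1 added the timestamp_millis and unix_millis SQL functions, which operate on epoch milliseconds directly. The sketch below is my reading of how they would keep the round trip at millisecond precision; it works on my local 3.1.1:

```python
from pyspark.sql.functions import expr

df = spark.sql("select '1636663343887' as epoch_seconds")
# timestamp_millis/unix_millis (Spark 3.1+) work in epoch milliseconds,
# so nothing below second precision is dropped on the round trip
df1 = df.withColumn("eq_time", expr("timestamp_millis(cast(epoch_seconds as long))"))
df2 = df1.withColumn("epoch_sec", expr("unix_millis(eq_time)"))
df2.show(truncate=False)
```

These functions don't exist on Spark 2.4, though, so this alone would not cover my production case.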
P.S.: I am using Spark 3.1.1 locally, whereas production runs Spark 2.4.3, and my end goal is to run this in production.
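The only version-portable idea I have come up with is to route the conversion through decimal casts instead of from_unixtime/unix_timestamp, so that no float rounding can shave off a millisecond. This is only a sketch, under the assumption that timestamp/decimal cast semantics on 2.4.3 match what I see on 3.1.1:

```python
from pyspark.sql.functions import col

df = spark.sql("select '1636663343887' as epoch_seconds")
# decimal division keeps the .887 fraction exactly, and casting the
# decimal to timestamp preserves it as fractional seconds
df1 = df.withColumn(
    "eq_time", (col("epoch_seconds").cast("decimal(20,0)") / 1000).cast("timestamp")
)
# casting the timestamp back to decimal recovers seconds.millis;
# multiplying by 1000 then yields the original epoch milliseconds
df2 = df1.withColumn(
    "epoch_sec", (col("eq_time").cast("decimal(20,3)") * 1000).cast("long")
)
df2.show(truncate=False)
```

Is this safe, or is there a cleaner way to do a lossless epoch-milliseconds round trip that works on both versions?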