
Could someone help me figure out what data type or format I need to pass for Spark's from_unixtime() function to work?

When I try the following, it runs, but the result is not the current timestamp:

from_unixtime(current_timestamp())

The response is below:

from_unixtime(current_timestamp(), yyyy-MM-dd HH:mm:ss)

When I try to pass an integer literal:

from_unixtime(1392394861,"yyyy-MM-dd HH:mm:ss.SSSS")

the call fails with a type mismatch:

error: type mismatch;
 found   : Int(1392394861)
 required: org.apache.spark.sql.Column
       from_unixtime(1392394861,"yyyy-MM-dd HH:mm:ss.SSSS")

What am I missing? I've tried a number of different things and have read the documentation on using dates/times in Spark, but every example I try fails with a type mismatch.


1 Answer


Use lit() to create a column of literal value, like this:

from_unixtime(lit(1392394861), "yyyy-MM-dd HH:mm:ss.SSSS")

or, as zero323 mentioned:

from_unixtime(current_timestamp().cast("long")) 
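
Putting both variants together, here is a minimal runnable sketch (the SparkSession setup, DataFrame, and column names are illustrative, not from the original question):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{current_timestamp, from_unixtime, lit}

val spark = SparkSession.builder()
  .appName("from_unixtime-demo")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// from_unixtime expects a Column of seconds since the epoch,
// not a bare Int -- hence the type-mismatch error.
val df = Seq(1392394861L).toDF("epoch_seconds")

df.select(
  // Wrap a literal in lit() to get a Column.
  from_unixtime(lit(1392394861), "yyyy-MM-dd HH:mm:ss.SSSS").as("from_literal"),
  // An existing numeric column works directly.
  from_unixtime($"epoch_seconds").as("from_column"),
  // current_timestamp() is a TimestampType; cast to long (epoch seconds) first.
  from_unixtime(current_timestamp().cast("long")).as("from_now")
).show(truncate = false)
```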