
The Unix timestamp value comes to HANA via Data Services from a MySQL database.

I can, however, find the date using the following statement, but I wanted to know whether there are any built-in functions in SAP HANA or in Data Services Designer that can do this job. Also, is there any way to get the time from the result, like we have TO_DATE for the date, other than RIGHT( )?

SELECT TO_VARCHAR (ADD_SECONDS('1970-01-01 00:00:00', 1452261405)) FROM DUMMY;

and it will return 2016-01-08 13:56:45

Hashim

2 Answers


You can use the HANA built-in functions TO_TIMESTAMP, TO_DATE and TO_TIME:

SELECT TO_TIMESTAMP (ADD_SECONDS('1970-01-01 00:00:00', '1452261405')) AS "TIMESTAMP",
       TO_DATE      (ADD_SECONDS('1970-01-01 00:00:00', '1452261405')) AS "DATE",
       TO_TIME      (ADD_SECONDS('1970-01-01 00:00:00', '1452261405')) AS "TIME"
  FROM PUBLIC.DUMMY;

Gives you:

TIMESTAMP  Fri Jan 08 2016 13:56:45 GMT+0000 (UTC)
DATE       08.01.2016
TIME       13:56:45
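
In practice you would apply the same expressions to a column rather than a literal. A minimal sketch, assuming a hypothetical table EVENTS with an integer column EPOCH_TS holding the Unix timestamps:

-- EVENTS and EPOCH_TS are hypothetical names for illustration
SELECT EPOCH_TS,
       TO_DATE (ADD_SECONDS('1970-01-01 00:00:00', EPOCH_TS)) AS "EVENT_DATE",
       TO_TIME (ADD_SECONDS('1970-01-01 00:00:00', EPOCH_TS)) AS "EVENT_TIME"
  FROM EVENTS;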

Christoph G

This sort of conversion really lends itself to being implemented in a user-defined function (UDF). Something like this could make your code easier to read:

FUNCTION "DEVDUDE"."sandpit::epoch_to_seconddate" (IN epoc_date integer ) 
    RETURNS SQL_SECONDDATE seconddate
    LANGUAGE SQLSCRIPT
    SQL SECURITY INVOKER AS
BEGIN

    SQL_SECONDDATE = ADD_SECONDS( to_seconddate('1970-01-01 00:00:00'), :epoc_date);

END;

Then you can simply use it as if it were a built-in function:

 select "sandpit::epoch_to_seconddate"(-1452261405) from dummy;

SQL_SECONDDATE
1923-12-25 10:03:15.0
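
Combined with the conversions from the other answer, the UDF result can also be split into date and time directly; a short sketch, assuming the function above has been created:

SELECT TO_DATE("sandpit::epoch_to_seconddate"(1452261405)) AS "EVENT_DATE",
       TO_TIME("sandpit::epoch_to_seconddate"(1452261405)) AS "EVENT_TIME"
  FROM DUMMY;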

Lars Br.
  • Thanks Lars; it does make sense to go for a function for better readability. Does this have any implications for performance when there are millions of records to be processed, especially compared with the approach @Christoph has mentioned? – Hashim Mar 22 '17 at 09:53
  • Agreed, for readability it is really good to create and use a scalar UDF for SQL expressions you need more than once. Just checked the performance in a HCP trial account SPS12. For 1 million rows, selecting a to_date(...) and a to_time(...) column out of an integer takes 100-150 millisec using a UDF in between, 50-75 millisec using only the built-in functions. Without testing my guess would have been that performance would be roughly the same because HANA "flattens" out the UDF call somehow, but you never know... I would still go for the UDF approach if performance is good enough. – Christoph G Mar 22 '17 at 16:18
  • Christoph correctly hints towards testing here! If raw performance is absolutely paramount, then there is no way to get around testing and measuring alternative solutions. But for any non-trivial solution build, eventually, maintainability becomes the far bigger burden. And finding bottlenecks and possible improvements in a well structured/modularized code base is a lot easier than trying to "decode" the execution of a huge single statement. – Lars Br. Mar 22 '17 at 20:59
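
For anyone who wants to reproduce a measurement like the one Christoph describes, here is a minimal sketch. It assumes the series function SERIES_GENERATE_INTEGER is available for row generation (the GENERATED_PERIOD_START column comes from that function), that the UDF from the answer has been created, and that EPOCH_TEST is a hypothetical table name:

-- generate ~1 million synthetic epoch values (hypothetical table EPOCH_TEST)
CREATE COLUMN TABLE EPOCH_TEST AS (
    SELECT GENERATED_PERIOD_START AS EPOCH_TS
      FROM SERIES_GENERATE_INTEGER(1, 1452261405, 1453261405)
);

-- variant 1: built-in functions only
SELECT COUNT(DISTINCT TO_DATE(ADD_SECONDS('1970-01-01 00:00:00', EPOCH_TS)))
  FROM EPOCH_TEST;

-- variant 2: the same conversion through the scalar UDF
SELECT COUNT(DISTINCT TO_DATE("sandpit::epoch_to_seconddate"(EPOCH_TS)))
  FROM EPOCH_TEST;

The COUNT(DISTINCT ...) just forces the conversion to be evaluated for every row; compare the server processing times of the two statements.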