I am reading in timestamps as strings from a service that formats them as Unix time in nanoseconds. This introduces an obvious problem: the values are so large that I can't use the standard operations to normalize them to seconds. An example of one of these strings is '1589212802642680000' (about 1.58921E+18 in scientific notation).
I was trying something like this:

    convert_fills_df['timeStamp'] = convert_fills_df.timeStamp.apply(
        lambda x: UNIX_EPOCH + (float(x) / 1_000_000_000)
    )

But the float conversion loses precision (a 64-bit float can't represent all 19 digits exactly). Is there a string operation I can do that keeps precision at least down to the second? Nanoseconds are not necessary for my purposes (though I appreciate their thoroughness); if I could keep them that's great too, but it is not a necessity.
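To illustrate what I mean, a minimal sketch of the kind of thing I'm after: since Python ints are arbitrary-precision, the string could be parsed as an int and floor-divided down to seconds without a float ever being involved (same convert_fills_df frame as above):

    # Parse each string as an arbitrary-precision int, then floor-divide
    # by 10**9 to go from nanoseconds to whole seconds with no rounding.
    convert_fills_df['timeStamp'] = convert_fills_df.timeStamp.apply(
        lambda x: int(x) // 1_000_000_000
    )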
I would just like to convert the time to a human-readable, 24-hour clock format.
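For the human-readable part, something along these lines is the sort of result I want (a sketch assuming pandas can take the raw nanosecond counts directly; the column name is from my snippet above):

    import pandas as pd

    # The 19-digit strings fit in int64 (max ~9.2e18), so parse them as
    # integers, interpret as nanoseconds since the Unix epoch, and format
    # on a 24-hour clock.
    ts = pd.to_datetime(convert_fills_df['timeStamp'].astype('int64'), unit='ns')
    convert_fills_df['timeStamp'] = ts.dt.strftime('%Y-%m-%d %H:%M:%S')

This would keep full nanosecond precision internally and only drop it at the formatting step, which would suit me fine.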