
I'm using this public Postgres DB of NEAR protocol: https://github.com/near/near-indexer-for-explorer#shared-public-access

There is a field called included_in_block_timestamp whose "data type" = "numeric", and "length/precision" = 20.

This code works:

to_char(TO_TIMESTAMP("public"."receipts"."included_in_block_timestamp"/1000000000), 'YYYY-MM-DD HH:mm') as moment, 

and so does this:

function convertTimestampDecimalToDayjsMoment(timestampDecimal: Decimal) {
  const timestampNum = Number(timestampDecimal) / 1_000_000_000; // Why is this necessary?
  console.log({ timestampNum });
  const moment = dayjs.unix(timestampNum); // https://day.js.org/docs/en/parse/unix-timestamp
  return moment;
}

For example, sometimes included_in_block_timestamp = 1644261932960444221.

I've never seen a timestamp where I needed to divide by 1 billion. Figuring this out just now was a matter of trial and error.
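
For instance, a quick sanity check (plain dayjs, nothing NEAR-specific) shows that dividing that sample value by 1 billion lands on a plausible 2022 date:

import dayjs from 'dayjs';

// Treat the 19-digit value as nanoseconds; dividing by 1e9 gives Unix seconds.
const seconds = Number('1644261932960444221') / 1_000_000_000;
console.log(dayjs.unix(seconds).toISOString()); // 2022-02-07T19:25:32.960Z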

What's going on here? Is this common practice? Does this level of precision even make sense?

Ryan
  • That's nanosecond resolution, the docs say it's only supposed to be 1us resolution. – Aaron Feb 07 '22 at 19:54
  • What *kind* of thing do you think might be going on? To convert nanoseconds to seconds, you divide by a billion. It is not mysticism, it is just arithmetic. – jjanes Feb 07 '22 at 19:56
  • 1644261932960444221 is not a "timestamp" to begin with - it's just a number – Feb 07 '22 at 20:07
  • `to_char(to_timestamp(b.block_timestamp / 1000000000), 'YYYY-MM-DD"T"HH24:MI:SS"Z"') block_timestamp_readable,` was also useful in SQL. – Ryan Nov 18 '22 at 20:09

1 Answer


The nanosecond unit for timestamps appears to be set at the protocol level; it shows up in the docs here: https://docs.near.org/develop/contracts/environment/#environment-variables

and here: https://nomicon.io/RuntimeSpec/Components/BindingsSpec/ContextAPI

So yes, account for the nanosecond resolution (divide by 1,000,000,000 to get seconds) before any date-time conversion.
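
If millisecond precision is enough, a minimal sketch of the conversion (hypothetical helper name; BigInt is used so the 19-digit value is not rounded by Number) could look like this:

import dayjs from 'dayjs';

// Hypothetical helper: convert a NEAR nanosecond timestamp, passed as a string,
// into a dayjs moment. Integer division by 1,000,000 drops the sub-millisecond part.
function nanosToMoment(nanos: string) {
  const millis = Number(BigInt(nanos) / 1_000_000n);
  return dayjs(millis); // dayjs() accepts a millisecond Unix timestamp
}

console.log(nanosToMoment('1644261932960444221').toISOString()); // 2022-02-07T19:25:32.960Z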

idea404