Rails supports DateTime with nanosecond resolution, but I noticed that the actual precision depends on the machine the app is being run on.
On my MacBook running Mojave, Time.zone.now.strftime("%N") always outputs a 9-digit number that ends in 000 (e.g. "122981000"), which means that on the Mac the resolution is limited to the microsecond scale. On Linux, however, the same command returns a number with full nanosecond resolution (e.g. "113578523").
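For reference, this is roughly what I see in a Rails console on each machine (using the sample values from above; the exact digits obviously differ on every call):

    # macOS (Mojave): the last three digits are always zero
    Time.zone.now.strftime("%N")  # => "122981000"

    # Linux: full nanosecond resolution
    Time.zone.now.strftime("%N")  # => "113578523"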
My problem comes when I'm using rspec and I need to compare some DateTime values. When I'm developing on my Mac, the tests pass flawlessly, but when our CI (Travis) runs the same test, it fails like this:
    expected: 2019-04-09 19:14:27.214939637 -0300
         got: 2019-04-09 19:14:27.214939000 -0300
The issue here is that our database, Postgres, is limited to microseconds, just like my Mac, which is why the tests don't fail there. I store the DateTime in the DB, then read it back and compare it to the value I still have in memory. The DB rounds to microseconds, and therefore the comparison fails.
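To make the failure concrete, here is a minimal sketch of the kind of spec that breaks for me; the Event model and its happened_at column are made-up names, just to illustrate the store-reload-compare pattern:

    RSpec.describe Event, type: :model do
      it "keeps the original timestamp" do
        time  = Time.zone.now                      # nanosecond precision on Linux
        event = Event.create!(happened_at: time)   # Postgres stores only microseconds

        # passes on my Mac, fails on the Linux CI because the reloaded
        # value only carries microsecond precision
        expect(event.reload.happened_at).to eq(time)
      end
    end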
Is it possible to force Rails to run with microsecond precision? My intention is to avoid having to manually round or truncate the timestamps in every test.
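For context, the per-test workaround I'm trying to avoid looks roughly like this (a sketch reusing the hypothetical Event spec above):

    it "keeps the original timestamp" do
      time  = Time.zone.now
      event = Event.create!(happened_at: time)

      # truncate the in-memory value to microseconds before comparing...
      expect(event.reload.happened_at).to eq(time.change(usec: time.usec))

      # ...or compare with a tolerance instead of exact equality
      expect(event.reload.happened_at).to be_within(0.000001).of(time)
    end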