If I write a simple method to return the milliseconds between epoch time and DateTime.UtcNow, I get a proper answer. However, if I write a method to return the milliseconds between some arbitrary date and epoch time, the last three digits are always zero. By 'some arbitrary date' I mean that I pass the method the output of DateTime.Parse("arbitrary date string"). As near as I can tell, the DateTime object returned by Parse does not carry all of the significant digits.
Test method:
static void GetMillis()
{
    DateTime dUtc = DateTime.UtcNow;
    DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
    double utcmillis = (dUtc - epoch).TotalMilliseconds;

    // Round-trip the current time through a string, the way an arbitrary date would arrive.
    string timestamp = dUtc.ToString();
    DateTime arbitrary = DateTime.Parse(timestamp);

    Console.WriteLine("Milliseconds between DateTime.UtcNow {0} \nand epoch time {1} are {2}", dUtc, epoch, utcmillis);
    Console.WriteLine("Milliseconds between arbitrary date {0} \nand epoch time {1} are {2}", arbitrary, epoch, (arbitrary - epoch).TotalMilliseconds);
}
Output:
C:\src\vs\epochConverter\epochConverter\bin\Debug
{powem} [54] --> .\epochConverter.exe -debug
Milliseconds between DateTime.UtcNow 8/26/2012 11:12:31 PM
and epoch time 1/1/1970 12:00:00 AM are 1346022751385.8
Milliseconds between arbitrary date 8/26/2012 11:12:31 PM
and epoch time 1/1/1970 12:00:00 AM are 1346022751000
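The formatted dates in the output don't show any fractional seconds, so my working guess is that the default ToString() pattern drops them and Parse has nothing left to recover. A minimal sketch of that suspicion (just an illustration, not confirmed against any documentation; the method name is a placeholder):

static void RoundTripCheck()
{
    DateTime dUtc = DateTime.UtcNow;

    // Default ToString() uses the general date/time pattern, which has no fractional seconds.
    string timestamp = dUtc.ToString();          // e.g. "8/26/2012 11:12:31 PM"
    DateTime parsed = DateTime.Parse(timestamp);

    Console.WriteLine("millisecond component before: {0}", dUtc.Millisecond);   // e.g. 385
    Console.WriteLine("millisecond component after:  {0}", parsed.Millisecond); // 0
}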
I don't know if I'm doing something grotesquely wrong or simply misunderstanding the math here. I've searched MSDN and can't find anything that explains this difference. I really would like to be able to compute the milliseconds as described -- is it possible?
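For what it's worth, what I'm hoping for is something along these lines, using the round-trip "o" format when the string is first produced; I don't know whether that's the right (or only) way to keep the precision, so treat this as a guess rather than a working answer (the method name is just a placeholder):

static double MillisSinceEpoch(string timestampText)
{
    DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
    // Assumes the string was written with dUtc.ToString("o"), which keeps the
    // fractional seconds and the DateTimeKind through the parse.
    DateTime arbitrary = DateTime.Parse(timestampText, null,
        System.Globalization.DateTimeStyles.RoundtripKind);
    return (arbitrary - epoch).TotalMilliseconds;
}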
Thanks.
mp