Say, I have the following SQL Server 2008 table with data:

CREATE TABLE tbl (dtIn DATETIME2, dtOut DATETIME2)
INSERT tbl VALUES
('9/10/2012 5:14:10 AM', '9/10/2012 5:15:09 AM'),
('9/10/2012 5:16:12 AM', '9/10/2012 5:18:12 AM'),
('9/10/2012 5:18:43 AM', '9/10/2012 5:23:04 AM'),
('9/10/2012 5:25:17 AM', '9/10/2012 5:26:05 AM'),
('9/10/2012 5:26:57 AM', '9/10/2012 5:29:19 AM'),
('9/10/2012 5:31:41 AM', '9/10/2012 5:32:41 AM'),
('9/10/2012 5:33:16 AM', '9/10/2012 5:34:08 AM'),
('9/10/2012 5:35:25 AM', '9/10/2012 5:49:46 AM'),
('9/10/2012 5:55:35 AM', '9/10/2012 5:56:48 AM'),
('9/10/2012 5:58:54 AM', '9/10/2012 5:59:59 AM')

and then I run this query:

WITH ctx AS(
  SELECT datediff(minute, dtIn, dtOut) AS d FROM tbl
  )
SELECT SUM(d) FROM ctx

I get 30 minutes.

But when I try the same with C#:

double fM = 0;
fM += (DateTime.Parse("9/10/2012 5:15:09 AM") - DateTime.Parse("9/10/2012 5:14:10 AM")).TotalMinutes;
fM += (DateTime.Parse("9/10/2012 5:18:12 AM") - DateTime.Parse("9/10/2012 5:16:12 AM")).TotalMinutes;
fM += (DateTime.Parse("9/10/2012 5:23:04 AM") - DateTime.Parse("9/10/2012 5:18:43 AM")).TotalMinutes;
fM += (DateTime.Parse("9/10/2012 5:26:05 AM") - DateTime.Parse("9/10/2012 5:25:17 AM")).TotalMinutes;
fM += (DateTime.Parse("9/10/2012 5:29:19 AM") - DateTime.Parse("9/10/2012 5:26:57 AM")).TotalMinutes;
fM += (DateTime.Parse("9/10/2012 5:32:41 AM") - DateTime.Parse("9/10/2012 5:31:41 AM")).TotalMinutes;
fM += (DateTime.Parse("9/10/2012 5:34:08 AM") - DateTime.Parse("9/10/2012 5:33:16 AM")).TotalMinutes;
fM += (DateTime.Parse("9/10/2012 5:49:46 AM") - DateTime.Parse("9/10/2012 5:35:25 AM")).TotalMinutes;
fM += (DateTime.Parse("9/10/2012 5:56:48 AM") - DateTime.Parse("9/10/2012 5:55:35 AM")).TotalMinutes;
fM += (DateTime.Parse("9/10/2012 5:59:59 AM") - DateTime.Parse("9/10/2012 5:58:54 AM")).TotalMinutes;

I get fM = 29.016666666666669.

By adding Math.Round() to each C# statement, I get 28.0. By adding Math.Floor(), I get 25.0. By adding Math.Ceiling(), I get 33.0.

Can someone explain this discrepancy?

c00000fd

1 Answer

The two calculations are measuring different things.

It would be prudent to pay attention to the DATEDIFF docs here:

Returns the count (signed integer) of the specified datepart boundaries crossed between the specified startdate and enddate.

This is why the following 2-second interval:

SELECT datediff(minute, '9/10/2012 5:14:59 AM', '9/10/2012 5:15:01 AM') 

returns 1: it crosses exactly one minute boundary. I suspect that you did not take this behaviour into account.
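To see the two semantics side by side, here is a sketch that emulates the boundary-counting behaviour in C# by truncating both endpoints to the start of their minute before subtracting (an assumption about how to model DATEDIFF's observable behaviour, not its actual internals):

```csharp
using System;
using System.Globalization;

public class DateDiffDemo
{
    // Emulates DATEDIFF(minute, start, end): truncate both endpoints to
    // the start of their minute, then count whole minutes between them --
    // i.e. count minute boundaries crossed, not elapsed time.
    public static int DateDiffMinute(DateTime start, DateTime end)
    {
        long s = start.Ticks - start.Ticks % TimeSpan.TicksPerMinute;
        long e = end.Ticks - end.Ticks % TimeSpan.TicksPerMinute;
        return (int)((e - s) / TimeSpan.TicksPerMinute);
    }

    public static void Main()
    {
        string[,] rows =
        {
            { "9/10/2012 5:14:10 AM", "9/10/2012 5:15:09 AM" },
            { "9/10/2012 5:16:12 AM", "9/10/2012 5:18:12 AM" },
            { "9/10/2012 5:18:43 AM", "9/10/2012 5:23:04 AM" },
            { "9/10/2012 5:25:17 AM", "9/10/2012 5:26:05 AM" },
            { "9/10/2012 5:26:57 AM", "9/10/2012 5:29:19 AM" },
            { "9/10/2012 5:31:41 AM", "9/10/2012 5:32:41 AM" },
            { "9/10/2012 5:33:16 AM", "9/10/2012 5:34:08 AM" },
            { "9/10/2012 5:35:25 AM", "9/10/2012 5:49:46 AM" },
            { "9/10/2012 5:55:35 AM", "9/10/2012 5:56:48 AM" },
            { "9/10/2012 5:58:54 AM", "9/10/2012 5:59:59 AM" },
        };

        // Parse with an explicit culture so M/d/yyyy strings are unambiguous.
        var us = CultureInfo.GetCultureInfo("en-US");
        int boundaries = 0;
        double elapsed = 0;
        for (int i = 0; i < rows.GetLength(0); i++)
        {
            DateTime dtIn = DateTime.Parse(rows[i, 0], us);
            DateTime dtOut = DateTime.Parse(rows[i, 1], us);
            boundaries += DateDiffMinute(dtIn, dtOut);
            elapsed += (dtOut - dtIn).TotalMinutes;
        }
        Console.WriteLine(boundaries); // 30, same as the SQL query
        Console.WriteLine(elapsed);    // ~29.0167, same as the C# sum
    }
}
```

Run against the sample data, the boundary count reproduces the SQL result of 30 while the elapsed-time sum reproduces the 29.0166... figure, so the two numbers were never measuring the same quantity.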

spender
  • Thanks. But that sounds weird. Wouldn't rounding produce a smaller value in T-SQL? I tried adding Math.Round() to c# code and got 28 instead of 30. – c00000fd Sep 30 '12 at 00:35
  • @user843732 I won't answer this until you ensure that the two calculations are the same. I can see discrepancies between the SQL and the c# on the last line of each. Are there more? – spender Sep 30 '12 at 00:42
  • Thanks, I see now. It does its own rounding there. Hmm. That's an interesting way to program it... Just from curiosity, why did they make it count as such? Is it some sort of efficiency/speed thing? – c00000fd Sep 30 '12 at 00:54
  • Who knows? I just found this out myself. I suppose if you used `nanosecond` instead of `minute` then divided by 60*10^9 , you'd minimise the error. I imagine it's defined such to ensure that 23:59:59 and 00:00:01 the next day are shown to be a day apart. To that end it makes more sense. – spender Sep 30 '12 at 00:59
  • Yeah, I know that. Although that approach is rife with danger. The datediff uses 32-bit int internally, so in case of nanosecond or even second, it is very easy to cause an arithmetic overflow exception. I went through this and learned it the hard way. (That is why I decided to use minutes.) – c00000fd Sep 30 '12 at 01:03
  • Hah! Effectively that means that `nanosecond` overflows in 2 seconds. That's silly. – spender Sep 30 '12 at 01:05
  • Yeah, if you replace minute with nanosecond in my SQL it will produce an overflow. So it is effectively useless. Moreover second is also very dangerous. I think it gives about 65 years of span. Which is also quite low. – c00000fd Sep 30 '12 at 01:09
  • I think you should open a new question asking how one might do this to a higher accuracy without overflowing. I'm stumped. – spender Sep 30 '12 at 01:11
  • Well, you technically answered my original question. I guess I can live with this "small" loss of precision. – c00000fd Sep 30 '12 at 01:12
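For data like the sample above, the overflow concern from the comments can be sidestepped: each row spans only minutes, so a per-row DATEDIFF(second, ...) stays far inside INT range, and casting to BIGINT before summing protects the total. A sketch (not a general solution for arbitrarily long intervals, which can still overflow the per-row DATEDIFF):

```sql
-- Per-row intervals here are minutes long, so DATEDIFF(second, ...)
-- cannot overflow per row; the BIGINT cast guards the SUM itself.
SELECT SUM(CAST(DATEDIFF(second, dtIn, dtOut) AS BIGINT)) / 60.0 AS totalMinutes
FROM tbl
-- yields 29.016666 for the sample data, matching the C# elapsed-time sum
```

Because every timestamp in the sample falls on a whole second, counting second boundaries and measuring elapsed seconds coincide here, which is why this agrees with the C# figure.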