
I am trying to get the date ranges between data changes in SQL Server. My query is:

select count(1) as qty, Info, convert(char,dFError,100) dErr
from TableData
group by Info, convert(char,dFError,100) 
order by dErr asc

I have this: qty is the number of requests to a server, Info is the server's IP, and the date is when a request is sent to another server.

qty  Info  dErr
  1  1.97  Aug 11 2021  9:01AM
  1  1.97  Aug 11 2021  9:06AM
 88  1.33  Dec 21 2021  2:04PM
  1  1.95  Dec 22 2021  9:44PM
  9  1.95  Dec 22 2021  9:45PM
  1  1.33  Dec 22 2021  9:51PM
 19  1.33  Dec 22 2021  9:52PM
  3  1.33  Dec 22 2021  9:53PM
  6  1.33  Dec 27 2021  7:10PM
 17  1.33  Dec 27 2021  7:11PM
 15  1.95  Dec 27 2021  7:17PM
  8  1.95  Dec 27 2021  7:18PM

and I want this: up to Aug 11 at 9:06AM everything goes to 1.97, then at Dec 21 at 2:04PM everything goes to 1.33, and so on. In other words, one row per run of the same Info, with the total qty and the date of the last request in that run:

qty  Info  dErr
  2  1.97  Aug 11 2021  9:06AM
 88  1.33  Dec 21 2021  2:04PM
 10  1.95  Dec 22 2021  9:45PM
 46  1.33  Dec 27 2021  7:11PM
 23  1.95  Dec 27 2021  7:18PM

On the same day, the same Info can appear again in a later run at a different hour:

qty  Info  dErr
  1  1.97  Jan 24 2022  9:39AM
  1  1.97  Jan 24 2022  9:51AM
  1  1.97  Jan 24 2022  9:58AM
  4  1.97  Jan 24 2022 10:08AM
  1  1.97  Jan 24 2022 10:12AM
  8  1.95  Jan 24 2022 10:24AM
  2  1.95  Jan 24 2022 10:32AM
 10  1.33  Jan 24 2022 10:33AM
  1  1.33  Jan 24 2022 11:37AM
  8  1.95  Jan 24 2022 11:59AM
  1  1.95  Jan 24 2022 12:00PM
  2  1.95  Jan 24 2022 12:08PM

and it needs to be displayed like:

qty  Info  dErr
  8  1.97  Jan 24 2022 10:12AM
 10  1.95  Jan 24 2022 10:32AM
 11  1.33  Jan 24 2022 11:37AM
 11  1.95  Jan 24 2022 12:08PM
    Have you looked at `sum` , `max` and `group by`? – Stu Feb 08 '22 at 17:19
  • Please edit your question showing what you have tried so far. – FlexYourData Feb 08 '22 at 17:21
  • The query you provided uses columns not seen elsewhere in your "I want this". Please provide a query and data that relates to it, as well as a question and DDL. – HABO Feb 08 '22 at 18:51
  • Yes, I tried with sum, max, and group by, but I can't get what I need. I've added more information and made corrections. – Praxusa Feb 08 '22 at 20:06

3 Answers

-- total per server per calendar day, keeping the last timestamp of that day
select 
    SUM(P_COUNT) as "COUNT", 
    P_DATA as "DATA", 
    MAX(FECHA) as "FECHA"
from 
    TABLEA
GROUP BY 
    P_DATA, CONVERT(DATE, FECHA)
ORDER BY "FECHA"
Ramin Faracov
  • This only gives me the last date, but sometimes the values are interleaved, like > 8 1.95 Jan 24 2022 10:24AM > 2 1.95 Jan 24 2022 10:32AM > 10 1.33 Jan 24 2022 10:33AM > 1 1.33 Jan 24 2022 11:37AM > 8 1.95 Jan 24 2022 11:59AM > 1 1.95 Jan 24 2022 12:00PM – Praxusa Feb 08 '22 at 19:25

Your expected results don't match the given data - in the first set you have rows for 12/22 with both 1.33 and 1.95, but they are not all included in your expected results.

It seems to me you want to group either by the date or by the date/hour. Here is an example of both:

Declare @testTable table (qty int, Info numeric(3,2), dErr datetime);
 Insert Into @testTable (qty, Info, dErr)
 Values ( 1, 1.97, 'Aug 11 2021 9:01AM')
      , ( 1, 1.97, 'Aug 11 2021 9:06AM')
      , (88, 1.33, 'Dec 21 2021 2:04PM')
      , ( 1, 1.95, 'Dec 22 2021 9:44PM')
      , ( 9, 1.95, 'Dec 22 2021 9:45PM')
      , ( 1, 1.33, 'Dec 22 2021 9:51PM')
      , (19, 1.33, 'Dec 22 2021 9:52PM')
      , ( 3, 1.33, 'Dec 22 2021 9:53PM')
      , ( 6, 1.33, 'Dec 27 2021 7:10PM')
      , (17, 1.33, 'Dec 27 2021 7:11PM')
      , (15, 1.95, 'Dec 27 2021 7:17PM')
      , ( 8, 1.95, 'Dec 27 2021 7:18PM')

      , ( 1, 1.97, 'Jan 24 2022 9:39AM')
      , ( 1, 1.97, 'Jan 24 2022 9:51AM')
      , ( 1, 1.97, 'Jan 24 2022 9:58AM')
      , ( 4, 1.97, 'Jan 24 2022 10:08AM')
      , ( 1, 1.97, 'Jan 24 2022 10:12AM')
      , ( 8, 1.95, 'Jan 24 2022 10:24AM')
      , ( 2, 1.95, 'Jan 24 2022 10:32AM')
      , (10, 1.33, 'Jan 24 2022 10:33AM')
      , ( 1, 1.33, 'Jan 24 2022 11:37AM')
      , ( 8, 1.95, 'Jan 24 2022 11:59AM')
      , ( 1, 1.95, 'Jan 24 2022 12:00PM')
      , ( 2, 1.95, 'Jan 24 2022 12:08PM');

 --==== Grouped by date
 Select total_qty = sum(tt.qty)
      , tt.Info
      , latest_date = max(tt.dErr)
   From @testTable tt
  Group By
        tt.Info
      , cast(tt.dErr As date)
  Order By
        cast(tt.dErr As date);

 --==== Grouped by date\hour
 Select total_qty = sum(tt.qty)
      , tt.Info
      , latest_date = max(tt.dErr)
   From @testTable tt
  Group By
        tt.Info
      , cast(tt.dErr As date)
      , datepart(Hour, tt.dErr)
  Order By
        cast(tt.dErr As date)
      , datepart(Hour, tt.dErr);
Jeff
  • The numbers 1.95, 1.97, 1.33 are 3 servers; the requests are sent "randomly" to each one, and the date is when a request is sent to another server. I'm counting the number of requests to each server until it changes, so I have the time interval where it changed. – Praxusa Feb 08 '22 at 23:32

A double row_number can be used to calculate a ranking.
Then the ranking can be used in the aggregation to solve this Gaps-And-Islands type of problem.

select sum(qty) as qty, Info, max(dFError) as dErr
from (
  select Info, dFError, qty
  , convert(date, dFError) as dErrorDate
  -- within one uninterrupted run of the same Info, the ascending overall row
  -- number and the descending per-Info row number move in opposite directions,
  -- so their sum (Rnk) stays constant for that run
  , Rnk = row_number() over (order by dFError)
  + row_number() over (partition by Info order by dFError desc)
  from TableData
) q
group by Info, Rnk
order by dErr;
qty  Info  dErr
  2  1.97  2021-08-11 09:06:00.000
 88  1.33  2021-12-21 14:04:00.000
 10  1.95  2021-12-22 21:45:00.000
 46  1.33  2021-12-27 19:11:00.000
 23  1.95  2021-12-27 19:18:00.000
  8  1.97  2022-01-24 10:12:00.000
 10  1.95  2022-01-24 10:32:00.000
 11  1.33  2022-01-24 11:37:00.000
 11  1.95  2022-01-24 12:08:00.000

Demo on db<>fiddle here
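To see why the sum of the two row numbers works, it can help to look at them separately. The sketch below is not part of the original answer and assumes the same TableData columns used above; it only exposes the intermediate values. Within one uninterrupted run of the same Info, the overall ascending row number increases by 1 per row while the per-Info descending row number decreases by 1, so their sum is constant for that run and differs between runs.

select Info, dFError, qty
-- ascending position over all rows
, rn_overall = row_number() over (order by dFError)
-- descending position within each Info
, rn_per_info = row_number() over (partition by Info order by dFError desc)
-- constant within each uninterrupted run of the same Info
, Rnk = row_number() over (order by dFError)
      + row_number() over (partition by Info order by dFError desc)
from TableData
order by dFError;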

LukStorms
  • Great!! That's it. I'm not good with row_number, I'll try to read more about it. – Praxusa Feb 09 '22 at 15:02
  • There are more useful window functions like that. And [here's an example](https://stackoverflow.com/a/70162999/4003419) of an alternative method to solve this type of problem. – LukStorms Feb 09 '22 at 15:18
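As an illustration of the kind of alternative mentioned in the last comment, here is a hedged sketch (not necessarily the method behind the linked answer, and again assuming the TableData columns used above): flag each row where Info differs from the previous row with lag, turn the flags into run ids with a running sum, then aggregate per run.

with flagged as (
    select qty, Info, dFError
    -- 1 when the server changes compared to the previous request, else 0
    , chg = case when Info = lag(Info) over (order by dFError) then 0 else 1 end
    from TableData
), grouped as (
    select qty, Info, dFError
    -- running total of change flags = id of the current run
    , grp = sum(chg) over (order by dFError rows unbounded preceding)
    from flagged
)
select sum(qty) as qty, Info, max(dFError) as dErr
from grouped
group by Info, grp
order by dErr;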