2

I want to compute an aggregate of NetQuantity from the table BillDetails, keeping only groups where the aggregated value is non-zero. I wrote the query below, but it feels unoptimized. Can anyone refactor it? Any attempt will be appreciated.

SELECT * FROM
(
    SELECT DISTINCT CustId, Name, SUM(NetQuantity) OVER(PARTITION BY CustId, Name) AS Quantity
        FROM BillDetails
) T WHERE Quantity <> 0
N K
  • 3,217
  • 5
  • 23
  • 38
  • Why don't you use a simple aggregate query? `SELECT CustId, Name, SUM(NetQuantity) AS Quantity FROM BillDetails GROUP BY CustId, Name HAVING SUM(NetQuantity) <> 0` – Nikola Markovinović Jun 24 '13 at 09:55
  • Do you have any idea about the performance of this one? That would be even more helpful and appreciated. – N K Jun 24 '13 at 10:02
  • Should be very fast __if__ you have an index on `BillDetails (CustID, Name) include (NetQuantity)`. – Nikola Markovinović Jun 24 '13 at 10:03
  • @Nikola Markovinović Analytic functions like SUM() OVER are much faster than simple aggregate functions using GROUP BY. – VahiD Jun 24 '13 at 10:17
  • @VahiD I don't see why it would _always_ be the case, and in this particular situation with the index I've mentioned above a simple GROUP BY outperforms the window function by 9 times on my test table (5 million records, a small number of distinct CustID values). Note the DISTINCT in the above query and the PARTITION BY on the same columns – there is a lot of repeated work done. – Nikola Markovinović Jun 24 '13 at 10:42
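Putting the suggestions from these comments into concrete form, a minimal sketch might look like this (T-SQL is assumed from the `include` syntax; the index name is made up):

-- Covering index suggested in the comments (hypothetical name):
CREATE NONCLUSTERED INDEX IX_BillDetails_CustId_Name
    ON BillDetails (CustId, Name)
    INCLUDE (NetQuantity);

-- Plain aggregate instead of DISTINCT + window function:
SELECT CustId, Name, SUM(NetQuantity) AS Quantity
FROM BillDetails
GROUP BY CustId, Name
HAVING SUM(NetQuantity) <> 0;

With the covering index in place, the GROUP BY can be answered from an ordered index scan, which should avoid the extra work implied by DISTINCT over a window aggregate.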

1 Answer

0

There is no way to optimize this query; it is the only way to get these results. If you have appropriate indices on your table, there will be no performance issues. UPDATE: to pre-filter zero-value rows you can use the query below. Rows with NetQuantity = 0 contribute nothing to any sum, so dropping them up front does not change the aggregated totals:

SELECT * FROM
(
    SELECT DISTINCT CustId, Name, SUM(NetQuantity) OVER (PARTITION BY CustId, Name) AS Quantity
    FROM BillDetails
    WHERE NetQuantity <> 0
) T
WHERE Quantity <> 0
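Not part of the original answer, but a simple way to compare the candidate queries on your own data is SQL Server's statistics output; the sketch below runs the GROUP BY form from the comments with this answer's pre-filter applied:

SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- Run each candidate query and compare the logical reads / CPU time reported.
SELECT CustId, Name, SUM(NetQuantity) AS Quantity
FROM BillDetails
WHERE NetQuantity <> 0            -- pre-filter from this answer
GROUP BY CustId, Name
HAVING SUM(NetQuantity) <> 0;     -- still needed: positive and negative rows can cancel out

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;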
VahiD
  • 1,014
  • 1
  • 14
  • 30
  • It's taking the aggregate over the whole table and then filtering the result. Are you still saying there is no performance degradation? – N K Jun 24 '13 at 09:03
  • If NetQuantity can store negative numbers, there is no way to pre-filter the rows that produce a zero SUM(NetQuantity), except rows that have 0 as NetQuantity, and I think the frequency of zero-value rows is not high enough to cause a performance issue. – VahiD Jun 24 '13 at 09:16
  • In my case the frequency of zero aggregate values is more than 20%. The total record count is in the thousands. – N K Jun 24 '13 at 09:23
  • What is the frequency of zero-value rows, not zero aggregates? – VahiD Jun 24 '13 at 09:26
  • You can't do anything about zero aggregate values, but you can pre-filter zero-value rows for better performance. First of all, though, you have to declare appropriate indices on your table. I will edit my answer with the pre-filter query. – VahiD Jun 24 '13 at 09:37
  • Sorry, it doesn't seem to fit my situation very well. Anyway, thanks for your attempt. – N K Jun 24 '13 at 10:18
  • And for your information, please answer a question only when you actually have an answer. – N K Jun 24 '13 at 10:23
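To make the point about negative values from these comments concrete, here is a small self-contained sketch with made-up sample rows: quantities that cancel each other out can only be removed after aggregation (by HAVING or the outer Quantity <> 0 filter), while rows that are exactly zero are safe to drop beforehand.

DECLARE @BillDetails TABLE (CustId INT, Name VARCHAR(50), NetQuantity INT);

INSERT INTO @BillDetails (CustId, Name, NetQuantity) VALUES
    (1, 'A',  5),   -- +5 and -5 cancel out ...
    (1, 'A', -5),   -- ... so only HAVING can remove this group
    (2, 'B',  0),   -- exactly zero: safe to pre-filter
    (2, 'B',  3);

SELECT CustId, Name, SUM(NetQuantity) AS Quantity
FROM @BillDetails
WHERE NetQuantity <> 0
GROUP BY CustId, Name
HAVING SUM(NetQuantity) <> 0;   -- returns only (2, 'B', 3)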