
I have a while loop that will run 300M times.

Each time this loop runs, it creates a one-row table with two columns. At the end of the iteration the table is deleted, and once the loop restarts the table is rebuilt, again with one row and two columns but with different data.

Currently I am using a table variable, but I would like your opinion on which type of table I should use to boost performance, given all this creating and erasing.
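To make the pattern concrete, here is a minimal sketch of what is described above (the two-column schema id/val is hypothetical, not from the question). Note that a table variable cannot be dropped mid-batch; it is declared once and emptied with DELETE each iteration:

    DECLARE @t TABLE (id INT, val VARCHAR(50));   -- hypothetical two-column schema
    DECLARE @i INT = 0;

    WHILE @i < 300000000
    BEGIN
        -- rebuild the single row with this iteration's data
        INSERT INTO @t (id, val) VALUES (@i, 'some data');

        -- ... per-iteration work against the one row in @t ...

        DELETE FROM @t;   -- empty the table again before the next iteration
        SET @i += 1;
    END;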

wr_lcb_ck
  • Why not use just two simple variables? – Ruslan K. Sep 15 '17 at 10:42
  • While loops in SQL Server executing 300 million times, hmmm, and you are concerned about their performance, really :) – M.Ali Sep 15 '17 at 10:44
  • @M.Ali I am limited to a few approaches in my case; this is one of them. From all my bad scenarios I have to find the least bad one :x – wr_lcb_ck Sep 15 '17 at 10:47
  • You may well be missing better options. Why not ask a question about how to achieve your end goal and include the reasoning behind any constraints. – Martin Smith Sep 15 '17 at 11:38
  • @MartinSmith Yeah, maybe you are right. Anyway, I have 4 columns: id | idguid | parentidguid | filepath. I need a final table with those 4 columns plus 27 other columns representing the maximum levels of depth, given the 4 million rows. 1st approach: a CTE; 2nd: row-by-row splitting of the path into the other columns; 3rd: a while loop performing the same logic; 4th: 27 self joins. The self joins will probably win, but I wanted to cover the possibilities (a sketch of the CTE option follows these comments). – wr_lcb_ck Sep 15 '17 at 13:19
  • This is literally the worst way to go about whatever you are trying to do. 300M loops? I'm surprised the query didn't time out. You've just asked an XY Problem. Edit your question and provide some clearer details. http://xyproblem.info/ – S3S Sep 15 '17 at 14:41
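For reference, a minimal sketch of the CTE approach mentioned in the comment above, run once over the whole table instead of 300M loop iterations. The table name dbo.Files and the column types are assumptions, not taken from the question:

    -- Hypothetical schema: dbo.Files(id, idguid, parentidguid, filepath)
    WITH tree AS
    (
        -- anchor: root rows (no parent) start at depth 1
        SELECT f.id, f.idguid, f.parentidguid, f.filepath, 1 AS depth
        FROM dbo.Files AS f
        WHERE f.parentidguid IS NULL

        UNION ALL

        -- recursive step: each child sits one level below its parent
        SELECT c.id, c.idguid, c.parentidguid, c.filepath, t.depth + 1
        FROM dbo.Files AS c
        JOIN tree AS t ON c.parentidguid = t.idguid
    )
    SELECT id, idguid, parentidguid, filepath, depth
    FROM tree
    OPTION (MAXRECURSION 27);   -- 27 = the stated maximum depth

The depth column can then be pivoted into the 27 extra columns with conditional aggregation or the self-join approach.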

1 Answer


In your case, the key would be to reuse the same table variable, since it is processed in memory. A temporary table would be much slower.
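For contrast, a rough sketch (again with hypothetical names) of the per-iteration temporary table the answer advises against; every pass pays a full CREATE/DROP round trip in tempdb:

    DECLARE @i INT = 0;

    WHILE @i < 300000000
    BEGIN
        CREATE TABLE #t (id INT, val VARCHAR(50));   -- created on every pass
        INSERT INTO #t (id, val) VALUES (@i, 'some data');

        -- ... per-iteration work ...

        DROP TABLE #t;                               -- dropped on every pass
        SET @i += 1;
    END;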

Thanks.

galloleonardo