
I have to create a temp table #Trades from a select query. The select query returns 77,987 rows and has about 175 columns (a mix of varchar, numeric, smallint, and int).

I am currently using the following query to create the temp table:

SELECT Col1, Col2 ... Col175 INTO #Trades FROM Table

[execution plan screenshot]

As per the query plan, the Table Insert operator takes the most time.

[execution plan screenshot]

Also, the Estimated Rows count does not match the Actual Rows count.

Is there any way I can improve the performance of the query used to create the #Trades temp table?

PS: I tried creating the temp table with CREATE TABLE first and changing the query to an INSERT INTO the temp table, but it did not help much.
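
Roughly what that attempt looks like, as a minimal sketch (the column names, types, and source table name below are placeholders, not the real 175-column definition; the WITH (TABLOCK) hint is just an extra option worth testing, not necessarily what was used):

    -- Placeholder #Trades definition; the real table has ~175 columns.
    CREATE TABLE #Trades
    (
        Col1 INT,
        Col2 VARCHAR(50),
        Col3 NUMERIC(18, 4)
        -- ... Col4 through Col175 ...
    );

    -- TABLOCK on a heap can allow minimally logged inserts, which is
    -- worth comparing against plain SELECT ... INTO.
    INSERT INTO #Trades WITH (TABLOCK) (Col1, Col2, Col3)
    SELECT Col1, Col2, Col3
    FROM dbo.SourceTable;   -- placeholder for the real source query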

– developer
  • The only way to improve the speed of the actual `INSERT` itself is hardware. When you're inserting data, you're writing it to disk; if you have a poorly performing disk, then you will have a poorly performing `INSERT`. That image, however, doesn't actually mean anything; of course the `INSERT` is going to be one of the most costly things. Again, it's writing to disk; that's not a fast process. I'm not really sure what else you were expecting here – Thom A Oct 30 '18 at 11:59
  • I see, thanks for the quick reply. Would it improve the performance if I create an actual table (staging table) instead of a temp table? – developer Oct 30 '18 at 12:02
  • Usually that works, but not in your case. If your tempdb drive is faster than the "data" drive, then it will be much slower. Try looking at waits during the insert; they might tell you about your system's bottleneck. – Slava Murygin Oct 30 '18 at 12:09
  • Have you searched Google? How about this? https://learn.microsoft.com/en-us/sql/relational-databases/in-memory-oltp/faster-temp-table-and-table-variable-by-using-memory-optimization?view=sql-server-2017 – Cato Oct 30 '18 at 13:00
  • As already mentioned, check the waits while it is running. If they're I/O waits then you might need more RAM or faster disks. – Nick.Mc Oct 30 '18 at 13:14
  • A temp table with nearly 80k rows and 175 columns screams of something gone wrong in the design phase. The number of rows is not really problematic, but 175 columns... – Sean Lange Oct 30 '18 at 13:21
  • @SeanLange - it is a data warehouse system with a denormalized db structure, hence it has 175 columns. But I agree with your point. – developer Oct 30 '18 at 13:26
  • What is the volume of your data in Gb? – sepupic Oct 30 '18 at 13:27
  • @SlavaMurygin, Nick - I am running it on a dev server which is slower than prod. However, it sounds like prod should not have such an issue (as it is faster than dev). Thanks. – developer Oct 30 '18 at 13:27
  • @Cato - unfortunately it is for SQL Server 2008 R2. – developer Oct 30 '18 at 13:28
  • @sepupic - estimated data size: 179 MB, actual data size: 650 KB – developer Oct 30 '18 at 13:30
  • 2008 R2 falls out of support on 7/9/2019. This might be a good time to talk to the boss about both a hardware and a software upgrade, to improve performance and also avoid any compliance issues. – Eric Brandt Oct 30 '18 at 14:01
  • Also, your row estimate problem _might_ be a stale statistics problem, so rebuilding stats _might_ give you a boost. – Eric Brandt Oct 30 '18 at 14:03
  • Ok, thanks @EricBrandt – developer Oct 30 '18 at 14:03
  • Tip: See [paste the plan](https://www.brentozar.com/pastetheplan/instructions/) for a better way to include an execution plan in your question. – HABO Oct 30 '18 at 14:09
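
On the suggestion from Slava Murygin and Nick.Mc to check waits while the insert runs: a minimal sketch using sys.dm_exec_requests, run from a second session (the session_id > 50 filter is only a rough way to hide system sessions):

    -- Check what the running SELECT ... INTO / INSERT is currently waiting on.
    SELECT r.session_id,
           r.status,
           r.command,
           r.wait_type,        -- NULL when the request is not currently waiting
           r.wait_time,        -- milliseconds spent in the current wait
           r.last_wait_type,
           r.logical_reads,
           r.writes
    FROM sys.dm_exec_requests AS r
    WHERE r.session_id <> @@SPID   -- ignore this monitoring query itself
      AND r.session_id > 50;       -- rough filter for user sessions

If the waits come back as PAGEIOLATCH_* or WRITELOG, that would point at the I/O bottleneck the comments describe.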
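
And on Eric Brandt's stale-statistics point about the estimated vs. actual row mismatch, a minimal sketch (dbo.SourceTable is a placeholder for whatever table feeds the SELECT ... INTO):

    -- Rebuild statistics on the source table with a full scan ...
    UPDATE STATISTICS dbo.SourceTable WITH FULLSCAN;

    -- ... or refresh out-of-date statistics across the whole database.
    EXEC sp_updatestats;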

0 Answers