
I'm running some stored procedures in SQL Server 2012 under Windows Server 2012 on a dedicated server with 32 GB of RAM and 8 CPU cores. CPU usage is always below 10%, and RAM usage is at 80% because SQL Server has 20 GB (of the 32 GB) assigned to it.

Some of the stored procedures take 4 hours on some days, while on other days, with almost the same data, they take 7 or 8 hours.

I'm using the least restrictive isolation level, so I don't think this is a locking problem. The database size is around 100 GB and the biggest table has around 5 million records.

The processes perform bulk inserts, updates and deletes (in some cases I can use TRUNCATE, which is minimally logged, to save some time). I'm also running some full-text search queries against one table.

I have full control of the server so I can change any configuration parameter.

I have a few questions:

  1. Is it possible to improve the performance of the queries using parallelism?
  2. Why is the CPU usage so low?
  3. What are the best practices for configuring SQL Server?
  4. What are the best free tools for auditing the server? I tried one from Microsoft called SQL Server 2012 BPA but the report is always empty with no warnings.
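A quick way to approach question 2 yourself is to look at wait statistics: if SQL Server spends its time waiting on storage (`PAGEIOLATCH_*`, `WRITELOG`), CPU stays low no matter how heavy the workload is. A minimal sketch against `sys.dm_os_wait_stats` (the excluded wait types are just a common starting list of benign system waits, not an exhaustive one):

```sql
-- Top waits since the last restart; high PAGEIOLATCH_*/WRITELOG with low CPU
-- usually points at storage, not at the processor.
SELECT TOP (10)
       wait_type,
       waiting_tasks_count,
       wait_time_ms,
       wait_time_ms - signal_wait_time_ms AS resource_wait_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('SLEEP_TASK', 'LAZYWRITER_SLEEP',
                        'SQLTRACE_BUFFER_FLUSH', 'XE_TIMER_EVENT',
                        'REQUEST_FOR_DEADLOCK_SEARCH', 'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;
```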

EDIT: I checked the SQL Server error log and found this:

03/18/2015 11:09:25,spid26s,Unknown,SQL Server has encountered 82 occurrence(s) of I/O requests taking longer than 15 seconds to complete on file [C:\Program Files\Microsoft SQL Server\MSSQL11.HLSQLSERVER\MSSQL\DATA\templog.ldf] in database [tempdb] (2). The OS file handle is 0x0000000000000BF8. The offset of the latest long I/O is: 0x00000001fe4000
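That message means tempdb's log file is suffering from very slow I/O. One way to quantify per-file latency is `sys.dm_io_virtual_file_stats`; a sketch (the counters are cumulative since the last service restart, so they show averages, not current spikes):

```sql
-- Average read/write latency per database file, worst writers first.
SELECT DB_NAME(vfs.database_id) AS database_name,
       mf.physical_name,
       vfs.num_of_reads,
       vfs.num_of_writes,
       CASE WHEN vfs.num_of_reads = 0 THEN 0
            ELSE vfs.io_stall_read_ms / vfs.num_of_reads END  AS avg_read_latency_ms,
       CASE WHEN vfs.num_of_writes = 0 THEN 0
            ELSE vfs.io_stall_write_ms / vfs.num_of_writes END AS avg_write_latency_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
  ON vfs.database_id = mf.database_id
 AND vfs.file_id     = mf.file_id
ORDER BY avg_write_latency_ms DESC;
```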

Francisco Goldenstein
  • Your problem is complex and there can be many reasons for your performance problem. The CPU usage is so low probably because the server is swapping to disk because the memory is full. (It's my guess) – Danielle Paquette-Harvey Mar 18 '15 at 12:48
  • You should start by seeing if you can improve the queries. Look at the execution plan. You can use SQL Server Profiler to see exactly what's going on on the server. You can even add the execution plan to Profiler so you can know which queries are too slow. – Danielle Paquette-Harvey Mar 18 '15 at 12:49
  • You can also use SQL Sentry Plan Explorer (free) to investigate the execution plan and post it here (anonymize it if necessary): http://www.sqlsentry.com/products/plan-explorer/sql-server-query-view#download – NickyvV Mar 18 '15 at 12:54
  • Apart from the execution plan, check whether the tables have indexes, and rebuild the indexes – nobody Mar 18 '15 at 14:43
  • You have given 20 GB to SQL Server and 12 GB to the OS. Is it possible for you to increase the 20 GB to 25 or maybe 26 GB? Again, the question is open-ended and we would need an execution plan or wait stats from when the query is performing badly to comment further. – Shanky Mar 18 '15 at 14:52
  • Unfortunately I cannot post any queries or execution plans because of my company's privacy terms. I was hoping that someone could help me with auditing the SQL server configuration or explaining to me why the CPU usage is so low. – Francisco Goldenstein Mar 18 '15 at 15:23

2 Answers

  1. Bump up max server memory to 24 GB.
  2. Move tempdb off the C: drive and consider multiple tempdb files, with auto-grow set to at least 128 MB or 256 MB.
  3. Install the Performance Dashboard and run its reports to see what queries are running and check waits.
  4. If you are using 10% auto-grow on user data and log files, change that to a fixed size similar to the tempdb growth above.
  5. Using the Performance Dashboard, check for obvious missing indexes that predict a 95% or higher improvement impact.
  6. Disregard all the naysayers who say not to do what I'm suggesting. If you do these 5 things and you're still having trouble, post some of the results from the Performance Dashboard, which, by the way, is free.
  7. One more thing that may be helpful: download and install the sp_whoisactive stored procedure, run it, and see what processes are running. Research the queries that you find after running sp_whoisactive.
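Steps 1, 2 and 4 above can be sketched in T-SQL roughly like this (the logical file name `templog` matches the default tempdb log file seen in the error message; adjust names and sizes to your own instance before running anything):

```sql
-- Step 1: raise max server memory to 24 GB (the setting is in MB).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 24576;
RECONFIGURE;

-- Steps 2/4: fixed-size auto-grow (256 MB) instead of percentage growth,
-- shown here for the tempdb log file.
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, SIZE = 1024MB, FILEGROWTH = 256MB);
```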
DBA TED

query taking hours but using low CPU

You say that as if CPU mattered for most database operations. Hint: it does not.

Databases need I/O. RAM in some cases helps mitigate this, but in the end it comes down to I/O.

And you know what I see in your question? CPU, memory (somehow assuming 32 GB is impressive), but NO WORD ON DISK LAYOUT.

And that is what matters: disks, and the distribution of files across them to spread the load.

If you look at the performance counters, you will see latency being super high on the disks, because whatever "pathetic" (in SQL Server terms) disk layout you have there, it simply is not up to the task.

Time to start buying. SSDs are a LOT cheaper than spinning disks. You may say, "How are they cheaper?" Well, you do not buy GB, you buy I/O. And the last time I checked, SSDs did not cost 100 times the price of disks, but they have 100 times or more the I/O. And we are always talking about random I/O.

Then isolate tempdb on a separate SSD. Tempdb either does not do a lot or does a TON, and you want to see which.

Then isolate the log file.

Make multiple data files, for the database and for tempdb (particularly tempdb: as many files as you have cores).

And yes, this will cost money. But in the end, you need I/O, and like most developers you got CPU instead. Bad for a database.
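The tempdb layout described above might be sketched like this (the `T:\` drive and the new file names are hypothetical; the file moves only take effect after a service restart, and you would repeat the ADD FILE step up to one file per core):

```sql
-- Move tempdb data and log to a dedicated drive (hypothetical path T:\).
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, FILENAME = 'T:\tempdb\tempdb.mdf');
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, FILENAME = 'T:\tempdb\templog.ldf');

-- Add an extra equally sized data file with fixed-size growth.
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = 'T:\tempdb\tempdb2.ndf',
          SIZE = 1024MB, FILEGROWTH = 256MB);
```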

TomTom
  • +1 for your comment. It's been a long time since I had this problem and I solved it the way you described: faster I/O, and spreading out the data files, log files and tempdb. Also query tuning helped a lot. – Francisco Goldenstein Apr 13 '16 at 14:48