
I am trying to execute a large SQL script which contains about 1,000,000 simple UPDATE queries.

The total size of this script file is about 100 MB.

When I run this script, I get an Out Of Memory exception.

When I split the file into chunks of 10 MB, I can run each of them.

However, for convenience I would like to have only one script I can run in a single pass. Is there any statement I can introduce so that SQL Server releases the allocated memory after running each query, so I can execute this large script at once?

DoNotArrestMe
David

3 Answers


If you have not done so already, insert a GO every thousand statements or so. Otherwise the whole file will be one large batch. SQL Server compiles a single execution plan per batch, which may be pushing you up against resource limits.
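For example, a minimal sketch of the layout (the table and column names here are placeholders, not from the original script):

UPDATE dbo.MyTable SET SomeColumn = 'x' WHERE Id = 1;
UPDATE dbo.MyTable SET SomeColumn = 'x' WHERE Id = 2;
-- ... roughly a thousand more UPDATE statements ...
GO

UPDATE dbo.MyTable SET SomeColumn = 'x' WHERE Id = 1001;
-- ... next chunk ...
GO

Note that GO is a batch separator recognized by client tools such as SSMS and sqlcmd, not a T-SQL statement; each chunk between GO separators is parsed, planned, and executed on its own.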

You might run into another type of resource problem if you run the whole file in the same transaction. The larger the transaction, the more disk space your transaction log file will need to complete the processing of your file.
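By default SQL Server runs in autocommit mode, so each UPDATE commits on its own. If your script instead wraps everything in one explicit transaction, a hedged sketch of committing per chunk (statement bodies are placeholders):

BEGIN TRANSACTION;
UPDATE dbo.MyTable SET SomeColumn = 'x' WHERE Id = 1;
-- ... rest of this chunk ...
COMMIT TRANSACTION;
GO

This keeps the active portion of the transaction log small; in the SIMPLE recovery model, the log space used by committed chunks can be reused after a checkpoint.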

Ralf

UPDATED: Open the Windows command prompt (Win+R, then type cmd) and run the following:

sqlcmd -S YOURSQLSERVER\INSTANCENAME -i c:\your_script_file.sql -o c:\your_log_file.log
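sqlcmd executes the file batch by batch at each GO separator and writes results to the log file rather than an in-memory results grid. A hedged variant, assuming Windows authentication (-E) and that you want the run to abort at the first failing batch (-b):

sqlcmd -S YOURSQLSERVER\INSTANCENAME -E -b -i c:\your_script_file.sql -o c:\your_log_file.log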
user2997826
    This does not solve my issue. With this command I still run into the Out Of Memory issue. – David Jan 21 '14 at 10:36

I've noticed what appears to be a memory leak in SSMS 2012 related to the results window. Try putting SET NOCOUNT ON at the top of your script to prevent the results pane from being filled up with "1 row(s) affected" messages. Alternatively, is there a way to write your update statements as one statement so you're not doing a constant back and forth between the server and the client?
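A minimal sketch of both suggestions (the table, column, and values list are hypothetical stand-ins for the real data):

SET NOCOUNT ON;  -- suppress the per-statement "1 row(s) affected" messages

-- Set-based alternative: one UPDATE driven by a list of key/value pairs
-- instead of one statement per row.
UPDATE t
SET    t.SomeColumn = v.NewValue
FROM   dbo.MyTable AS t
JOIN   (VALUES (1, 'a'), (2, 'b'), (3, 'c')) AS v(Id, NewValue)
    ON t.Id = v.Id;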

Ben Thul
Thanks for sharing. I tried this, but it was not the issue for me. Anyway, I will keep this in mind when I run into similar issues. – David Jan 21 '14 at 10:38