
I'm not sure whether this question belongs here or on Stack Overflow. I'm trying here, since my question concerns memory leaks and memory management more than programming.

I have a SQL script that I attempt to run, and every time the server responds with:

Insufficient memory to complete this query (that is the gist, not the exact message)

Now, the script has more than 50,000 rows to insert, via statements like the examples below:

1=>

insert into Cities ([Name]) values (N'MyCityName')

2=>

insert into Sectors ([Name], [Description], City_CityId)(
    select N'FirstSector', N'1st Sect. Desc.', c.CityId 
        from Cities c 
        where c.[Name] like N'MyCityName')

3=>

insert into Streets ([Name], Direction_Value, Type_Value, SectorId, City_CityId)(
    select N'1st Street', 0, 10, s.SectorId, c.CityId
        from Cities c
            inner join Sectors s on s.City_CityId = c.CityId
        where c.[Name] like N'MyCityName' 
            and s.[Name] like N'FirstSector')

4=>

insert into Addresses (StreetNumber, NumberSuffix_Value, UnitSuiteAppt, StreetId, SectorId, CityId)(
    select 999, 0, N'', st.StreetId, s.SectorId, c.CityId 
        from Cities c
            inner join Sectors s on s.City_CityId = c.CityId
            inner join Streets st on st.SectorId = s.SectorId and st.City_CityId = c.CityId
        where c.[Name] like N'MyCityName'
            and s.[Name] like N'FirstSector'
            and st.[Name] like N'1st Street')

5=>

insert into People (Surname, FirstName, IsActive, AddressId)(
    select N'TheSurname', N'TheFirstName', 1, a.AddressId
        from Addresses a
            inner join Cities c on c.CityId = a.CityId
            inner join Streets s on s.StreetId = a.StreetId
        where a.StreetNumber = 999
            and a.NumberSuffix_Value = 0
            and a.UnitSuiteAppt = N''
            and c.[Name] like N'MyCityName'
            and s.[Name] like N'1st Street')

The approximate count of each kind of statement is as follows:

1 => 2
2 => 5
3 => ~700
4 => ~35,000
5 => ~35,000

Executing those thousands of statements triggers the insufficient-memory error. While the script runs, Task Manager shows SSMS using more than 400 MB of RAM.

My configuration is as follows:

Lenovo W700ds
2x320GB HDD 7200RPM RAID 0
4GB RAM DDR3
Intel Core 2 Quad 2.0GHz 6MB L2
Windows 7 Professional 64-bit (with all updates)
SQL Server 2005 Express services running 
    (that is my data server; I'm not using 2008 for this project)
SQL Server Management Studio 2008 Express 
    (SP3 installed, with all updates)

SSMS 2008 is the only application I have running while executing the insert statements.

Any thoughts on making this workable, whether through system optimization or other changes, are greatly appreciated.

Will Marcouiller

2 Answers


You simply need to install more memory; 4 GB is not much on a modern system. Just because SSMS is the only program running doesn't mean it's the only thing using memory (services are a big one). Also, since you are running Express edition, you probably haven't configured SQL Server's memory usage, and by default it tries to grab all the memory it can.
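For what it's worth, here is a minimal sketch of capping that appetite with sp_configure — the option name is real, but the 1024 MB value is purely illustrative for a 4 GB machine:

EXEC sp_configure 'show advanced options', 1;   -- 'max server memory' is an advanced option
RECONFIGURE;

EXEC sp_configure 'max server memory (MB)', 1024;   -- illustrative cap; leaves headroom for Windows and SSMS
RECONFIGURE;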

Zypher
  • So, if I understand correctly, this doesn't mean that SSMS has any memory leak; all that's required is to add some extra memory. This data has to be delivered by tomorrow morning. What are my other options? – Will Marcouiller Dec 05 '11 at 02:51
  • Correct, it isn't a memory leak; you are just trying to use more memory than is available on the system. I don't know whether you have other options; 50k rows is a pretty big dataset. – Zypher Dec 05 '11 at 03:08
  • Do you think it would succeed if I inserted only a subset of my streets, addresses, and people at a time? Say about 500 rows at once, and once completed, launch another 500-row batch? By chewing down the size of the dataset, should it work, in your experience? That is what I'm wondering, and I'm a bit scared to try, since I have already lost more than 40 hours this weekend alone trying to get my data into the database so that I can finally produce the report I have to build from it. – Will Marcouiller Dec 05 '11 at 03:13
  • I have another system with 8 GB of RAM, and though it is only a Core 2 Duo, I believe it could do the job (fingers crossed!). Now I wonder: if I insert these thousands of rows on that system and then take a full backup, will I be able to restore the database on my 4 GB system, or will the restore require the same amount of memory, even though the operations are different? – Will Marcouiller Dec 05 '11 at 03:21
  • If you try to do it in smaller batches, that should work. And no, I don't think the restore will hit the same problem. – Zypher Dec 05 '11 at 03:30
  • Thanks for your assistance, Zypher. =) I shall try inserting on my other system and then backing up and restoring onto my initial one. My question was about a memory leak, so I accepted your answer, as you did answer it. Thanks for the further help toward a possible solution. I'll let you know whether it worked. – Will Marcouiller Dec 05 '11 at 03:32
  • The backup-and-restore route didn't make it either. In the end, I went with the `BULK INSERT` SQL statement, as described in my question on SO (see the sketch below): http://stackoverflow.com/questions/8380621/how-to-use-bulk-insert-when-rows-depend-on-foreign-keys-values – Will Marcouiller Dec 08 '11 at 02:04
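For reference, a minimal sketch of that `BULK INSERT` route — the staging table, file path, and delimiters below are assumptions for illustration, not the asker's actual script:

BULK INSERT dbo.AddressesStaging        -- hypothetical staging table
FROM 'C:\data\addresses.txt'            -- hypothetical flat file holding the raw rows
WITH (
    FIELDTERMINATOR = '\t',             -- tab-separated file (assumption)
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 5000              -- commit every 5,000 rows to limit memory and log pressure
);

The foreign-key lookups (CityId, SectorId, StreetId) would then be resolved with set-based inserts from the staging table, as the linked SO question discusses.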

You can check whether a process has a memory leak by watching its working set in Perfmon.
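As a complement from inside SQL Server, here is a hedged sketch that sums the memory clerks DMV — sys.dm_os_memory_clerks and these column names exist in SQL Server 2005/2008 (later versions renamed the columns):

SELECT TOP (10)
       type,
       SUM(single_pages_kb + multi_pages_kb) AS kb_used   -- KB allocated by each clerk
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY kb_used DESC;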

Jim B