I have an SSIS package (in Package Deployment) that copies data from one database/table to another database/table. This package has three connections:

  • one for the source database
  • one for the staging area (target)
  • one for the configurations database

When the package is opened, it fetches an environment variable for the config connection, then uses that connection to access SSIS_Configurations and set up the two other connections. When I open that file in Visual Studio on the server itself (so the file, VS and the config database are all on the same box), everything works as expected.
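For reference, SQL Server-based package configurations live in a table with this documented layout (the default table name is [SSIS Configurations]; yours may differ, but the column names are fixed by SSIS). At load time the package issues a query like the SELECT below to pull the connection strings for the remaining connection managers; the 'MyFilter' value is a placeholder:

```sql
-- Default layout of a SQL Server-based SSIS package-configuration table.
CREATE TABLE [SSIS Configurations] (
    ConfigurationFilter NVARCHAR(255) NOT NULL,
    ConfiguredValue     NVARCHAR(255) NULL,
    PackagePath         NVARCHAR(255) NOT NULL,
    ConfiguredValueType NVARCHAR(20)  NOT NULL
);

-- What SSIS effectively runs when loading the package:
SELECT ConfiguredValue, PackagePath, ConfiguredValueType
FROM   [SSIS Configurations]
WHERE  ConfigurationFilter = 'MyFilter';  -- placeholder filter name
```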

We want to de-install the tooling from the box on which the packages run. To that end, we are setting up a workstation. This station has Visual Studio installed (along with all the relevant database drivers) and has a network drive Z:\ mapped to the first server, leading to the SSIS package. Now, if I create a project on Z:\, add the package to it and open the package, I get an error. It connects to the config database just fine, but when it tries to retrieve the connection strings for the other two connections, I get

Warning loading .dtsx: Not enough storage is available to complete this operation

Any idea what causes this error?

I've already checked that

marc_s
steenbergh

5 Answers


The issue does not appear to be that there is not enough space on disk Z:. When the package tries to run, it is trying to check the available storage on disk Z:, which SSMS would not have read/write permissions to.

My recommendation is to run the package locally, and have it move the data to your target database within that package.

Another option (NOTE: untested) is to run the package via SSMS installed on the machine hosting the Z: drive, which would then have access to the Windows interops it needs to tell whether it has space to run.
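If you do end up running the package locally, a minimal invocation via dtexec (the command-line package runner that ships with SSIS) looks like the following; the file path and connection-manager name are placeholders for your own:

```
dtexec /File "Z:\Packages\CopyData.dtsx"

REM Or override the config connection manager's connection string
REM explicitly with the /Conn switch (name;connection_string):
dtexec /File "Z:\Packages\CopyData.dtsx" ^
       /Conn "ConfigDB";"Data Source=MyServer;Initial Catalog=Configs;Integrated Security=SSPI;"
```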

I have always run my packages locally and pushed to targets remotely for better control.

I hope this helps some.

C Sigmon

It's solved, finally. It turned out that somebody had manually altered the [SSIS Configuration] table so that the ConfiguredValue field was NVarchar(MAX) instead of NVarchar(255). This was done because one entry was 400 characters long.

We've set it to NVarchar(500), and it works now.
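For anyone hitting the same thing, the change described above is a one-line ALTER; the table and column names follow the standard SQL Server configuration layout, so adjust them if yours differ:

```sql
-- Shrink ConfiguredValue from NVARCHAR(MAX) back to a bounded length.
-- 500 is wide enough for the 400-character entry that forced the change.
ALTER TABLE [SSIS Configurations]
ALTER COLUMN ConfiguredValue NVARCHAR(500) NULL;
```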

steenbergh

I've run across this issue as well. RAM and disk space were more than enough. What got rid of the error for me was simply restarting Visual Studio.

confusedKid

I got this too. In my case, I was running an "Execute Package" task within a "Foreach Loop", and I got the "Not enough storage" error after about 1,100 iterations. This occurred whether I ran on a computer with plenty of RAM and disk or in a virtual machine with less of both: the error appeared after approximately 1,100 iterations in both cases.

From this I concluded that my error was caused by a resource leak in SSIS when running the "Execute Package" task.

The workaround that worked for me: I copied the contents of the sub-package into a "Sequence Container" task. Once I did this, I was able to run the Sequence Container task many thousands of times.

Edward

On your flat file connection's properties, look for the property "AlwaysCheckForRowDelimiters" and set it to FALSE.

Hope that helps.