
I have a robocopy script that was working before.

The backup server broke and the new one is missing some configuration, but I'm not a Windows guy. :'(

The script is the following:

C:\Windows\system32\Robocopy F:\Equipos \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario *.* /purge /tee /e /log:F:\ScriptBackup\LogsBackup\NASSERVERBACKUP_horario.txt /nfl /r:1 /w:1

And the output is this:

-------------------------------------------------------------------------------
   ROBOCOPY     ::     Robust File Copy for Windows                              
-------------------------------------------------------------------------------

  Started : Fri May 24 16:35:01 2013

2013/05/24 16:35:02 ERROR 1450 (0x000005AA) Getting File System Type of Destination \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario\
Insufficient system resources exist to complete the requested service.


   Source : F:\Equipos\
     Dest - \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario\

    Files : *.*

  Options : *.* /NFL /TEE /S /E /COPY:DAT /PURGE /R:1 /W:1 

------------------------------------------------------------------------------

2013/05/24 16:35:02 ERROR 1450 (0x000005AA) Accessing Destination Directory \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario\
Insufficient system resources exist to complete the requested service.

Waiting 1 seconds... Retrying...
2013/05/24 16:35:03 ERROR 1450 (0x000005AA) Accessing Destination Directory \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario\
Insufficient system resources exist to complete the requested service.


ERROR: RETRY LIMIT EXCEEDED.

2013/05/24 16:35:03 ERROR 1450 (0x000005AA) Creating Destination Directory \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario\
Insufficient system resources exist to complete the requested service.

Waiting 1 seconds... Retrying...
2013/05/24 16:35:04 ERROR 1450 (0x000005AA) Creating Destination Directory \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario\
Insufficient system resources exist to complete the requested service.


ERROR: RETRY LIMIT EXCEEDED.

2013/05/24 16:35:04 ERROR 1168 (0x00000490) Creating Destination Directory \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario\
Element not found.

Does anyone know what could be wrong?

Thanks.

Marc Riera
    Does this really belong to stackoverflow.com? – Simon Mourier Jun 04 '13 at 15:39
  • Retrying once after waiting only a second is going to make it fail whenever there is a momentary problem. Retry at least a dozen times with a decent delay (30 seconds is the default): /R:12 /W:30 – Brian Jun 07 '13 at 12:59

4 Answers


Based on a similar issue discussed here, the errors you are getting relate to Windows memory management and the availability of a specific kind of resource (kernel paged pool memory), which can run out during backups of big file systems or of particularly large files.

Windows has a certain amount of memory pool space that it can allocate to programs; if a program uses all the memory available from that pool, ERROR 1450 (0x000005AA) is raised.

Several Microsoft Knowledge Base articles describe this error code.

In particular, Q304101 describes how to monitor the relevant resources to determine your state, and offers a possible solution: tuning the PoolUsageMaximum setting under Memory Management. This involves changing registry settings and therefore requires caution; read the article carefully before proceeding.
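
If you do go down that road, here is a minimal sketch of the registry change from an elevated command prompt, assuming the value of 60 (decimal) that the article suggests; a reboot is needed for it to take effect:

reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" /v PoolUsageMaximum /t REG_DWORD /d 60 /f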

One thing you might do is split the backup into several smaller backups; together with monitoring memory, this could help isolate the problem.

One additional hint: consider adding the /XJ switch to your command line so that robocopy eXcludes Junction points. This is important, for example, when copying user accounts (the \Users folder) on Windows versions such as Vista, because without it robocopy can end up in a loop caused by hidden links called junction points. A minimal sketch of the adjusted command follows.
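
Same paths as in the question, with only /xj added (adjust as needed):

C:\Windows\system32\Robocopy F:\Equipos \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario *.* /purge /tee /e /xj /log:F:\ScriptBackup\LogsBackup\NASSERVERBACKUP_horario.txt /nfl /r:1 /w:1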

Franco Rondini
  • Assuming paged pool depletion is the issue, and the computer has >4GB of RAM, the best long-term solution is to upgrade to a 64-bit version of the OS. – daalbert Jun 06 '13 at 20:58
  • @daalbert **Paged pool** is only a specific part of memory (i.e. it cannot expand to all 4 GB). According to the setting documented [here](http://technet.microsoft.com/en-us/library/cc976157.aspx), the system can behave in different ways: the default setting lets the system pick a value that is optimal for regular operations but not good for intensive backup jobs, so it is possible to change this setting to meet your needs. Please read **[Q304101](http://support.microsoft.com/default.aspx?scid=kb;en-us;Q304101)**, which provides further details. – Franco Rondini Jun 07 '13 at 05:57
  • Before anything else, I'd try the /XJ option to exclude possible loops. – Franco Rondini Jun 07 '13 at 06:05
  • Sorry if I wasn't very specific with the >4GB of RAM. Paged pool is dependent on available kernel-mode virtual address space, which is limited in part by the amount of RAM the computer has. So the _potential_ amount of paged pool available will go up as you add RAM to an x86 computer until you hit a ceiling at 4 GB; see [Memory Limits for Windows Releases](http://msdn.microsoft.com/en-us/library/aa366778%28v=vs.85%29.aspx#memory_limits) for details. Upgrading to a 64-bit version will help a little if the computer has <=4GB, but it will significantly help if it has more. – daalbert Jun 11 '13 at 21:18

The Robust File Copy (Robocopy) won't be very robust if set to retry only once after waiting only one second. Retrying multiple times after waiting long enough for transitory errors to resolve will succeed a lot more often.

/R:n :: number of Retries on failed copies: default 1 million.
/W:n :: Wait time between retries: default is 30 seconds.

Retry a dozen times with a 30 second wait between:

/R:12 /W:30
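
Applied to the command from the question, that would look something like this (same paths as the original, with only the retry options changed):

C:\Windows\system32\Robocopy F:\Equipos \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario *.* /purge /tee /e /log:F:\ScriptBackup\LogsBackup\NASSERVERBACKUP_horario.txt /nfl /r:12 /w:30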
Brian

I use 7-Zip to split any files larger than 4 GB into 650 MB chunks. (7-Zip limits which sizes you can split into, but anything under 4 GB works.) I then use Robocopy to copy each chunk down and 7-Zip to reassemble the chunks back into the original file. It even works with SQL backup files. I haven't tried to automate this yet, as I don't have to do it too often.
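
A rough sketch of that workflow using the 7-Zip command line (7z.exe); the staging paths and file names here are invented for illustration:

rem Split the big file into 650 MB volumes (produces backup.7z.001, backup.7z.002, ...)
7z a -v650m F:\Staging\backup.7z F:\Equipos\bigfile.bak

rem Copy the volumes across, with retries suited to a flaky link
robocopy F:\Staging \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Chunks backup.7z.* /r:12 /w:30

rem On the destination side, extract from the first volume to reassemble the file
7z x \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Chunks\backup.7z.001 -oF:\Restored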

Dave

I found that using /Z (restartable mode) works best. It is a bit slower, but at least it copes with larger files.
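
For reference, a sketch of the question's command with restartable mode enabled (everything else unchanged):

C:\Windows\system32\Robocopy F:\Equipos \\NASSERVERBACKUP\F$\BACKUPS_NASSERVER\Equipos_Horario *.* /purge /tee /e /z /log:F:\ScriptBackup\LogsBackup\NASSERVERBACKUP_horario.txt /nfl /r:1 /w:1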

Bigstoo