
Don't beat me up too much, guys; I tried to understand the other similar questions asked about this issue, but wasn't able to decipher them.

I'm using the snippet below to copy folders, separately, on a "remote" computer using Start-Job:

$Folders = @('Desktop','Documents','Favorites','Links','Downloads','Music','Videos','Pictures','Contacts')

foreach ($Folder in $Folders) {
    $ArgumentsArray  = @()
    $ArgumentsArray += "\\$Env:COMPUTERNAME\c$\Users\Abraham\$Folder"
    $ArgumentsArray += "C:\CopyBackup"
    $RoboCopy_Args   = @("/E","/R:5","/W:1","/MT:13")

    Start-Job -ScriptBlock { & C:\Windows\System32\Robocopy.exe $args[0] "$($args[1])\$using:Folder" $using:RoboCopy_Args } -ArgumentList $ArgumentsArray -Name $Folder
}

The code works as intended.

Now, the issue arises when I try to invoke the command remotely using Invoke-Command -AsJob. The error reads:

The pipeline was not run because a pipeline is already running. Pipelines cannot be run concurrently.

I tried to read up on this error before coming here, but wasn't able to understand why it won't work. What's odd is that the first folder in the $Folders array (Desktop) does get copied as a job and shows Completed, but the others show Failed immediately. So I ran Receive-Job -Name Documents, which is where I got the above error.

  • What does this have to do with the pipeline?
  • Is there a workaround?
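On the workaround question: each PSSession wraps a single runspace, and a runspace can run only one pipeline at a time, which is why only the first -AsJob call against the shared session succeeds. One possible workaround (a sketch using the question's paths, not tested here) is to give each folder its own session so every job gets its own runspace:

```powershell
# Sketch: one PSSession per folder, so each -AsJob call has its own
# runspace and the jobs can run concurrently. Paths as in the question.
$Folders = @('Desktop','Documents','Favorites','Links','Downloads','Music','Videos','Pictures','Contacts')

foreach ($Folder in $Folders) {
    $Session = New-PSSession -ComputerName $env:COMPUTERNAME -ErrorAction Stop

    Invoke-Command -Session $Session -AsJob -JobName $Folder -ArgumentList "C:\Users\Abraham\$Folder", "C:\CopyBackup" -ScriptBlock {
        & C:\Windows\System32\Robocopy.exe $args[0] "$($args[1])\$using:Folder" /E /R:5 /W:1 /MT:13
    }
}
```

The trade-off is one remoting connection per folder, so the sessions should be removed with Remove-PSSession once the jobs finish.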

Failed Code:

$PSSession = New-PSSession -ComputerName $env:COMPUTERNAME -ErrorAction Stop

$Folders = @('Desktop','Documents','Favorites','Links','Downloads','Music','Videos','Pictures','Contacts')

foreach ($Folder in $Folders) {
    $ArgumentsArray  = @()
    $ArgumentsArray += "C:\Users\Abraham\$Folder"
    $ArgumentsArray += "C:\CopyBackup"
    $RoboCopy_Args   = @("/E","/R:5","/W:1","/MT:13")

    Invoke-Command -ScriptBlock { & C:\Windows\System32\Robocopy.exe $args[0] "$($args[1])\$using:Folder" $using:RoboCopy_Args } -AsJob -ArgumentList $ArgumentsArray -Session $PSSession -JobName $Folder
}

Get-Job output:

Id     Name            PSJobTypeName   State         HasMoreData     Location             Command                  
--     ----            -------------   -----         -----------     --------             -------                  
518    Desktop         RemoteJob       Running       True            DESKTOP-OEREJ77       & C:\Windows\System32...
520    Documents       RemoteJob       Failed        False           DESKTOP-OEREJ77       & C:\Windows\System32...
522    Favorites       RemoteJob       Failed        False           DESKTOP-OEREJ77       & C:\Windows\System32...
524    Links           RemoteJob       Failed        False           DESKTOP-OEREJ77       & C:\Windows\System32...
526    Downloads       RemoteJob       Failed        False           DESKTOP-OEREJ77       & C:\Windows\System32...
528    Music           RemoteJob       Failed        False           DESKTOP-OEREJ77       & C:\Windows\System32...
530    Videos          RemoteJob       Failed        False           DESKTOP-OEREJ77       & C:\Windows\System32...
532    Pictures        RemoteJob       Failed        False           DESKTOP-OEREJ77       & C:\Windows\System32...
534    Contacts        RemoteJob       Failed        False           DESKTOP-OEREJ77       & C:\Windows\System32...

I'd like to stick with running this "as jobs" because of the StateChanged event jobs produce, which I'm using for a different purpose.
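For context, a minimal sketch of what such a StateChanged subscription might look like (the handler body here is hypothetical; the question doesn't show the real one):

```powershell
# Sketch: subscribe to a job's StateChanged event and react when it
# finishes. Start-Sleep stands in for the real robocopy scriptblock.
$Job = Start-Job -Name Desktop -ScriptBlock { Start-Sleep -Seconds 1 }

Register-ObjectEvent -InputObject $Job -EventName StateChanged -SourceIdentifier "Job.$($Job.Name)" -Action {
    $state = $EventArgs.JobStateInfo.State
    Write-Host "Job '$($Sender.Name)' changed state to $state"

    # Clean up the subscription once the job reaches a terminal state.
    if ($state -in 'Completed','Failed','Stopped') {
        Unregister-Event -SourceIdentifier $Event.SourceIdentifier
    }
}
```

This only fires for jobs visible in the local session, which is why moving the Start-Job calls onto the remote machine loses the event.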

EDIT: I've decided to wrap it all into one single Invoke-Command and send it over, but it's still copying one folder at a time.

$PSSession = New-PSSession -ComputerName $env:COMPUTERNAME -ErrorAction Stop

Invoke-Command -ScriptBlock {

    $Folders = @('Desktop','Documents','Favorites','Links','Downloads','Music','Videos','Pictures','Contacts')

    foreach ($Folder in $Folders) {
        $ArgumentsArray  = @()
        $ArgumentsArray += "C:\Users\Abraham\$Folder"
        $ArgumentsArray += "C:\CopyBackup"
        $RoboCopy_Args   = @("/E","/R:5","/W:1","/MT:13")

        Start-Job -ScriptBlock {
            & C:\Windows\System32\Robocopy.exe $args[0] "$($args[1])\$using:Folder" $using:RoboCopy_Args
        } -ArgumentList $ArgumentsArray -Name $Folder
    }
} -AsJob -Session $PSSession -JobName FolderCopy
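One variant of this single-Invoke-Command approach (a sketch with three folders for brevity, same paths as above) is to have the remote scriptblock wait on the inner jobs, so their robocopy output flows back through the outer job instead of being lost when the outer job exits:

```powershell
# Sketch: start the per-folder jobs on the remote machine, then wait for
# all of them and relay their output through the single outer job.
Invoke-Command -Session $PSSession -AsJob -JobName RoboCopyBatch -ScriptBlock {
    $Jobs = foreach ($Folder in 'Desktop','Documents','Favorites') {
        Start-Job -Name $Folder -ScriptBlock {
            & C:\Windows\System32\Robocopy.exe "C:\Users\Abraham\$using:Folder" "C:\CopyBackup\$using:Folder" /E /R:5 /W:1 /MT:13
        }
    }
    $Jobs | Wait-Job | Receive-Job
}
```

The inner jobs still run in parallel on the remote host; the outer job simply doesn't complete until they all have, so its StateChanged reflects the whole batch rather than each folder.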
  • Could it be related to invoking the command against the same host multiple times? Why not invoke the command once as a job and use a runspace inside the scriptblock? – Santiago Squarzon May 30 '21 at 04:24
  • @Santiago, too dumb to know how to do so lol – Abraham Zinala May 30 '21 at 04:28
  • You also need to consider that robocopy might be overloading the disk/CPU on the remote host; I'm not sure it's a good idea to back up multiple folders at the same time. – Santiago Squarzon May 30 '21 at 04:31
  • Just looking for the most efficient/fastest way to do it on a remote computer. I personally don't like the toll it takes on the CPU, but it's the best result I've been able to get so far. – Abraham Zinala May 30 '21 at 04:37
  • My understanding from that error is that only one remote pipeline can exist with the same host, and I suspect that each job is considered a pipeline, since each will be sending data independently. If you don't mind running the jobs synchronously, you could use a [queue](https://devblogs.microsoft.com/powershell/scaling-and-queuing-powershell-background-jobs/). – Daniel May 30 '21 at 04:54
  • If you would like the jobs to run asynchronously in parallel, maybe submit all the jobs directly on the remote host: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_remote_jobs?view=powershell-7.1#start-a-remote-job-that-keeps-the-results-on-the-remote-computer. Not sure how this will impact the other code you are running based on the StateChanged event, though. – Daniel May 30 '21 at 04:55
  • I will look into `queue`; some fancy stuff I wouldn't have come up with. Regarding your second comment, I was in the midst of doing so already; see the edit. The only issue is, just like you mentioned, I'm now unable to track the StateChanged event. – Abraham Zinala May 30 '21 at 05:10
  • If you want to run your script against many computers from one computer, the simple answer is to use SplitPipeline: https://github.com/nightroman/SplitPipeline. You can run your script against thousands of computers at the same time. – Aaron Jun 01 '21 at 03:41
  • @Aaron, thank ya! I'm unable to download any modules at work. – Abraham Zinala Jun 01 '21 at 03:54
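The queuing idea from the comments can be sketched like this (Start-Sleep stands in for the robocopy call; $Max is an illustrative throttle, not a value from the question):

```powershell
# Sketch: throttled job queue - never more than $Max jobs running at once.
$Max = 3

foreach ($Folder in 'Desktop','Documents','Favorites','Links','Downloads') {
    # Block until a slot frees up before starting the next job.
    while ((Get-Job -State Running).Count -ge $Max) {
        Get-Job -State Running | Wait-Job -Any | Out-Null
    }
    Start-Job -Name $Folder -ScriptBlock { Start-Sleep -Seconds 2 }
}

# Drain the queue once everything has been submitted.
Get-Job | Wait-Job | Out-Null
```

Each Start-Job still raises StateChanged locally, so this keeps the event-tracking the question depends on, at the cost of capped parallelism.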
