I have the PowerShell script below that moves files to my Amazon S3 bucket, and it works fine for a few small files. However, when copying larger files, the foreach loop continues and starts the next copy before the previous ones have finished, and it doesn't take long before I have hundreds of files all transferring at once.

What I want is to limit the number of simultaneous file transfers to, say, 5 or 10. How can I do that?
foreach ($line in $csv) {
    #-------------------- Transfer files --------------------
    $SourceFolder = $line.destination
    $sourceFile   = $line.name
    if (Test-Path -Path $SourceFolder) {
        Write-S3Object -BucketName $BucketName -Key $sourceFile -File $SourceFolder
        # Check for missing files
        $S3GetRequest = Get-S3Object -BucketName $BucketName -Key $sourceFile
        if ($null -eq $S3GetRequest) {
            Write-Error "ERROR: Amazon S3 get request failed. Script halted."
            "$sourceFile,Transfer Error" | Out-File $log_loc -Append
        }
    } else {
        "$SourceFolder,Missing File Error" | Out-File $log_loc -Append
    }
}
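For reference, this is the kind of throttling I have in mind: a sketch that wraps each transfer in a background job and caps the number of running jobs at 5 (it assumes the AWSPowerShell module and reuses my `$csv`, `$BucketName`, and `$log_loc` variables; I have not verified it against my bucket):

```powershell
# Sketch: cap concurrent S3 transfers using background jobs.
$maxConcurrent = 5

foreach ($line in $csv) {
    # Wait for a free job slot before starting another transfer.
    while ((Get-Job -State Running).Count -ge $maxConcurrent) {
        Start-Sleep -Seconds 1
    }

    Start-Job -ArgumentList $line.destination, $line.name, $BucketName, $log_loc -ScriptBlock {
        param($sourceFolder, $sourceFile, $bucketName, $logLoc)
        # Each job runs in its own session, so the S3 cmdlets must be loaded again.
        Import-Module AWSPowerShell
        if (Test-Path -Path $sourceFolder) {
            Write-S3Object -BucketName $bucketName -Key $sourceFile -File $sourceFolder
            # Check for missing files
            $s3Object = Get-S3Object -BucketName $bucketName -Key $sourceFile
            if ($null -eq $s3Object) {
                "$sourceFile,Transfer Error" | Out-File $logLoc -Append
            }
        } else {
            "$sourceFolder,Missing File Error" | Out-File $logLoc -Append
        }
    } | Out-Null
}

# Wait for the remaining transfers to finish, then clean up.
Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job
```

One thing I'm unsure about is the logging: several jobs could hit `Out-File -Append` on the same log at once, so a shared log file might need a lock or per-job log files. Is this job-based approach the right direction, or is there a cleaner built-in way to throttle?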