UPDATE
As promised, I found another potential solution for you, DaveUK. Today I learned about runspaces in PowerShell, and I think this might be what we both are/were looking for. It takes a bit more work, but it uses thread-safe collections. The example below is taken from Sharing Variables and Live Objects Between PowerShell Runspaces by Boe Prox. His article describes in detail what the code does, how it works, and even how to run multiple 'jobs'.
Using Debug-Runspace, you can monitor it from other sessions. I will work on adding more detail to this answer in case that site ever disappears, but for now, hopefully this helps.
# Synchronized hashtable shared between the current session and the background runspace
$hash = [hashtable]::Synchronized(@{})
$hash.value = 1
$hash.Flag = $True
$hash.Host = $host

Write-Host ('Value of $Hash.value before background runspace is {0}' -f $hash.value) -ForegroundColor Green -BackgroundColor Black

# Create the background runspace and expose the hashtable inside it as $Hash
$runspace = [runspacefactory]::CreateRunspace()
$runspace.Open()
$runspace.SessionStateProxy.SetVariable('Hash',$hash)

# Attach a PowerShell instance to the runspace and queue the script to run
$powershell = [powershell]::Create()
$powershell.Runspace = $runspace
$powershell.AddScript({
    # Loop until the parent session sets $hash.Flag to $false
    While ($hash.Flag) {
        $hash.value++
        $hash.Services = Get-Service
        $hash.host.ui.WriteVerboseLine($hash.value)
        Start-Sleep -Seconds 5
    }
}) | Out-Null

# Start it asynchronously; the current window stays responsive
$handle = $powershell.BeginInvoke()
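To check on it later from the same window, you just read the shared hashtable. A minimal sketch (the Get-Runspace / Debug-Runspace / Enter-PSHostProcess lines assume PowerShell 5.0 or later):

# Read the values the background runspace keeps updating via the shared hashtable
$hash.value
$hash.Services | Select-Object -First 5

# From another PowerShell window you could attach first with
# Enter-PSHostProcess -Id <pid of this process>, then:
Get-Runspace                 # note the Id of the runspace created above
# Debug-Runspace -Id <Id>    # attach the debugger to that runspace

# Tell the loop to stop, then clean up
$hash.Flag = $false
$powershell.EndInvoke($handle)
$powershell.Dispose()
$runspace.Dispose()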
Previous response
I've been trying to monitor background jobs for several years without relying on the main window staying open, but I have yet to find a non-hacky solution. That leads me to believe it's not possible using the built-in *-Job functionality. So this isn't really an answer to the main question, but a workaround that I use: essentially building my own "job" setup, since the built-in one doesn't work the way I want it to. One note here: I rarely run a single job in a new process; it's usually one process that creates many jobs. If all you need is for the cmdlet to run for a long time, I'd just start it in its own PowerShell process with verbose logging to a file.
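For example, the single-cmdlet case might look something like this; the script path, log path, and Copy-Item workload are hypothetical placeholders, just to illustrate the idea:

# Launch the script in its own hidden PowerShell process, detached from this window
Start-Process -FilePath 'powershell.exe' -WindowStyle Hidden `
    -ArgumentList '-NoProfile', '-File', 'C:\Scripts\LongTask.ps1'

# Inside C:\Scripts\LongTask.ps1, append the verbose stream (4) to a log file
Copy-Item -Path 'C:\Data' -Destination 'D:\Backup' -Recurse -Verbose 4>> 'C:\Logs\LongTask.verbose.log'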
To run multiple jobs in a "background process", I would structure your cmdlets, or jobs, around a template of sorts:
- Log the output to a file instead of trying to rely on reading it from Receive-Job.
- Implement a way to stop the cmdlet when you're done with it. What I generally do is create a file when the job is started, named after the JobID; if that JobID file no longer exists, the job ends itself. That way, when you want to stop a job, you just remove its JobID file. I usually assign names to the jobs as well, so I remember which job is which.
- If you want a status report, use the same idea as the JobID file: drop something like a JobID-Status file, and the job will append a line with its current status to the log file and then remove that file. (A sketch of this file-based control logic follows the example script below.)
- Finally, you have to start the job process separately from the PowerShell window you're currently in. Using Start-Process to call the script that starts the jobs detaches it from the main window into a "background" process. Remember that the script you call should exit PowerShell when the jobs finish. Something like:
Example script to call with Start-Process:
$Jobs = @("Name1", "Name2")

# Start each job; the scriptblock should create its JobID file and log to its own file
ForEach ($JobName in $Jobs) {
    $Job = Start-Job -Name $JobName -ScriptBlock {
        # Code here, either a static file or from a variable,
        # with logic to create the JobID file and stop when it is removed
    }
}

# Wait until every job has finished, checking every two minutes
$JobsRunning = $true
while ($JobsRunning) {
    $JobsRunning = $false
    ForEach ($JobName in $Jobs) {
        if ((Get-Job -Name $JobName).State -ne "Completed") {
            $JobsRunning = $true
        }
    }
    Start-Sleep -Seconds 120
}

# Close this PowerShell process once all jobs are done
exit
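As promised above, here is a rough sketch of what the "Code here" section of each job's scriptblock could look like with the JobID / JobID-Status files. The C:\Jobs paths and the Do-Work placeholder are made up for illustration; $Using:JobName pulls the job's name in from the launching script:

$JobDir  = 'C:\Jobs\Running'                      # hypothetical folder for the control files
$LogFile = "C:\Jobs\Logs\$Using:JobName.log"      # per-job log file
New-Item -ItemType File -Path (Join-Path $JobDir $Using:JobName) -Force | Out-Null

# Removing the JobID file from $JobDir is how you stop the job
while (Test-Path (Join-Path $JobDir $Using:JobName)) {
    # Do-Work is a placeholder for the real work; redirect all of its streams to the log
    # Do-Work *>> $LogFile

    # If a JobID-Status file shows up, append a status line to the log and remove the file
    if (Test-Path (Join-Path $JobDir "$Using:JobName-Status")) {
        Add-Content -Path $LogFile -Value ("[{0}] {1} is still running" -f (Get-Date), $Using:JobName)
        Remove-Item -Path (Join-Path $JobDir "$Using:JobName-Status")
    }
    Start-Sleep -Seconds 60
}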
I don't have the full solution you were wanting, but after several years of wanting this feature and reading through everything on Google, M$, SO, etc., I've come to the conclusion that it's not possible to transfer jobs between PowerShell window sessions.
I hope I'm wrong and someone posts how to do it. If not, the approach outlined above works great for me. It's annoying and a bit of extra work, but it works.