
I have an approx. 1,000-line script which loops through all servers in AD. The script pings each server and runs a bunch of WMI queries if the ping succeeds. My script stores the results in a hashtable that I output to a CSV at the end of the script. This works, but it is slow: we are talking close to two hours. I've been looking into doing this more efficiently, and I think -AsJob sounds like a good idea.

But can I do this as concurrent jobs? Would my server handle the load? And is -AsJob the way to do it?

Hoping for some input while I'm waiting for my script to run its cycle.

EDIT

My suspicion is that the script spends most of its time waiting for Test-Connection (ping) to return true or false. I would like to run multiple pings at the same time.
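For reference, Test-Connection itself supports -AsJob in Windows PowerShell, so the pings for a whole list of hosts can be dispatched at once and collected afterwards. A minimal sketch (the $serverNames list is a placeholder assumption):

```powershell
# $serverNames is assumed to be an array of computer names
$serverNames = @('server01', 'server02', 'server03')

# Kick off all pings as a single background job (one ICMP echo per host)
$job = Test-Connection -ComputerName $serverNames -Count 1 -AsJob

# Block until the job finishes, then collect the Win32_PingStatus results
$results = $job | Wait-Job | Receive-Job

# StatusCode 0 means the host replied
$results | Select-Object Address, StatusCode
```

This runs the pings in parallel inside one job rather than one job per host, which keeps the job overhead down.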

EDIT 2

(NOTE: I have started a separate question, as I feel my original question has been answered. I include my current code anyway, as this has been requested. Thank you everyone for pitching in! New question here!)

Thank you everybody for the help so far! I've been asked to list my code to provide a real-world example of what I am trying to do.

This is a small, but valid excerpt from my code:

# List 4 servers (for testing)
$servers = Get-QADComputer -sizelimit 4 -WarningAction SilentlyContinue -OSName *server*,*hyper*

# Create list
$serverlist = @()

# Loop servers
foreach ($server in $servers) {

    # Fetch IP
    $ipaddress = [System.Net.Dns]::GetHostAddresses($server.name) | Select-Object -ExpandProperty IPAddressToString

    # Gather OSName through WMI
    $OSName = (Get-WmiObject Win32_OperatingSystem -ComputerName $server.name).caption

    # Ping the server
    if (Test-Connection -ComputerName $server.name -Count 1 -Quiet) {
        $reachable = "Yes"
    } else {
        $reachable = "No"
    }

    # Save info about server
    $serverInfo = New-Object -TypeName PSObject -Property @{
        SystemName = ($server.name).ToLower()
        IPAddress  = $ipaddress
        OSName     = $OSName
    }
    $serverlist += $serverInfo | Select-Object SystemName, IPAddress, OSName
}

Notes: I am outputting $serverlist to a CSV file at the end of the script. I list approx. 500 servers in my full script.

  • First you need to identify what parts are slow, then post those in your question so we can suggest improvements. – Andy Arismendi Mar 06 '12 at 13:39
  • If you're using `Test-Connection` in each iteration of the loop and if that's the slowest thing you would need to re-work the logic a bit to run the ping tests concurrently because you first need a list of servers, then you can test connections to them at the same time. You don't have to do all servers at once, you could do say 10-20 at a time. – Andy Arismendi Mar 06 '12 at 18:10
  • Thanks Andy! Would it be possible to rework my entire foreach over the server list into a Start-Job? Or would that be not so smart? :) – Sune Mar 06 '12 at 19:17
  • Depends on what all your loop is doing. Remember background jobs run in a separate PowerShell.exe instance, so they do not have access to objects in your current instance; they have to create the objects on their own. You can pass simple types (strings, ints, etc.) to them via ArgumentList. – Andy Arismendi Mar 06 '12 at 19:31

3 Answers


I would recommend writing a function that handles all of your server validation (ping, WMI, etc.), and then wrap that up in a Start-Job command. To avoid overloading your server's resources, I'd suggest wrapping some sort of "job manager" around your task.

foreach ($Server in $ServerList) {
    while ((Get-Job -State Running).Count -ge 20) {
        Start-Sleep -Seconds 5
    }
    # Start-Job here
}
  • Thank you Trevor! So your snippet doesn't run more than 20 concurrent jobs? That sounds smart. But how do I keep track of what is returned? – Sune Mar 07 '12 at 08:31
  • Hello Sune, you would use the Receive-Job cmdlet to receive the results of each job. You could either do this as they complete, inside the foreach loop (or better yet, in the while loop, while (haha) you're waiting for other jobs to finish), or after the foreach loop, when all jobs have finished. –  Mar 07 '12 at 15:40
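Putting this answer and its comments together, one hedged sketch of such a job manager (the Get-ServerInfo function name and the throttle value are placeholder assumptions, not part of the original answer):

```powershell
$throttle = 20
$results  = @()

foreach ($server in $serverList) {
    # Wait while the pool is full, draining finished jobs as we go
    while ((Get-Job -State Running).Count -ge $throttle) {
        Start-Sleep -Seconds 2
        Get-Job -State Completed | ForEach-Object {
            $results += Receive-Job -Job $_
            Remove-Job -Job $_
        }
    }
    # The script block runs in a separate PowerShell.exe instance,
    # so it must recreate what it needs; only simple types can be
    # passed in via -ArgumentList
    Start-Job -ScriptBlock {
        param($name)
        Get-ServerInfo -ComputerName $name   # hypothetical validation function
    } -ArgumentList $server.name | Out-Null
}

# Collect whatever is still outstanding
$results += Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job
```

At the end, $results can be piped to Export-Csv exactly as the single-threaded version did.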

Have you considered pinging only on failure of your first WMI request? If failures are rare, this would cut that piece of the workload out of the picture.

The ping would serve as your first step in troubleshooting why the wmi request failed.

I would combine this with kicking off multiple concurrent jobs.
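A hedged sketch of this ordering, with the ping demoted to a diagnostic that only runs when WMI has already failed (the error-handling details are assumptions):

```powershell
foreach ($server in $servers) {
    try {
        # Try the WMI query first; -ErrorAction Stop makes failures catchable
        $OSName    = (Get-WmiObject Win32_OperatingSystem -ComputerName $server.name -ErrorAction Stop).Caption
        $reachable = "Yes"
    } catch {
        # Ping only as a troubleshooting step for the failed WMI request
        $OSName = $null
        if (Test-Connection -ComputerName $server.name -Count 1 -Quiet) {
            $reachable = "Yes (WMI failed)"
        } else {
            $reachable = "No"
        }
    }
}
```

In the common case (server up, WMI working) no ping is sent at all.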

How exactly are you pinging? Can you share that piece of code? There may be optimizations there as well.


Test-Connection and ping are always too slow. I have a similar script: it gets all the servers in the domain and gathers statistical data from each (proc util, RAM util, disk free, all the NIC stats, etc.).

I found a test-port function (can't remember where); the port test only takes about 0.3 seconds, and I run Test-Connection only if it succeeds.

For all servers test port 135; for DCs test port 389.

function Verify_SinglePort {  ## returns true or false - avoids local host
    Param ($CPUNAME, $port)

    # Works whether we get $CPUNAME as a host name or an IP address.
    # $ScriptMachineFQDN is assumed to be set earlier in the script.
    if ($ScriptMachineFQDN -eq $CPUNAME) {
        ## not a good idea to test a port to and from the same machine
        return $true
    }
    try {
        $ip = [System.Net.Dns]::GetHostAddresses($CPUNAME) | Select-Object -ExpandProperty IPAddressToString
        if ($ip.GetType().Name -eq "Object[]") {
            # If we get several IPs for that address, take the first one
            $ip = $ip[0]
        }
    } catch {
        # Possibly $CPUNAME is a wrong host name or IP
        return $false
    }
    $t = New-Object Net.Sockets.TcpClient
    # Try/Catch suppresses the exception output if we can't connect
    try {
        $t.Connect($ip, $port)
    } catch {}
    if ($t.Connected) {
        $t.Close()
        return $true
    } else {
        # Port $port on $ip is closed; you may need to ask your IT team to open it
        return $false
    }
}
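One usage note: a plain TcpClient.Connect can block for many seconds when a port is filtered rather than refused, so the roughly 0.3-second figure is not guaranteed. An asynchronous connect gives an explicit timeout; a sketch (the function name and the 300 ms default are assumptions):

```powershell
function Test-PortFast {
    param($ComputerName, [int]$Port, [int]$TimeoutMs = 300)

    $client = New-Object System.Net.Sockets.TcpClient
    try {
        # Start the connect asynchronously, then wait at most $TimeoutMs
        $async = $client.BeginConnect($ComputerName, $Port, $null, $null)
        if ($async.AsyncWaitHandle.WaitOne($TimeoutMs) -and $client.Connected) {
            $client.EndConnect($async)
            return $true
        }
        return $false
    } catch {
        return $false
    } finally {
        $client.Close()
    }
}

# Example: RPC endpoint mapper on a member server, LDAP on a DC
Test-PortFast -ComputerName 'server01' -Port 135
Test-PortFast -ComputerName 'dc01' -Port 389
```

This bounds the worst case per host at the timeout instead of the OS TCP connect timeout.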