Abstract

I work for a company that has roughly 10k computer assets in my domain. My issue is the time it takes to check whether a given user has ever logged into a given computer (i.e., whether a profile for them exists on it). We need this for audits, in case they've done something they shouldn't have.

I have two methods in mind that I've researched to complete this task, plus a third alternative I haven't thought of:

-Method A: Query every computer for "C:\Users\<USER>" to see whether a matching profile LocalPath exists

-Method B: Check every computer's registry under "HKU:\<SID>" to see whether the SID exists

-Method C: You are all smarter than me and have a better way? XD

Method A Function

$AllCompFound = @()
$AllADComputers = Get-ADComputer -Properties Name -SearchBase "WhatsItToYa" -Filter 'Name -like "*"' | Select-Object Name
ForEach ($Computer in $AllADComputers) {
    $CName = $Computer.Name
    if (Get-CimInstance -ComputerName "$CName" -ClassName Win32_Profile | ? { "C:\Users\'$EDIPI'" -contains $_.LocalPath }) {
        $AllCompFound += $CName
    } else {
        #DOOTHERSTUFF
    }
}

NOTE: I have another function that prompts me to enter a username to check for. Where I work usernames are numbers, so case sensitivity is not an issue. My problem with this function is that I believe the 'if' statement returns true every time simply because the query ran, rather than because the username actually matched.
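For reference, a minimal sketch of how that comparison could be written so it only fires on an actual path match. It assumes $EDIPI holds the username from that prompt, and it uses Win32_UserProfile (the class the answer below recommends) rather than Win32_Profile:

# Sketch only: compare each profile's LocalPath against the expected folder,
# so the 'if' is true only when the user's profile actually exists
$profiles = Get-CimInstance -ComputerName $CName -ClassName Win32_UserProfile
if ($profiles | Where-Object { $_.LocalPath -eq "C:\Users\$EDIPI" }) {
    $AllCompFound += $CName
}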

Method B Function

$AllCompFound = @()
$AllADComputers = Get-ADComputer -Properties Name -SearchBase "WhatsItToYa" -Filter 'Name -like "*"' | Select-Object Name
$hive = [Microsoft.Win32.RegistryHive]::Users
ForEach ($Computer in $AllADComputers) {
    try {
        $base = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($hive, $Computer.Name)
        $key = $base.OpenSubKey($strSID)
        if (!$key) {
            #DOSTUFF
        } else {
            $AllCompFound += $Computer.Name
            #DOOTHERSTUFF
        }
    } catch {
        #IDONTTHROWBECAUSEIWANTITTOCONTINUE
    } finally {
        if ($key) {
            $key.Close()
        }
        if ($base) {
            $base.Close()
        }
    }
}

NOTE: I have another function that converts the username into a SID prior to this function. It works.
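For context, a username-to-SID conversion typically looks something like the sketch below. This is illustrative only, not the poster's actual helper, and 'DOMAIN\1234567890' is a placeholder account name:

# Illustrative sketch: translate a username into its SID string for the HKU lookup
$strSID = ([System.Security.Principal.NTAccount]'DOMAIN\1234567890').Translate([System.Security.Principal.SecurityIdentifier]).Value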

Where my eyes start to glaze over is using Invoke-Command and actually getting a value back from it, and whether or not to run each of these queries in its own PSSession. My Method A returns false positives, and my Method B seems to hang on some computers.
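As a minimal sketch of the Invoke-Command piece (reusing $CName and $EDIPI from the code above): whatever the remote script block writes to its output stream comes back to the caller, so the result can be tested locally:

# Sketch only: the remote script block's output is returned to the local session
$paths = Invoke-Command -ComputerName $CName -ScriptBlock {
    (Get-CimInstance -ClassName Win32_UserProfile).LocalPath
}
if ($paths -contains "C:\Users\$EDIPI") {
    $AllCompFound += $CName
}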

Neither of these methods is really fast enough to get through 10k computers, so I've been testing against smaller pools of computers when these requests come in. I'm by no means an expert, but I think I have a decent understanding, so any help is appreciated!

Karmaxdw
  • If you can't come to the mountain, let the mountain come to you :) - write a script that enumerates the local profiles, then sends the list of SIDs or usernames somewhere else (to a simple web service, for example). Then, using a GPO or whatever desktop management software you might have available, deploy a scheduled task to every single machine that runs the script. Then sit back and wait for the data to be automatically reported to you – Mathias R. Jessen Aug 21 '20 at 16:28
  • I like the idea, however, we do not have access/permission to use GPO or desktop management software. This also doesn't need to happen every day, so I don't want to make it a scheduled item if I'm only running these requests on demand. – Karmaxdw Aug 21 '20 at 19:00

1 Answer

First, use the WMI class Win32_UserProfile, not C:\Users or the registry.

Second, have each PC report to some central database instead of querying every PC from a server. That is usually much better.

About GPO: if you do get access, you can add or remove a scheduled task for such reports through GPP (Group Policy Preferences, rather than a GPO script) from time to time.
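A rough sketch of what such a per-machine report could look like. The endpoint URL and payload shape here are hypothetical, and how the script gets deployed is up to whatever mechanism is available:

# Runs locally on each machine, e.g. from a scheduled task.
# The URL is a placeholder for whatever collection endpoint you stand up.
$report = Get-CimInstance -ClassName Win32_UserProfile |
    Select-Object SID, LocalPath, LastUseTime
Invoke-RestMethod -Uri 'https://inventory.example.local/api/profiles' -Method Post -Body ($report | ConvertTo-Json) -ContentType 'application/json'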

Third, use PoshRSJob to run the queries in parallel.

Get-WmiObject -Class 'Win32_UserProfile' |
    Select-Object @(
        'SID',
        @{
            Name       = 'LastUseTime'
            Expression = { $_.ConvertToDateTime($_.LastUseTime) }
        },
        @{
            Name       = 'NTAccount'
            Expression = { [System.Security.Principal.SecurityIdentifier]::new($_.SID).Translate([System.Security.Principal.NTAccount]) }
        }
    )

Be careful with translating to NTAccount: if a SID does not translate, it throws an error, so it may be better not to collect NTAccount as part of this query.
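If you do want to keep the NTAccount column, one option (a small sketch, not part of the original answer) is to guard the translation so one unresolvable SID does not abort the query:

# Sketch: swallow translation failures; NTAccount is left $null for orphaned/local SIDs
@{
    Name       = 'NTAccount'
    Expression = {
        try { [System.Security.Principal.SecurityIdentifier]::new($_.SID).Translate([System.Security.Principal.NTAccount]).Value }
        catch { $null }
    }
}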

If you have no other options, use parallel jobs with PoshRSJob.

Example of parallelizing (there may be some typos):


$ToDo = [System.Collections.Concurrent.ConcurrentQueue[string]]::new() # Queue of computer names that should be processed
<# Some loop through your computers #>
    <#...#> $ToDo.Enqueue($computerName)
<#LoopEnd#>
$result = [System.Collections.Concurrent.ConcurrentBag[Object]]::new() # Bag collecting the processing results


# This function takes a computer name as input and outputs a single result object for that computer
Function Get-MySpecialComputerStats
{
    [CmdletBinding()] # advanced function, so common parameters like -ErrorAction (-EA) work when calling it
    Param(
        [String]$ComputerName
    )
    <#Some magic#>
    # Build the result object (a PSCustomObject made from a hashtable)
    return [PSCustomObject]@{
        ComputerName = $ComputerName;
        Result = 'OK'
        SomeAdditionalInfo1 = 'whateverYouWant'
        SomeAdditionalInfo2 = 42 # Because 42
    }
}


# This is the script block that runs in the background jobs. It should not write anything
# to the output stream except the log it returns at the end.
# It takes 2 args: the 1st is the input queue, the 2nd is the output bag

$JobScript = [scriptblock]{
    $inQueue = [System.Collections.Concurrent.ConcurrentQueue[string]]$args[0]
    $outBag = [System.Collections.Concurrent.ConcurrentBag[Object]]$args[1]
    $compName = $null
    
    # Logging inside, if you need it
    $log = [System.Text.StringBuilder]::new()
    
    # we work until inQueue is empty ( then TryDequeue will return false )
    while($inQueue.TryDequeue([ref] $compName) -eq $true)
    {
        $r= $null
        try 
        {
            $r = Get-MySpecialComputerStats -ComputerName $compName -EA Stop
            [void]$log.AppendLine("[_]: $($compName) : OK!")
            [void]$outBag.Add($r) # We append result to outBag
        }
        catch
        {
            [void]$log.AppendLine("[E]: $($compName) : $($_.Exception.Message)")
        }
    }

    # we return log.
    return $log.ToString()
}

# Some progress counters
$i_max = $ToDo.Count
$i_cur = $i_max

# Start 20 jobs. Don't forget to make our helper function available inside the jobs via -FunctionsToImport
$jobs = @(1..20) <# Run 20 threads #> | % { Start-RSJob -ScriptBlock $JobScript -ArgumentList @($ToDo, $result) -FunctionsToImport 'Get-MySpecialComputerStats'  }

# Every 3 seconds, check how many entries are left in the queue ($ToDo)
while ($i_cur -gt 0)
{
    Write-Progress -Activity 'Working' -Status "$($i_cur) left of $($i_max) computers" -PercentComplete (100 - ($i_cur / $i_max * 100)) 
    Start-Sleep -Seconds 3
    $i_cur = $ToDo.Count
}

# When the queue hits zero, wait for the jobs to finish their last items and collect their logs
$logs = $jobs | % { Wait-RSJob -Job $_ } | % { Receive-RSJob -Job $_ } 
# $logs contains the job logs, not the results

# The results are in the $result variable
$result | Export-Clixml -Path 'P:/ath/to/file.clixml' # Exporting result to CliXML file, or whatever you want

Please be careful: $JobScript produces no output of its own apart from the log it returns, so it must be written carefully, and Get-MySpecialComputerStats must be tested against unusual cases so that it always returns a value that can be interpreted.

filimonic
  • And take into account: If you stop script with Ctrl+C in the middle, jobs are still running in background. Use `Get-RSJob | Stop-RSJob ` to terminate them. – filimonic Aug 21 '20 at 23:00