
(Resolved: The delay/issues I was having were not due to adding in try/catch, or redirecting errors to null. They were a result of testing the script with usernames that were not real, as opposed to actual/production names. The AD lookup times are obviously different -- I didn't think of that before.)

I've been editing https://github.com/dafthack/DomainPasswordSpray for a work project, with the intent of adding logging. The script writes to a logfile about 6 times a second, which throws IOException errors about accessibility of the file. It's a non-terminating error.

I'm ok with the error and potentially non-logged info. However, it clutters the console.

Since it's a non-terminating error, I tried Set-Content -ErrorAction 'SilentlyContinue', but the error is still displayed. There is no measurable performance impact, though -- just a cluttered console.

So, researching further, I tried a try/catch block, which does eliminate the error from the console, BUT the script now takes a tad over 2x as long to run.

Is there a 3rd alternative?

I can mitigate it by splitting the data list into two and running the script twice, concurrently. But it's less than ideal, as I'm already splitting the data list up to speed up the process. (A single list takes 2 hours, and 30m is the goal, so I'd need 8 windows to maintain current timing...)

Anyway. Hope that makes sense. Any thoughts/input appreciated. (I'm attempting to copy the code to a machine where I can upload a portion here for those who want to review, but Gmail blocks it. Working on it.)

The code causing issues:

          $FileLocation2 = "LOG-LastTested_$($Userlist)"
          # Write out the last user tried
          $tm = Get-Date
          # This file is written to so rapidly that IOException errors occur,
          # because it's still open: "in use by another process". SilentlyContinue
          # is not working, so using try/catch, which suppresses the error (but is 2x slower).

          Try {
              Write-Output "Num: $cu, Name: $name, Pass: $Password, Time: $tm" |
                  Set-Content -ErrorAction 'Stop' $FileLocation2
          }
          Catch [System.IO.IOException] { continue }
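A lighter-weight alternative, suggested in the comments below, is redirecting the error stream to $null instead of using try/catch. A sketch of how that would look with the same variables as above (untested against the actual script):

```powershell
# Redirect the error stream (stream 2) to $null so IOExceptions from the
# still-open logfile never reach the console. Unlike -ErrorAction, this
# discards the error record after the fact rather than asking the cmdlet
# to change how it raises the error.
Write-Output "Num: $cu, Name: $name, Pass: $Password, Time: $tm" |
    Set-Content $FileLocation2 2>$null
```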
  • Did you try to redirect errors to null? https://stackoverflow.com/a/67496840/5495709 – Mr. Sven Jul 23 '22 at 20:48
  • Since you're getting file access errors because of the frequency of errors have you thought to collect the errors in an array and only write on some time interval then clear the array and start over? BTW for Try/Catch to work you need a Terminating error thus the need for -ErrorAction Stop! – RetiredGeek Jul 23 '22 at 20:51
  • @RetiredGeek - The errors arise b/c of the rapidity of the writes (6/s, roughly), not the rapidity of the errors - does that alter your suggestion? – Stumpy Jones Jul 23 '22 at 21:01
  • @Mr.Sven I came across that this morning, but then forgot about it. Too early for me. :-) I'm going to give that a shot, once I finish testing another change, and will update here. I'm optimistic that may help - thanks! – Stumpy Jones Jul 23 '22 at 21:02
  • :-( Ok. I'm at a loss. I removed try-catch and tried 2>$null, with no difference in timing. I then reverted that change, and tested -- and the rate is still slower... So I'm not sure what gives. The only difference between the benchmarks were a change in the list of users. Not sure why that would matter - a "dummy" list vs a legit list... but then -- maybe because they're not actual users... THAT is slowing things down... not the code... *duh*. Just realized that. I'm going to put the try-catch back in, and run with a production list of users... Will let you know.... – Stumpy Jones Jul 23 '22 at 21:15
  • @StumpyJones that's why I suggested gathering the errors into an array and only writing at a time interval. Of course, you'll have to match the interval to the memory usage of the array to find the optimal solution. – RetiredGeek Jul 23 '22 at 21:22
  • Resolved: The entire issue was switching from a list of legit user names to fake user names for testing. The legit user names are running at the expected speed/rate. Sorry I did not realize/figure this out sooner. I do appreciate your input; it helped me work through the whole process. – Stumpy Jones Jul 23 '22 at 21:26
  • @RetiredGeek Yes -- I had a followup question, as I don't use powershell much, and wanted to be sure I understood your solution, which is a bit beyond my capability. One of the reasons for logging is due to crashes/etc - I wanted a record of the last data points attempted, and I wasn't sure if the array/delaying writing information would cause a loss of some of that information. – Stumpy Jones Jul 23 '22 at 21:31
  • That would be an issue if the items being tested were on the same computer as was used to run the script. I was under the impression that you were querying a server. – RetiredGeek Jul 23 '22 at 23:17
  • Querying a server, and writing results locally, and it's an issue. – Stumpy Jones Aug 05 '22 at 23:07
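RetiredGeek's suggestion above -- collect log lines in memory and only write on a time interval -- could be sketched as follows. Note this is my own illustration, not code from the thread: the $LogBuffer and $FlushInterval names are invented, and it appends a running log with Add-Content rather than overwriting the last entry as the original Set-Content line does.

```powershell
# One-time setup, before the spray loop:
$LogBuffer     = [System.Collections.Generic.List[string]]::new()
$FlushInterval = 5            # seconds between disk writes
$LastFlush     = Get-Date

# Inside the loop, replace the per-user Set-Content with a buffered add:
$LogBuffer.Add("Num: $cu, Name: $name, Pass: $Password, Time: $(Get-Date)")

# Flush the whole batch in a single file-open once per interval, so the
# file is touched far less often than 6 times/second:
if (((Get-Date) - $LastFlush).TotalSeconds -ge $FlushInterval) {
    Add-Content -Path $FileLocation2 -Value $LogBuffer -ErrorAction SilentlyContinue
    $LogBuffer.Clear()
    $LastFlush = Get-Date
}
```

The trade-off, given the crash-recovery concern raised below: a crash loses at most the last $FlushInterval seconds of entries still sitting in the buffer.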

1 Answer


Resolved: The delay/issues I was having were not due to adding in try/catch, or redirecting errors to null. They were a result of testing the script with usernames that were not real, as opposed to actual/production names. The AD lookup times are obviously different -- I didn't think of that before.

I was testing the error suppression (try/catch, redirect to null) with a list of dummy users. When I checked the rate of testing, it had dropped from 10 users/second with production accounts to 3/s with dummy accounts. After reverting all the error suppression, it was still at the slower 3/s rate. So I put my try/catch lines back in and tried with production accounts, and it worked great -- 8-10 users/sec.

So, the issue was due to the only change: using dummy accounts instead of production. Once I went back to all production accounts, WITH my error suppression, it worked like a champ. Hope that helps someone else.

Thanks to @RetiredGeek and @Mr.Sven who kept me testing and looking for a solution.
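For anyone still hitting the underlying IOException contention: one more option, which is my own suggestion rather than anything from the thread, is to open the logfile once with a .NET StreamWriter and keep the handle for the life of the script, so there is no per-write open/close to collide with itself:

```powershell
# Open the file once (second argument $false = overwrite, not append).
# StreamWriter holds the handle, so each WriteLine is just a buffered write
# instead of a fresh open/write/close that can race with the previous one.
$stream = [System.IO.StreamWriter]::new($FileLocation2, $false)
try {
    # Per-user write inside the spray loop:
    $stream.WriteLine("Num: $cu, Name: $name, Pass: $Password, Time: $(Get-Date)")
    $stream.Flush()   # flush each line so the last entry survives a crash
}
finally {
    $stream.Close()   # always release the handle when the script ends
}
```

Since the goal of the log is recording the last user attempted before a crash, the explicit Flush() matters; without it, buffered lines could be lost.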