
I'm writing a script where I'm trying to output the results of a Get-ChildItem command to a log file. The script below is simplified to show the issue I'm having; for example, the WriteLog function is used several times in the actual script, and the file listing is not the only thing added to the log file.

The snippet below writes a long run-on line of all full filenames to the log.

$FilePath = "G:\Folder"
$LogPathName = "G:\Folder\TestLog.log"
    
Function WriteLog {
    Param ([string]$LogString)
    $Stamp = Get-Date
    $LogMessage = "$Stamp - $LogString"
    Add-Content $LogPathName -Value $LogMessage
}
$FileList = Get-ChildItem -Path $FilePath -Include ('*.csv', '*.xlsx')

WriteLog $FileList

I want each filename to begin on a new line, like a list. How can I do this?

1 Answer


Leaving your function WriteLog as-is, the workaround is to iterate over each element of the array returned by Get-ChildItem so that the function appends to the file line by line:

foreach($item in $FileList) {
    WriteLog $item
}
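
As an aside, the reason the original call produced one long line is that binding an array to the [string]$LogString parameter stringifies it, joining the elements with the $OFS separator (a single space by default). A quick demonstration of that conversion:

$files = 'a.csv', 'b.csv', 'c.csv'
[string] $joined = $files   # the same cast the [string] parameter performs
$joined                     # -> 'a.csv b.csv c.csv', all on one line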

However, a more elegant approach would be to leverage ValueFromPipeline; then you could simply pipe Get-ChildItem into your function and let the process block handle each element. You can also add a -PassThru switch in case you want the same object returned as output for later manipulation. Lastly, it may be worth adding a -Path parameter to make the function reusable.

function Write-Log {
    param(
        [Parameter(ValueFromPipeline, Mandatory)]
        [object] $InputObject,

        [Parameter(Mandatory)]
        [string] $Path,

        [Parameter()]
        [switch] $PassThru
    )

    # Buffer all entries in a StringBuilder and write the file once in the
    # end block, instead of opening and closing it for every pipeline object.
    begin { $sb = [System.Text.StringBuilder]::new() }
    process {
        $null = $sb.AppendLine("$(Get-Date) - $InputObject")
        if ($PassThru.IsPresent) { $InputObject }
    }
    # -NoNewline because AppendLine already added a trailing newline
    end { Add-Content $Path -Value $sb.ToString() -NoNewline }
}

$FileList = Get-ChildItem -Path .\* -Include '*.csv', '*.xlsx' |
    Write-Log -Path "G:\Folder\TestLog.log" -PassThru
Santiago Squarzon
  • Might be out of the scope of the answer, but worth mentioning that `Add-Content` used like this is inefficient as it opens and closes the file for each log entry. – zett42 Aug 23 '22 at 23:57
  • @zett42 I agree but don't see a workaround for a logging function – Santiago Squarzon Aug 24 '22 at 00:30
  • Hi @SantiagoSquarzon. I believe one workaround to avoid hitting Add-Content for every pipeline object would be to initialize a collection in a begin block, add each incoming object to the collection in the process block, and finally add the whole collection at once in the end block – Daniel Aug 24 '22 at 03:40
  • @Daniel Excellent point, that makes a lot of sense. I updated using a `StringBuilder`, which seems more suitable in this case – Santiago Squarzon Aug 24 '22 at 03:55
  • A disadvantage of collecting output like this is that you can't look at the intermediate log of a long-running pipeline, because it only outputs when it is done (or never, when the script crashes). This is avoided by using SteppablePipeline [like this](https://gist.github.com/zett42/f38ce059af99d870c1591bff51244053); a sketch follows after these comments. It basically lets you chain a "sub pipeline" to the pipeline that your command is part of. See also [this answer](https://stackoverflow.com/a/73074477/7571258). – zett42 Aug 24 '22 at 07:51
  • @zett42 Not sure what you mean, `$InputObject` is output in the process block, this is a streaming function. Try `Get-ChildItem . | Write-Log -Path "TestLog.log" -PassThru | % { Start-Sleep -Milliseconds 200; 'Processing: {0}' -f $_.FullName }` – Santiago Squarzon Aug 24 '22 at 13:08
  • What I mean is that the log file is only written when `Get-ChildItem` has finished enumerating all files. Imagine a pipeline that runs for hours and then the script crashes (or there is a power outage); in that case no log entries are written and you don't know what the pipeline has done so far. – zett42 Aug 24 '22 at 13:19
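
For reference, here is a minimal sketch of the SteppablePipeline technique zett42 describes above. It is not his gist verbatim: the function name Write-LogStreaming is made up for illustration, and it assumes Add-Content keeps its file handle open while it is receiving pipeline input, so entries stream to disk as they arrive instead of being buffered until the end.

function Write-LogStreaming {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline, Mandatory)]
        [object] $InputObject,

        [Parameter(Mandatory)]
        [string] $Path,

        [Parameter()]
        [switch] $PassThru
    )

    begin {
        # Wrap Add-Content in a steppable pipeline: its begin/process/end
        # blocks are driven by ours, so the file is opened once and each
        # entry is written as it is processed. (Assumption: Add-Content
        # holds its handle open across pipeline input.)
        $scriptCmd = { Add-Content -Path $Path }
        $steppablePipeline = $scriptCmd.GetSteppablePipeline($MyInvocation.CommandOrigin)
        $steppablePipeline.Begin($true)   # $true = expect pipeline input
    }
    process {
        $steppablePipeline.Process("$(Get-Date) - $InputObject")
        if ($PassThru.IsPresent) { $InputObject }
    }
    end { $steppablePipeline.End() }
}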