
I have a client with a large file server (on 2012r2). The client accesses the server using machines running Windows 10 Pro. The client is not using a file management system and so the files are just organized manually into folders and subfolders. The client wants to copy all files on the server that contain a particular string (in the file not in the filename) to a second location. The trouble is the files are in various directories and sub-directories and the files often have the same name. Is there a way to use xcopy or robocopy from the CLI or copy-item from powershell to recursively copy only those files that contain a particular string or that have a particular author to a new location while maintaining the folder structure?

1 Answer


The following script might do what you want. Just edit the three variables at the beginning:

$sSrcDir = "U:\Data\SCRIPTS"
$sDestination = "F:\temp\t"    # No terminating backslash!
$sPattern = "ffmpeg"

(Get-ChildItem -Recurse -File -LiteralPath "$sSrcDir" | Select-String -Pattern "$sPattern" -List).Path | ForEach-Object {
  # $_ holds the full path name of a matching file.
  # Strip the drive qualifier, then take the parent directory.
  $sDirToCreate = $_ | Split-Path -NoQualifier | Split-Path -Parent
  # Recreate that directory under the destination (-Force also accepts a preexisting one);
  # assign to $null to suppress mkdir's output.
  $null = mkdir -Force "$sDestination$sDirToCreate"
  Copy-Item -LiteralPath "$_" -Destination "$sDestination$sDirToCreate"
}
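One caveat: `Select-String -Pattern` interprets the pattern as a regular expression, so a search string containing characters like `.` or `(` may match more than intended. Adding `-SimpleMatch` treats it as a literal string. As a sketch (reusing the same `$sSrcDir` and `$sPattern` variables), you can preview which files would be copied before running the copy itself:

```powershell
# Dry run: list the files that contain the string, without copying anything.
$sSrcDir  = "U:\Data\SCRIPTS"
$sPattern = "ffmpeg"

Get-ChildItem -Recurse -File -LiteralPath $sSrcDir |
  Select-String -Pattern $sPattern -SimpleMatch -List |   # -List stops at the first match per file
  Select-Object -ExpandProperty Path
```

If the preview looks right, the full script above will copy exactly those files.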

Edit: Thank you to mklement0. I updated the script to include his suggestions.

  • Nicely done; a few quibbles: use `mkdir -Force` to create a directory on demand _or_ use a preexisting one. No `;` is ever needed inside an empty `catch` or `finally` block; to ignore a terminating error, an empty `catch` block is sufficient, no need for a `finally` block too. To be safe, use `-LiteralPath` with literal (non-wildcard) paths with cmdlets such as `Get-ChildItem` and `Copy-Item`. – mklement0 Feb 26 '18 at 04:42