
So I'm trying to make a backup script that downloads a CSV from my MSSQL database, zips the file, then uploads the backup to Amazon S3.

The issue I'm having is that the table averages 20 million rows when I run the script daily, and it looks like the script just hangs until it completes some 20 minutes later. I was wondering if there is a way to show a progress bar for the Invoke-Sqlcmd call specifically. I've done some research, but all the examples I could find show how to build a progress bar around a loop, not around a single command's progress.

Here is my code:

ECHO "Starting Download"
    Import-Module sqlps
    #$SQLquery="SELECT * FROM dbo.$PREFIX$i"
    $SQLquery="SELECT * FROM dbo.events"
    ECHO "Executing query = $SQLquery"

    $hostname = "."
    $pass = "test"
    $usern = "test"
    $database = "theDB"

    $result=invoke-sqlcmd -ServerInstance $hostname  -query $SQLquery -HostName $hostname -Password $pass -Username $usern -Database $database -verbose


    #echo $result
    pause
    $result |export-csv -path $CSVPATH -notypeinformation
    pause
    ECHO "Starting Zip:"
    Compress-Archive -LiteralPath $CSVPATH -CompressionLevel Optimal -DestinationPath $ZIPPATH

    ECHO "Starting Delete: $CSVPATH "
    del "$CSVPATH" 
    echo "Removed $CSVNAME" 

    aws s3 cp $ZIPPATH s3://test_$ZIPNAME

    pause

This script works, but as I said, I would like to add a progress bar to the Invoke-Sqlcmd call so that it doesn't look frozen while it downloads the huge file.

This is what I could find so far, but it only tracks the progression of a loop:

$VerbosePreference = "Continue"
Write-Verbose "Test Message"

for ($a=1; $a -lt 100; $a++) {
  Write-Progress -Activity "Working..." -PercentComplete $a -CurrentOperation "$a% complete" -Status "Please wait."
  Start-Sleep -Milliseconds 100
}
  • How about `bcp` instead? It will print progress whilst extracting the data. – vonPryz Jul 26 '18 at 07:28
  • Or you could probably download this in batches: check the size, divide it into e.g. 10 parts, and download it in 10 iterations. Then you can Write-Progress based on how many iterations have finished, and you can resume the download if any of the smaller batches fails for any reason. – Tomek Jul 26 '18 at 07:56
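
For illustration, vonPryz's bcp suggestion might look something like this one-liner (a sketch reusing the throwaway credentials from the question; bcp prints a running "1000 rows successfully bulk-copied" counter while it extracts, though note that its character mode does not quote fields):

bcp theDB.dbo.events out $CSVPATH -c -t, -S . -U test -P test

And Tomek's batching idea could be sketched roughly as follows, assuming the table has a sortable key column (a hypothetical Id here) so that OFFSET/FETCH paging is stable:

$batchSize = 100000
# total row count, used to scale the progress bar
$total = (Invoke-Sqlcmd -ServerInstance $hostname -Username $usern -Password $pass `
    -Database $database -Query "SELECT COUNT(*) AS c FROM dbo.events").c
$fetched = 0
while ($fetched -lt $total) {
    # page through the table in fixed-size chunks
    $page = "SELECT * FROM dbo.events ORDER BY Id OFFSET $fetched ROWS FETCH NEXT $batchSize ROWS ONLY"
    Invoke-Sqlcmd -ServerInstance $hostname -Username $usern -Password $pass `
        -Database $database -Query $page |
        Export-Csv -Path $CSVPATH -NoTypeInformation -Append
    $fetched += $batchSize
    Write-Progress -Activity "Downloading dbo.events" `
        -Status "$([math]::Min($fetched, $total)) of $total rows" `
        -PercentComplete ([math]::Min(100, $fetched / $total * 100))
}

Be aware that OFFSET paging rescans the ordered set on every iteration, so later pages get progressively slower on a 20-million-row table; keyset paging (WHERE Id > $lastId) scales better if the key allows it.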

1 Answer


Considering your huge data set of roughly 20 million records, it's probably a good idea to use some of the .NET classes in the System.Data.Common namespace, which let you stream rows one at a time instead of buffering the entire result the way Invoke-Sqlcmd does. And I'm not sure how Export-Csv is implemented, but System.IO.StreamWriter is very efficient for writing large files.

A simple tested/working example with inline comments:

# replace $tableName with yours
$sqlCount = "SELECT COUNT(*) FROM dbo.$($tableName)";
$sqlSelect = "SELECT * FROM dbo.$($tableName)";
$provider = [System.Data.Common.DbProviderFactories]::GetFactory('System.Data.SqlClient');
$connection = $provider.CreateConnection();
# replace $connectionString with yours, e.g.:
# "Data Source=<instance-name>;Initial Catalog=<database-name>;Integrated Security=True;";
$connection.ConnectionString = $connectionString;
$command = $connection.CreateCommand();

# get total record count for Write-Progress
$command.CommandText = $sqlCount;
$connection.Open();
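# (ExecuteScalar() would also fetch this single value; the reader loop
# below sticks to the same API used for the main query)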
$reader = $command.ExecuteReader();
$totalRecords = 0;
while ($reader.Read()) {
    $totalRecords = $reader[0];
}
$reader.Dispose();

# select CSV data
$command.CommandText = $sqlSelect;
$reader = $command.ExecuteReader();

# get CSV field names
$columnNames = @();
for ($i = 0; $i -lt $reader.FieldCount; $i++) {
    $columnNames += $reader.GetName($i);
}

# read and populate data one row at a time
$values = New-Object object[] $columnNames.Length;
$currentCount = 0;
# replace $CSVPATH with yours
$writer = New-Object System.IO.StreamWriter($CSVPATH);
$writer.WriteLine(($columnNames -join ','));
while ($reader.Read()) {
    $null = $reader.GetValues($values);
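    # NOTE: a plain -join writes unquoted fields; values containing commas,
    # quotes, or line breaks will break the CSV layout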
    $writer.WriteLine(($values -join ','));
    if (++$currentCount % 1000 -eq 0) {
        Write-Progress -Activity 'Reading data' `
            -Status "Finished reading $currentCount out of $totalRecords records." `
            -PercentComplete ($currentCount / $totalRecords * 100);
    }
}
$reader.Dispose();
$command.Dispose();
$connection.Dispose();
$writer.Dispose();
kuujinbo