Currently, we're exporting CSV files from SQL Server using the Invoke-Sqlcmd and Export-Csv commands in PowerShell.

To upload to Redshift, we need to enclose the data in double quotes and convert it to UTF-8 encoding.

So we considered the combination of Invoke-Sqlcmd and Export-Csv the easiest and simplest to develop.

Here's the sample command:

Invoke-Sqlcmd -Username $StrUserName -Password $StrPassword `
              -Query $StrQry -ServerInstance $StrSvrName `
              -IgnoreProviderContext -QueryTimeout 3000 |
    Export-Csv -Path $StrExpFile -Encoding UTF8

There's no problem with normal-sized exports, but when the target table exceeds a million rows, this command can consume huge amounts of memory.

If you have any good ideas for reducing the memory cost, or any alternative way to export the data enclosed in double quotes with UTF-8 encoding, please advise me.

Thank you.

  • `Invoke-SqlCmd` populates a `DataTable` in memory before outputting all the data at once, which is probably the culprit here. Your best bet is to build the query manually and use a `DataReader` to enumerate the results one by one (see the sketch after this thread) – Mathias R. Jessen Sep 03 '20 at 21:12
  • 1
    Many thanks, @Mathias R. Jessen . I understand why it can consume huge amount of memory. We'll try to build the specific query for big tables as your advice. Thank you. – Sachiko Sep 04 '20 at 13:45
  • @Sachiko did you get anywhere with DataReader? I've got the same problem – codeulike Jun 22 '23 at 20:37
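
Following up on the `DataReader` suggestion in the comments, here is a minimal sketch of what a streaming export could look like. It reuses the variable names from the question ($StrUserName, $StrPassword, $StrSvrName, $StrQry, $StrExpFile); the connection string and the CSV quoting rules are assumptions that may need adjusting for your environment.

# Minimal sketch: stream rows with a DataReader instead of buffering a DataTable.
# Assumes SQL authentication and the same variables as in the question.
$ConnStr = "Server=$StrSvrName;User ID=$StrUserName;Password=$StrPassword;"
$Conn    = New-Object System.Data.SqlClient.SqlConnection $ConnStr
$Conn.Open()
try {
    $Cmd = $Conn.CreateCommand()
    $Cmd.CommandText    = $StrQry
    $Cmd.CommandTimeout = 3000
    $Reader = $Cmd.ExecuteReader()

    # UTF-8 writer; each row goes straight to disk, so memory use stays flat
    $Writer = New-Object System.IO.StreamWriter($StrExpFile, $false, [System.Text.Encoding]::UTF8)
    try {
        $LastCol = $Reader.FieldCount - 1

        # Header row, double-quoted like Export-Csv output
        $Header = (0..$LastCol | ForEach-Object { '"' + $Reader.GetName($_).Replace('"', '""') + '"' }) -join ','
        $Writer.WriteLine($Header)

        # Stream each data row; embedded quotes are doubled per the CSV convention
        while ($Reader.Read()) {
            $Fields = 0..$LastCol | ForEach-Object {
                $Value = if ($Reader.IsDBNull($_)) { '' } else { [string]$Reader.GetValue($_) }
                '"' + $Value.Replace('"', '""') + '"'
            }
            $Writer.WriteLine($Fields -join ',')
        }
    }
    finally { $Writer.Dispose() }
}
finally {
    $Conn.Dispose()   # closing the connection also closes the reader
}

Like Export-Csv -Encoding UTF8 in Windows PowerShell, the StreamWriter above emits UTF-8 with a BOM; if the Redshift load complains about the BOM, pass New-Object System.Text.UTF8Encoding $false as the encoding instead.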

0 Answers