Are you sure you want to hold the whole dataset in memory? If you just want to copy the result of a query into another table, it is better to use a data reader, so the rows are streamed and you avoid out-of-memory exceptions.
# cf. http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx
$SourceConnectionString = "Data Source=localhost;Initial Catalog=source_db;Integrated Security=True"
$DestinationConnectionString = "Data Source=localhost;Initial Catalog=Destination_db;Integrated Security=True"
$tableName = "MyTable"
$sql = "SELECT * FROM $tableName"

$sourceConnection = New-Object System.Data.SqlClient.SqlConnection($SourceConnectionString)
$sourceConnection.Open()

$commandSourceData = New-Object System.Data.SqlClient.SqlCommand($sql, $sourceConnection)
#$commandSourceData.CommandTimeout = 300

ps 'powershell_ise'   # check memory usage before the reader is created
$reader = $commandSourceData.ExecuteReader()
ps 'powershell_ise'   # check memory usage after the reader is created
try
{
    $bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($DestinationConnectionString)
    $bulkCopy.DestinationTableName = $tableName
    $bulkCopy.BatchSize = 5000
    $bulkCopy.BulkCopyTimeout = 0
    # WriteToServer streams rows from the reader; the result set is never fully materialized
    $bulkCopy.WriteToServer($reader)
}
catch
{
    $ex = $_.Exception
    Write-Host "Write-DataTable $($tableName): $($ex.Message)"
}
finally
{
    $reader.Close()
    $sourceConnection.Close()
}
Edit:
After reading Mike's comment that PowerShell possibly unrolls the data reader object, I retried my code, replacing
$sql = "select * FROM $tableName"
by
$sql = "select * FROM $tableName union all select * FROM $tableName union all select * FROM $tableName union all select * FROM $tableName union all select * FROM $tableName "
It still worked, and I got no out-of-memory exception in
$reader = $commandSourceData.ExecuteReader()
Until I observe problems, I have no reason to try Mike's variation.
2nd Edit:
I modified the code by adding
ps 'powershell_ise'
before and after
$reader = $commandSourceData.ExecuteReader()
I do not observe any change in memory usage, and I therefore conclude that Mike's assumption about PowerShell unrolling the data reader object does not apply.
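For reference, the `ps 'powershell_ise'` calls only show a rough process listing. A more direct check is to read the current process's working set before and after `ExecuteReader()`. This is just a sketch; it assumes the script runs inside the process you want to measure (use `Get-Process powershell_ise` instead of `-Id $PID` if you are measuring the ISE from elsewhere):

```
# Report this process's working set in MB; with a streamed data reader
# the value should stay roughly constant across ExecuteReader().
$proc = Get-Process -Id $PID
"Working set: {0:N1} MB" -f ($proc.WorkingSet64 / 1MB)
```

Comparing the two readings around `ExecuteReader()` gives a number rather than a glance at the `ps` output, which makes it easier to see whether the reader is really being unrolled.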