
I would like to extract a large dataset from a SQL Server database on one server and then load that dataset into a database on a different server. As a linked server is not an option, I have tried a PowerShell script, and would like something like:

$connection = New-Object System.Data.SqlClient.SqlConnection
$command = New-Object System.Data.SqlClient.SqlCommand
... etc for SqlDataAdapter as well
$adapter.Fill($dataset1)

$connection2 = ... (to a database on another server)
$command2 = ...
Load $dataset1 into $command2
$command2.Fill($dataset2)

How do I load dataset 1 into command 2?

Ole Lynge

2 Answers


Are you sure that you want to have the whole dataset in memory?

If you just want to copy the result of a query to another table, it is better to use a data reader to avoid out-of-memory exceptions.

# cf. http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx

$SourceConnectionString      = "Data Source=localhost;Initial Catalog=source_db;Integrated Security=True"
$DestinationConnectionString = "Data Source=localhost;Initial Catalog=Destination_db;Integrated Security=True"

$tableName = "MyTable"
$sql = "select * FROM $tableName"

$sourceConnection = New-Object System.Data.SqlClient.SqlConnection($SourceConnectionString)
$sourceConnection.open()
$commandSourceData  = New-Object system.Data.SqlClient.SqlCommand($sql,$sourceConnection)
#$commandSourceData.CommandTimeout = 300

ps 'powershell_ise'   # show memory usage before ExecuteReader
$reader = $commandSourceData.ExecuteReader()
ps 'powershell_ise'   # show memory usage after ExecuteReader


try
{
    $bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($DestinationConnectionString)
    $bulkCopy.DestinationTableName = $tableName
    $bulkCopy.BatchSize = 5000
    $bulkCopy.BulkCopyTimeout = 0
    $bulkCopy.WriteToServer($reader)
}
catch
{
    $ex = $_.Exception
    Write-Host "SqlBulkCopy error: $($ex.Message)"
}
finally
{
    $reader.Close()
    $sourceConnection.Close()
}

Edit:

After reading Mike's comment that PowerShell possibly unrolls the data reader object, I retried my code, replacing

$sql = "select * FROM $tableName"

by

$sql = "select * FROM $tableName union all select * FROM $tableName union all select * FROM $tableName union all select * FROM $tableName union all select * FROM $tableName "

It still worked, and I had no out-of-memory exception in

$reader = $commandSourceData.ExecuteReader()

Until I observe problems, I have no reason to try Mike's variation.

2nd Edit:

I modified the code by adding

ps 'powershell_ise'

before and after

$reader = $commandSourceData.ExecuteReader()

I do not observe any changes in memory usage, and therefore I conclude that Mike's assumption about PowerShell unrolling the data reader object doesn't apply.

bernd_k

Have you thought about the SqlBulkCopy class? Here's a PowerShell function that uses it to push data from one dataset to a different SQL connection.
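Since the linked page may not be available, here is a minimal sketch of what such a function might look like. This is not the original linked code; the function name and parameters are illustrative, and it simply wraps SqlBulkCopy.WriteToServer, which accepts either a DataTable or an IDataReader.

```powershell
# Hypothetical sketch -- not the original linked function.
function Write-DataTable {
    param(
        [string]$ConnectionString,    # destination connection string
        [string]$TableName,           # destination table name
        [System.Data.DataTable]$Data  # rows to copy
    )
    $bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($ConnectionString)
    try {
        $bulkCopy.DestinationTableName = $TableName
        $bulkCopy.BatchSize = 5000
        $bulkCopy.BulkCopyTimeout = 0
        # WriteToServer also accepts an IDataReader if you want to stream
        # instead of materializing a DataTable in memory.
        $bulkCopy.WriteToServer($Data)
    }
    finally {
        $bulkCopy.Close()
    }
}

# Usage, assuming $dataset1 was filled as in the question:
# Write-DataTable -ConnectionString $DestinationConnectionString `
#                 -TableName "MyTable" -Data $dataset1.Tables[0]
```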

Mike Shepard
  • Yes, SqlBulkCopy works great. The true challenge is the out-of-memory exceptions at $adapter.Fill($dataset1) when the table is really large. – bernd_k Feb 12 '11 at 16:28
  • If you pass a datareader, you don't have to pull the records down into a datatable. The SQLBulkCopy will "stream" them. In PowerShell, you will probably need to do something like this: $bulkCopy.WriteToServer($cmd.ExecuteReader()) i.e. don't call the ExecuteReader method earlier because PowerShell likes to unroll datareader objects. – Mike Shepard Feb 14 '11 at 03:01
  • @Mike Are you sure? I'm playing with a table of 400425 rows and can't see a difference between the two modes of invoking ExecuteReader. Using a DataTable, every third trial had an out of memory exception. – bernd_k Feb 14 '11 at 15:43
  • I've copied multi-million row tables with no memory issues (using my laptop). – Mike Shepard Feb 14 '11 at 15:51
  • @Mike If PowerShell unrolls the data reader, then I must see a difference using ps before and after $reader = $commandSourceData.ExecuteReader(). See the edit of my code. I get no difference. – bernd_k Feb 15 '11 at 13:01
  • I must have been dreaming when I did it the first time (it was months ago). I've recreated it without my changes and it didn't do much different in terms of memory. – Mike Shepard Feb 15 '11 at 17:36
  • The page linked no longer has that code, and this answer is no longer viable or useful. – DForck42 Aug 21 '18 at 16:00