15

Is it possible to use SqlBulkCopy with SQL Server Compact Edition, e.g. (*.sdf) files?

I know it works with SQL Server 2005 and up, but I wanted to check CE compatibility.

If it doesn't, does anyone know the fastest way of getting a CSV-type file into SQL Server CE without using DataSets (puke here)?

Jon Seigel
Mark H

4 Answers

24

SqlBulkCopy is not supported in SQL CE. Here is the fastest way if you have a huge number of rows in your table; plain INSERTs are too slow!

using (SqlCeConnection cn = new SqlCeConnection(yourConnectionString))
{
    if (cn.State == ConnectionState.Closed)
        cn.Open();

    using (SqlCeCommand cmd = new SqlCeCommand())
    {
        cmd.Connection = cn;
        cmd.CommandText = "YourTableName";
        cmd.CommandType = CommandType.TableDirect;

        using (SqlCeResultSet rs = cmd.ExecuteResultSet(ResultSetOptions.Updatable | ResultSetOptions.Scrollable))
        {
            SqlCeUpdatableRecord record = rs.CreateRecord();

            using (var sr = new System.IO.StreamReader(yourTextFilePath))
            {
                string line;
                while ((line = sr.ReadLine()) != null)
                {
                    string[] values = line.Split('\t');

                    // set one value per column ("NULL" in the file means a null field)
                    for (int col = 0; col < values.Length; col++)
                        record.SetValue(col, values[col] == "NULL" ? null : values[col]);

                    rs.Insert(record);
                }
            }
        }
    }
}

Benchmark: table with 34370 rows

  • with inserts: 38 rows written per second

  • this way: 260 rows written per second

Alessandro Annini
  • This method doesn't support Identity and rowversion columns types – Boris Lipschitz Sep 13 '11 at 04:57
  • 2
    Wow, was struggling with Insert and getting 20 - 30 rows a second, I'm sure this just did all 35,774 rows in less than 1 second (all my data was in an array in memory) so no bottle necks from the source. – Matt Feb 01 '12 at 16:13
  • I converted my code using SqlCeCommand and parameters to this and while not a 10x speed up it was twice as fast, thanks. – troutinator Apr 02 '12 at 19:13
  • My previous code was trying to insert using entity framework 5 (turned off change tracking and validation, using just one savechanges call at the end) and sqlce, after leaving it to run for an hour and a half and coming back to find it still in progress, I killed the app! Rewrote using the above for the insert portion and it now inserts the same 271,703 records in 27 seconds! – Shawson Apr 20 '13 at 23:45
  • Unless I'm not grokking this (definitely possible), this seems to be getting the values from the database (StreamReader reading from "yourConnectionString"), and then inserting them back into the database...??? – B. Clay Shannon-B. Crow Raven Nov 20 '13 at 16:38
  • Thanks B. Clay Shannon, that was an error; I edited it to use a generic text file as the source of data – Alessandro Annini Jan 14 '14 at 14:04
9

I have a SqlCeBulkCopy library here: http://sqlcebulkcopy.codeplex.com - it even supports IEnumerable.
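For reference, the library mirrors the familiar SqlBulkCopy API. A minimal sketch (the table name, connection string, and source DataTable are placeholders, and it assumes the SqlCeBulkCopy assembly is referenced):

using (var bc = new ErikEJ.SqlCe.SqlCeBulkCopy(ceConnectionString))
{
    bc.DestinationTableName = "YourTableName";
    // WriteToServer also has overloads for IDataReader and IEnumerable<T>
    bc.WriteToServer(sourceDataTable);
}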

ErikEJ
1

It is possible to speed up this kind of operation considerably. To make it both fast and reasonably safe, you can use a CE DataAdapter.

For example, ignoring keys for the moment, the steps listed below can help:

  1. Make sure that the source and target tables have the same field structure;
  2. Clone a virtual DataTable from a DataTable filled from the source database (your SELECT);
  3. Create a CE command with the table name as CommandText (TableDirect as CommandType);
  4. Create a CE DataAdapter from the CE command;
  5. Create a CE CommandBuilder from the CE DataAdapter;
  6. Pass the INSERT command from the CE CommandBuilder to the CE DataAdapter;
  7. Copy "n" batch rows from your source DataTable to the target DataTable (the clone), doing something like this:

    '... previous code
    For Each currentRow In sourceTable.Rows
       'You can RaiseEvent Processing(currentRow, totalRows) here with DoEvents
       If targetTable.Rows.Count < 100 Then
          targetTable.ImportRow(currentRow)
          targetTable.Rows(targetTable.Rows.Count - 1).SetAdded()
       Else
          '...Here you call the CE DataAdapter's Update method (da.Update(targetTable))
          '...and then be sure to clone targetTable again, erasing all previous rows.
          '...Clone it again; don't just do a Clear on the Rows collection.
          '...If you have an autoincrement column it will break all foreign keys.
          targetTable.ImportRow(currentRow) 'don't lose the current row
          targetTable.Rows(targetTable.Rows.Count - 1).SetAdded()
       End If
    Next
    '... next code
    

This way you can insert a large number of rows in very little time.
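In C#, steps 1-6 above might look like the following sketch ("Orders" and the connection strings are placeholders; it assumes the System.Data.SqlServerCe provider):

// hypothetical names: "Orders", sourceConnectionString, ceConnectionString
var sourceTable = new DataTable();
using (var sourceDa = new SqlDataAdapter("SELECT * FROM Orders", sourceConnectionString))
    sourceDa.Fill(sourceTable);                          // your SELECT (step 2)

DataTable targetTable = sourceTable.Clone();             // same structure, no rows (steps 1-2)

using (var cn = new SqlCeConnection(ceConnectionString))
using (var cmd = new SqlCeCommand("Orders", cn) { CommandType = CommandType.TableDirect }) // step 3
using (var da = new SqlCeDataAdapter(cmd))               // step 4
using (var cb = new SqlCeCommandBuilder(da))             // step 5
{
    da.InsertCommand = cb.GetInsertCommand();            // step 6
    // step 7: batch-import rows into targetTable, then call da.Update(targetTable)
}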

I have some applications using this method, and the average rate is about 1,500 rows per second in a table with 5 NTEXT fields (slow) and 800,000 rows.

Of course, it all depends on your table's structure. IMAGE and NTEXT are both slow data types.

P.S.: As I said, this method doesn't care much about keys, so be careful.

marc_s
0

No, I don't think that SqlBulkCopy is supported (see MSDN). Perhaps send the data in as XML and pull it apart at the server? SQL/XML support is pretty good in 2005/2008.

You might also want to look at table-valued parameters, but I doubt that CE supports these.

Marc Gravell