
I have a spreadsheet containing around 3000 records, and I need to insert all of this data into a new table. A batch insert seems like a good approach here.

So I tried a simple example:

<cfquery datasource="cse">
    insert into Names
    values
    <cfloop from="1" to="3000" index="i">
        ('#i#')
        <cfif i LT 3000>, </cfif>
    </cfloop>
</cfquery>

But since SQL Server 2008 only allows 1000 rows per INSERT ... VALUES statement, I am getting an error.

So how can I split this into separate batches of 999 records each and execute them all?

Aacini
Deepak Kumar Padhy

2 Answers

6

You can use a BULK INSERT statement, which should cope with extremely large datasets.

The data will need to be in a CSV file, and you'll need a variable holding the file's location.

<cfquery datasource="cse">
    BULK INSERT Names
    FROM '#variables.sCSVLocation#'
</cfquery>

If you have a reason not to use BULK INSERT and want to break it down into batches of 999, then you would have to work out how many records are in the dataset and divide that by 999 to get the number of times you'd have to loop over it.
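If you go the batching route, a sketch of that loop in CFML might look like the following. This assumes the values are already in an array (named `names` here for illustration; the array, the `[colName]` column, and the `cf_sql_varchar` type are placeholders you'd adapt to your actual spreadsheet data):

```cfm
<!--- Sketch: issue one INSERT per batch of up to 999 rows. --->
<!--- Assumes "names" is an array of values already read from the spreadsheet. --->
<cfset batchSize = 999>
<cfloop from="1" to="#arrayLen(names)#" step="#batchSize#" index="start">
    <!--- Last batch may be shorter than batchSize. --->
    <cfset end = min(start + batchSize - 1, arrayLen(names))>
    <cfquery datasource="cse">
        INSERT INTO Names ([colName])
        VALUES
        <cfloop from="#start#" to="#end#" index="i">
            (<cfqueryparam value="#names[i]#" cfsqltype="cf_sql_varchar">)
            <cfif i LT end>,</cfif>
        </cfloop>
    </cfquery>
</cfloop>
```

Using `cfqueryparam` here also parameterizes the values rather than interpolating them into the SQL string, which is generally safer.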

Peter Boughton
Jarede
  • Yes, bulk insert would be a good way, but can't I do it by creating 3 batches of 999 each? – Deepak Kumar Padhy Mar 10 '14 at 13:25
  • 1
    Re-read the last paragraph. It explains how to approach it. Though, BULK INSERT is generally more scalable. See also [importing from a spreadsheet](http://stackoverflow.com/questions/21732069/how-do-i-import-an-excel-spreadsheet-into-sql-server-2008r2-database). – Leigh Mar 10 '14 at 13:33
0
<cfquery datasource="cse">
    <cfloop from="1" to="3000" index="i">
        <cfif ((i MOD 1000) EQ 1)><!--- Each SQL INSERT can only handle 1000 rows of data --->
        INSERT INTO Names
        (
            [colName]
        )
        VALUES
        </cfif>
        (
            '#i#'
        )
        <cfif (i LT 3000)><cfif ((i MOD 1000) NEQ 0)>,</cfif>#CHR(13)##CHR(10)#</cfif>
    </cfloop>
</cfquery>
  • You can break the data into multiple bulk inserts; CF still sends a single request to the db. BULK INSERT isn't always the answer: if CF and the database are on different machines/networks, referencing a data file becomes more difficult or impossible. – Gavin Dority Dec 11 '18 at 03:11