How to make batch insert with ColdFusion having more than 1000 records?

Submitted by 点点圈 on 2019-12-02 03:22:40

Question


I have a spreadsheet containing around 3000 records, and I need to insert all of that data into a new table. A batch insert seems like a good fit here.

So I tried a simple example:

    <cfquery datasource="cse">
        insert into Names
        values
        <cfloop from="1" to="3000" index="i">
            ('#i#')
            <cfif i LT 3000>, </cfif>
        </cfloop>
    </cfquery>

But since SQL Server 2008 only allows 1000 rows per INSERT statement, I am getting an error.

So how can I build separate batches of 999 records each and execute them one at a time?


Answer 1:


You can use a BULK INSERT statement, which can cope with extremely large datasets.

The data will need to be in a CSV file, and you'll have to create a variable holding the file location.

  <cfquery datasource="cse">
    BULK INSERT Names
    FROM '#variables.sCSVLocation#'
  </cfquery>
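If the CSV's delimiters don't match the defaults, BULK INSERT accepts a WITH clause. A minimal sketch, assuming a comma-delimited file with a header row (FIELDTERMINATOR, ROWTERMINATOR, and FIRSTROW are standard SQL Server options; the path variable is the one from above):

    <cfquery datasource="cse">
        BULK INSERT Names
        FROM '#variables.sCSVLocation#'
        WITH (
            FIELDTERMINATOR = ',',  -- column delimiter
            ROWTERMINATOR = '\n',   -- row delimiter
            FIRSTROW = 2            -- skip the header row
        )
    </cfquery>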

If you have a reason not to use BULK INSERT and want to break the work into loops of 999, then you would have to work out how many records are in the dataset, divide by 999 to get the number of batches, and issue one INSERT per batch.
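That batching idea can be sketched in CFML as follows. The source array arrNames and the column [colName] are assumptions for illustration; ceiling(), min(), and arrayLen() are standard CFML functions:

    <!--- A sketch of batched inserts: SQL Server 2008 caps the VALUES
          row constructor at 1000 rows, so insert 999 rows per query --->
    <cfset batchSize = 999>
    <cfset total = arrayLen(arrNames)>
    <cfloop from="1" to="#ceiling(total / batchSize)#" index="batch">
        <cfset startRow = ((batch - 1) * batchSize) + 1>
        <cfset endRow = min(batch * batchSize, total)>
        <cfquery datasource="cse">
            INSERT INTO Names ([colName])
            VALUES
            <cfloop from="#startRow#" to="#endRow#" index="i">
                ('#arrNames[i]#')<cfif i LT endRow>,</cfif>
            </cfloop>
        </cfquery>
    </cfloop>

Each pass through the outer loop issues one INSERT covering at most 999 rows, so no single statement exceeds the 1000-row limit.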




Answer 2:


    <cfquery datasource="cse">
        <cfloop from="1" to="3000" index="i">
            <!--- Each SQL INSERT can only handle 1000 rows of data,
                  so start a fresh INSERT statement every 1000 rows --->
            <cfif (i MOD 1000) EQ 1>
                INSERT INTO Names
                (
                    [colName]
                )
                VALUES
            </cfif>
            (
                '#i#'
            )
            <cfif i LT 3000><cfif (i MOD 1000) NEQ 0>,</cfif>#CHR(13)##CHR(10)#</cfif>
        </cfloop>
    </cfquery>


Source: https://stackoverflow.com/questions/22300600/how-to-make-batch-insert-with-coldfusion-having-more-than-1000-records
