How do I do a batch insert in ColdFusion with more than 1000 records?

Asked by 半世苍凉 on 2019-12-02 01:18:32
Jarede

You can use a BULK INSERT statement, which copes well with extremely large datasets.

The data will need to be in a CSV file, and you'll need a variable holding the file's location on the database server.

  <cfquery datasource="cse">
    BULK INSERT Names
    FROM '#variables.sCSVLocation#'
  </cfquery>
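A slightly fuller sketch of the same approach, with the common WITH options spelled out. The file path, table name, and CSV layout here are assumptions for illustration; note that BULK INSERT reads the file from the database server's filesystem, not the ColdFusion server's, and the path cannot be passed with cfqueryparam:

  <cfset variables.sCSVLocation = "C:\data\names.csv"><!--- hypothetical path, must be readable by SQL Server --->
  <cfquery datasource="cse">
    BULK INSERT Names
    FROM '#variables.sCSVLocation#'
    WITH (
        FIELDTERMINATOR = ',',  <!--- column delimiter in the CSV --->
        ROWTERMINATOR = '\n',   <!--- line ending between records --->
        FIRSTROW = 2            <!--- skip a header row, if the CSV has one --->
    )
  </cfquery>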

If you have a reason not to use BULK INSERT and want to break the insert into batches instead, remember that a single SQL Server INSERT ... VALUES statement accepts at most 1000 rows. Work out how many records are in the dataset and divide by 1000 to get the number of INSERT statements you need. The loop below simulates 3000 records, starting a new INSERT every 1000 rows:

<cfquery datasource="cse">
    <cfloop from="1" to="3000" index="i">
        <!--- Start a new INSERT every 1000 rows; SQL Server caps INSERT ... VALUES at 1000 rows per statement --->
        <cfif ((i MOD 1000) EQ 1)>
        INSERT INTO Names
        (
            [colName]
        )
        VALUES
        </cfif>
        (
            '#i#'
        )
        <!--- Comma between rows, but not after the last row of a batch --->
        <cfif (i LT 3000)><cfif ((i MOD 1000) NEQ 0)>,</cfif>#CHR(13)##CHR(10)#</cfif>
    </cfloop>
</cfquery>