Is there a solution to this cfqueryparam memory leak?

Submitted by 馋奶兔 on 2019-12-04 12:59:58

Do you have debugging on in Administrator?

If so, even if you've got showdebugoutput="false", CF will be keeping debug information about all of those queries, and with that many queries, the debugging information could quickly build up.


Also, if you've really got 80,000 rows to insert, you probably want to be doing this a different way - e.g. generating an import script that runs directly against the DB, (without CF/JDBC getting in the way).

Maybe a multi-row INSERT can help? The technique typically runs faster, and issuing fewer individual queries means less per-query overhead accumulating in memory.

Yes, I've seen your note about "inserting an unknown number of values", but this should still work as long as each inserted batch has a constant number of fields/values.
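As a rough sketch, a multi-row INSERT in CFML could look like this (the "users" table, its columns, "dsn", and the "rows" array are all hypothetical stand-ins):

```cfml
<!--- Sketch of a multi-row INSERT: one statement inserts the whole
      batch instead of issuing one query per row. Table, columns,
      "dsn", and "rows" are hypothetical. --->
<cfquery datasource="#dsn#">
    INSERT INTO users (firstname, lastname)
    VALUES
    <cfloop from="1" to="#ArrayLen(rows)#" index="j">
        <cfif j GT 1>,</cfif>
        ( <cfqueryparam cfsqltype="cf_sql_varchar" value="#rows[j].firstname#">
        , <cfqueryparam cfsqltype="cf_sql_varchar" value="#rows[j].lastname#"> )
    </cfloop>
</cfquery>
```

Note that databases cap the number of bind parameters per statement (SQL Server at 2100), so very large batches still need to be split into chunks.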

No idea if it will make a difference, but it's something to try: shrink the in-function loop, and call the function multiple times instead.

What this does with memory might help narrow down where it is being used up.

<cffunction name="funcTest" output="false">
    <cfargument name="from" />
    <cfargument name="to" />
    <cfset var i = 0>
    <cfset var testq = "">
    <cfloop from="#arguments.from#" to="#arguments.to#" index="i">
        <cfquery name="testq" datasource="#dsn#">
            ...
        </cfquery>
    </cfloop>
</cffunction>


<cfset BlockSize = 100 />
<cfloop index="CurBlock" from="1" to="#Ceiling(InsertCount/BlockSize)#">

    <cfset funcTest
        ( from = (CurBlock-1)*BlockSize + 1
        , to   = Min(CurBlock*BlockSize, InsertCount)
        )/>

</cfloop>

I encountered a similar problem.

http://misterdai.wordpress.com/2009/06/24/when-not-to-use-cfqueryparam/

The approach depends on a few things. If you can trust the data, drop the cfqueryparams entirely; that reduces memory usage a lot. From there, minimize the SQL as much as possible. I was doing quite a bit of DB work per row, so I created a stored procedure instead. The big win in fighting memory usage was buffering the SQL calls to the database: create an array, append your SQL to it, then every 50 rows (a personal choice, arrived at by testing) do an ArrayToList on the array inside a cfquery tag. This reduces the database traffic to fewer, larger calls instead of many small ones.
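A minimal sketch of that buffering idea, assuming a hypothetical "users" table and a trusted "rows" array (no cfqueryparam, so the values must be safe or escaped by you):

```cfml
<!--- Buffer plain-SQL INSERTs in an array and flush every 50 rows
      in a single cfquery call. PreserveSingleQuotes() stops CF from
      escaping the quotes inside the generated SQL. All names here
      are hypothetical. --->
<cfset buffer = ArrayNew(1)>
<cfloop from="1" to="#ArrayLen(rows)#" index="i">
    <cfset ArrayAppend(buffer,
        "INSERT INTO users (name) VALUES ('#rows[i].name#')")>
    <cfif ArrayLen(buffer) GTE 50 OR i EQ ArrayLen(rows)>
        <cfquery datasource="#dsn#">
            #PreserveSingleQuotes(ArrayToList(buffer, "; "))#
        </cfquery>
        <cfset buffer = ArrayNew(1)>
    </cfif>
</cfloop>
```

Depending on your driver, the semicolon-separated batch may require the datasource to allow multiple statements per query.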

After all of that, things worked for me. But I still think ColdFusion really isn't up to this type of task; it's more the domain of the database server itself, if possible.

My first guess would be to explicitly type the values in your cfqueryparams, as in cfsqltype="CF_SQL_CHAR". Why would this help? I'm not sure, but I'd guess there is additional overhead with an untyped parameter.
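For reference, explicit typing is just the cfsqltype attribute; a hedged sketch (the table, columns, and form fields are made up):

```cfml
<!--- Explicitly typed cfqueryparams; "users" and the form fields
      are hypothetical examples. --->
<cfquery datasource="#dsn#">
    INSERT INTO users (name, age)
    VALUES ( <cfqueryparam cfsqltype="cf_sql_char" value="#form.name#">
           , <cfqueryparam cfsqltype="cf_sql_integer" value="#form.age#"> )
</cfquery>
```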

Assuming you are using CF8... not sure if this happens in CF7...

Try turning off "Max Pooled Statements" (set it to zero) in your datasource "advanced settings"... I bet money your memory leak goes away...

That is where I have found the bug to be... this was causing all kinds of crashes on some CF servers until we found this... we are 100% more stable now because of this...

Patrick Steil

Try prepending "variables." to each query name inside your cffunctions. I've had a similar issue and this fixed it.

So change:

<cfquery name="testq" datasource="CongressPlus">

to

<cfquery name="variables.testq" datasource="CongressPlus">

Cheers,

Thomas

It's been well documented all over the community that CF will not release memory until the request has finished. Even calling the GC directly has no effect on freeing memory during a running request. I don't know whether this is by design or a bug.

I haven't a clue why you would even want to do something like this in CF anyway. There is no reason to be inserting 80K rows into a database using CF, no matter which database engine you're using.

Now, if there is a reason you need to do this, such as receiving the data in an uploaded CSV or XML file, MSSQL has far better ways and workarounds for handling it.

One approach that I have done over the years is to create a stored procedure in MSSQL that calls BCP or BULK INSERT to read a file that contains the data to insert.

The best thing about this approach is that the only thing CF does is handle the file upload; MSSQL does all the work of processing the file. MSSQL has no problem inserting millions of rows using BCP or BULK INSERT and will be vastly faster than anything CF can process.
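As a sketch of that division of labour, assuming CF and SQL Server can both reach a shared path (the path, table, and form field names are all hypothetical):

```cfml
<!--- CF only saves the uploaded file; SQL Server does the insert.
      BULK INSERT resolves the path from the SQL Server machine, so
      the upload destination must be reachable by that server. --->
<cffile action="upload"
        fileField="csvFile"
        destination="\\shared\imports\"
        nameConflict="makeUnique"
        result="upload">
<cfquery datasource="#dsn#">
    BULK INSERT users_staging
    FROM '\\shared\imports\#upload.serverFile#'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)
</cfquery>
```

A stored procedure wrapping the BULK INSERT (or a BCP call) works the same way and keeps the path logic on the database side.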

The way to prevent memory leaks from cfqueryparam in a large loop of queries was to not use cfqueryparam. The broader answer, though, for avoiding CF's inefficiencies and memory leaks, is to not use CF in these situations at all. I got this particular process to an acceptable level for the load at the time, but in the long run I will be rewriting it in another language, probably C# running directly against the database engine.

I have no idea if this will fix your problem, but what I usually do when I have multiple inserts like this is loop over the SQL statement itself instead of the entire cfquery.

So instead of having :

<cfloop from="1" to="#insertCount#" index="i">
    <cfquery name="testq" datasource="#dsn#">
        ...
    </cfquery>
</cfloop>

I do :

<cfquery name="testq" datasource="#dsn#">
    <cfloop from="1" to="#insertCount#" index="i">
        ...
    </cfloop>
</cfquery>

So instead of making multiple calls to the database, you make only one big one.

I have no idea how this would affect your memory leak problem, but I never experienced any memory leaks doing it that way.
