Out of memory exception in my code


Basically, I think you are barking up the wrong tree:

  • The JVM / GC will manage to deallocate unreachable objects, no matter how fast you allocate them. If you are running the classic non-concurrent GC, then the JVM will simply stop doing other things until the GC has deallocated memory. If you have configured your JVM to use a concurrent GC, it will try to run the GC and normal worker threads at the same time ... and revert to "stop everything and collect" behaviour if it cannot keep up.

  • If you are running out of memory, it is because something in your application (or the libraries / drivers it is using) is leaking memory. In other words, something is causing objects to remain reachable even though your application doesn't need them any more (see the sketch below).
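
As a minimal sketch of that kind of leak (not your actual code; the class and method names are made up for illustration), a collection reachable from a static field keeps everything you put into it alive forever:

    import java.util.ArrayList;
    import java.util.List;

    public class LeakExample {
        // Reachable from a static field for the lifetime of the class,
        // so the GC can never reclaim anything added to it.
        private static final List<byte[]> CACHE = new ArrayList<>();

        public static void handleRequest() {
            CACHE.add(new byte[1024 * 1024]); // 1 MB per call, never removed
        }
    }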

As comments have pointed out, you need to address this problem methodically using a memory profiler / heap dump. Randomly changing things or blaming it on the GC is highly unlikely to fix the problem.

(When you say "... I did use stmt.close() all the time", I assume that this means that your code looks something like this:

    PreparedStatement stmt = ... 
    try {
        stmt.execute();
        // ...
    } finally {
        stmt.close();
    }

If you don't put the close call in a finally then it is possible that you are NOT calling close every time. In particular, if some exception gets thrown during the execute call or between it and the close call, then it is possible that close will not get called ... and that will result in a leak.)
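
If you are on Java 7 or later, a try-with-resources block gives you the same guarantee with less ceremony. A minimal sketch, assuming conn and sql already exist in your code:

    try (PreparedStatement stmt = conn.prepareStatement(sql)) {
        try (ResultSet rs = stmt.executeQuery()) {
            while (rs.next()) {
                // ... process each row ...
            }
        }
    }
    // Both stmt and rs are closed automatically, even if an exception is thrown.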

I think you should add

    stmt.close();

so the memory allocated to the PreparedStatement will be freed.

If there is a leak, either in your code or in a library, the Eclipse Memory Analyzer (MAT) is a free Eclipse-based tool for digging into Java heap dump files. The instructions also explain how to get the JVM to produce the dump file for you. http://www.eclipse.org/mat/
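
If you want the JVM to write that dump file automatically when the OutOfMemoryError happens, the usual HotSpot flags are (the path and jar name here are just placeholders):

    java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/heapdump.hprof -jar yourapp.jar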

This error comes from the OracleConnection's native memory. For NIO operations, the Oracle JDBC developers decided to use the native (off-heap) part of memory. Most probably, executing this query too frequently causes your application to dump. To get rid of this, you can increase the JDBC statement cache size or restart your application at regular intervals.
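
For reference, turning on the driver's implicit statement cache looks roughly like this. This is only a sketch based on the oracle.jdbc.OracleConnection API; conn is assumed to be your existing connection, and you should check the documentation for your driver version:

    // Unwrap the vendor-specific connection from the plain JDBC one.
    oracle.jdbc.OracleConnection oraConn =
            conn.unwrap(oracle.jdbc.OracleConnection.class);
    oraConn.setImplicitCachingEnabled(true); // let the driver reuse prepared statements
    oraConn.setStatementCacheSize(50);       // how many statements to keep cached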

    java -Xms2g -Xmx3 -jar XX.jar
    Error occurred during initialization of VM
    Incompatible minimum and maximum heap sizes specified

Try

    java -Xms2g -Xmx3g -jar XX.jar

(Without a unit suffix, -Xmx3 is interpreted as 3 bytes, which is less than the 2 GB minimum heap, hence the error.)
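
You can check what heap sizes the JVM actually ended up with by adding -XX:+PrintFlagsFinal (assuming a HotSpot JVM and a Unix-like shell):

    java -Xms2g -Xmx3g -XX:+PrintFlagsFinal -version | grep -i heapsize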

How much memory do you have on your box? Are you running a 32-bit or 64-bit JVM?

Edit: seems that it may be a known Oracle driver issue: http://www.theserverside.com/discussions/thread.tss?thread_id=10218


Just a long shot: I know you are doing plain JDBC here, but if you happen to have any enhancers (AspectJ, Hibernate, JPA) there is a (slight) chance of a PermGen leak; set -XX:MaxPermSize=256m just to be on the safe side.
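
For example (the jar name is a placeholder; PermGen only exists on Java 7 and earlier, it was removed in Java 8):

    java -XX:MaxPermSize=256m -jar yourapp.jar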

Also, the jvisualvm memory profiler and JProfiler (you can use the trial version) will pinpoint it faster.
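
For example, with the JDK tools on your PATH you can locate your process and attach VisualVM to it (the jps output will obviously differ for your application):

    jps -l          # lists running Java processes and their PIDs
    jvisualvm       # start VisualVM, then attach to your application's PID in the GUI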
