JVM deep memory size of an object [duplicate]

Posted by 不羁的心 on 2019-12-10 23:45:28

Question


As far as I know, the well-known Instrumentation approach in Java is unable to correctly calculate the deep size of an object.

Is there a reliable way to compute on the JVM the correct deep size of an object?

The use case I'm thinking about is a fixed (or upper bounded) memory size data structure, i.e. a cache.

Note: as far as possible, I would like an enterprise-ready solution, i.e. either a "standard" coding practice or a well-tested library.


Answer 1:


I know the well-known Instrumentation approach in Java is unable to correctly calculate the deep size of an object.

With Instrumentation alone, no.

Instrumentation combined with knowledge of how a particular JVM lays out memory will give you the number of bytes used. It won't tell you how other JVMs might behave, and it doesn't tell you how much of that data is shared.
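To illustrate the combination, here is a minimal sketch of a reflective walk over an object graph that sums a per-object shallow size supplied by the caller (in a real setup that would be `Instrumentation.getObjectSize` obtained from a Java agent). The `DeepSize` class name is hypothetical, and the identity-based visited set is what makes shared references count only once:

```java
import java.lang.reflect.Array;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.IdentityHashMap;
import java.util.function.ToLongFunction;

// Sketch: walk the object graph reachable from a root, summing a shallow
// size per object. The shallowSize function is pluggable; with a Java agent
// you would pass Instrumentation::getObjectSize.
final class DeepSize {
    static long of(Object root, ToLongFunction<Object> shallowSize) {
        IdentityHashMap<Object, Boolean> seen = new IdentityHashMap<>();
        Deque<Object> stack = new ArrayDeque<>();
        if (root != null) stack.push(root);
        long total = 0;
        while (!stack.isEmpty()) {
            Object obj = stack.pop();
            if (seen.put(obj, Boolean.TRUE) != null) continue; // already counted
            total += shallowSize.applyAsLong(obj);
            Class<?> cls = obj.getClass();
            if (cls.isArray()) {
                // Primitive array contents are part of the shallow size;
                // only follow reference elements.
                if (!cls.getComponentType().isPrimitive()) {
                    for (int i = 0; i < Array.getLength(obj); i++) {
                        Object element = Array.get(obj, i);
                        if (element != null) stack.push(element);
                    }
                }
                continue;
            }
            // Follow all non-static reference fields, including inherited ones.
            for (Class<?> c = cls; c != null; c = c.getSuperclass()) {
                for (Field f : c.getDeclaredFields()) {
                    if (Modifier.isStatic(f.getModifiers())
                            || f.getType().isPrimitive()) continue;
                    try {
                        f.setAccessible(true);
                        Object value = f.get(obj);
                        if (value != null) stack.push(value);
                    } catch (ReflectiveOperationException | RuntimeException e) {
                        // Skip fields blocked by the module system (Java 9+);
                        // this is one reason the result is only an estimate.
                    }
                }
            }
        }
        return total;
    }
}
```

Note that this still inherits the caveats from the answer: the shallow-size function is JVM-specific, and reflective access to JDK internals may be denied on modern JVMs, so the result is an estimate for one particular JVM, not a portable truth.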

Is there a reliable way to compute on the JVM the correct deep size of an object?

I use a profiler, but unless you trust the tools you use, you can never know for sure.

The use case I'm thinking about is a fixed (or upper bounded) memory size data structure, i.e. a cache.

How much slower are you willing to make your cache in exchange for precise memory accounting? If it is 10x or 100x slower but has very accurate usage figures, is that really better than something that just counts the number of elements?

so either a "standard" coding practice or a well tested library

In that case, use the element count. You can use a LinkedHashMap (overriding removeEldestEntry) or Ehcache for this.
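The LinkedHashMap route can be sketched as follows: constructing the map in access order and overriding `removeEldestEntry` gives an LRU cache bounded by element count rather than bytes. The `BoundedCache` name and the `maxEntries` parameter are illustrative, not from the answer:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Element-count-bounded LRU cache: once size() exceeds maxEntries,
// LinkedHashMap evicts its least-recently-accessed entry automatically.
class BoundedCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    BoundedCache(int maxEntries) {
        // accessOrder = true: iteration order is least- to most-recently used.
        super(16, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Returning true tells LinkedHashMap to drop the eldest entry
        // after each insertion that pushes us over the bound.
        return size() > maxEntries;
    }
}
```

This bounds the number of entries, not their memory footprint; if entries vary widely in size, a per-entry weight (as Ehcache and similar libraries support) is the usual compromise between raw counting and full deep-size measurement.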



Source: https://stackoverflow.com/questions/21535189/jvm-deep-memory-size-of-an-object
