out-of-memory

What is the best way to prevent out of memory (OOM) freezes on Linux?

Submitted by 纵然是瞬间 on 2019-12-03 09:10:53
Question: Is there a way to make the OOM killer work and prevent Linux from freezing? I've been running Java and C# applications, where any memory allocated is usually used, and (if I'm understanding them right) overcommits are causing the machine to freeze. Right now, as a temporary solution, I added

    vm.overcommit_memory = 2
    vm.overcommit_ratio = 10

to /etc/sysctl.conf. Kudos to anyone who can explain why the existing OOM killer can't function correctly in a guaranteed manner, killing processes
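For reference, the asker's workaround as a sysctl fragment. Setting vm.overcommit_memory = 2 disables heuristic overcommit, so oversized allocations fail up front (malloc returns NULL) instead of succeeding and later triggering the OOM killer. The ratio value below is the asker's own choice and is workload-specific, not a general recommendation:

```
# /etc/sysctl.conf
# Strict accounting: commit limit = swap + overcommit_ratio% of RAM.
# Allocations beyond the limit fail immediately rather than freezing
# the box under memory pressure later.
vm.overcommit_memory = 2
vm.overcommit_ratio = 10
```

Apply without rebooting via `sysctl -p`.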

Why so much memory?

Submitted by 隐身守侯 on 2019-12-03 08:52:41
I have a 1000x1500 pixel bitmap of which I want to make a mutable copy in Android. When I run the following code...

    // int width = original.getWidth();   // 1000px
    // int height = original.getHeight(); // 1500px
    final Bitmap result = original.copy(original.getConfig(), true);
    original.recycle();

...I get an OutOfMemoryError on the copy line:

    java.lang.OutOfMemoryError: bitmap size exceeds VM budget
    ERROR/GraphicsJNI(419): VM won't let us allocate 6000000 bytes

Why does the copy instruction need 6MB (!) for a 1000x1500 pixel bitmap? How can I create a mutable bitmap from a non-mutable one in more
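The 6,000,000-byte figure is consistent with ARGB_8888 storage, which uses 4 bytes per pixel; copy() needs a second full-size pixel buffer on top of the original, so the peak is roughly double that. A quick sketch of the arithmetic (class and method names are illustrative, not Android API):

```java
public class BitmapMemory {
    // Bytes needed for an uncompressed bitmap: width * height * bytesPerPixel.
    // ARGB_8888, the usual Bitmap.Config, stores 4 bytes per pixel.
    static long bitmapBytes(int width, int height, int bytesPerPixel) {
        return (long) width * height * bytesPerPixel;
    }

    public static void main(String[] args) {
        // 1000 x 1500 px at 4 bytes/px -- exactly the allocation
        // the error message reports.
        System.out.println(bitmapBytes(1000, 1500, 4)); // prints 6000000
    }
}
```

Decoding with an RGB_565 config (2 bytes per pixel) would halve that figure, at the cost of color depth.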

Jasper Reports OutOfMemoryError on export

Submitted by 耗尽温柔 on 2019-12-03 08:48:35
I have written a web app for managing and running Jasper reports. Lately I've been working with some reports that generate extremely large (1500+ page) outputs, and attempting to resolve the resultant memory issues. I have discovered the JRFileVirtualizer, which has allowed me to run the report successfully with a very limited memory footprint. However, one of the features of my application is that it stores output files from previously run reports, and allows them to be exported to various formats (PDF, CSV, etc.). Therefore, I find myself in the situation of having a 500+MB .jrprint file

decode large base64 from xml in java: OutOfMemory

Submitted by 孤人 on 2019-12-03 08:46:43
I need to write a base64 encoded element of an XML file into a separate file. Problem: the file could easily reach a size of 100 MB. Every solution I tried ended with "java.lang.OutOfMemoryError: Java heap space". The problem is not reading the XML in general or the decoding process, but the size of the base64 block. I used JDOM, dom4j, and XMLStreamReader to access the XML file. However, as soon as I want to access the base64 content of the respective element, I get the mentioned error. I also tried an XSLT using Saxon's base64Binary-to-octets function, but of course with the same result.
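One way to avoid buffering the whole payload is java.util.Base64's stream-wrapping decoder, which decodes a small buffer at a time. This sketch covers only the decoding half; you would still need to feed it the element's characters incrementally (e.g. chunked reads via XMLStreamReader.getTextCharacters) rather than materializing the full text node. Class and method names here are illustrative:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Base64;

public class StreamingBase64 {
    // Decode base64 from an InputStream straight to an OutputStream without
    // ever holding the whole payload in memory.
    static void decodeStream(InputStream base64In, OutputStream out) throws IOException {
        // The MIME decoder tolerates the line breaks typically found
        // inside large XML text nodes.
        try (InputStream decoded = Base64.getMimeDecoder().wrap(base64In)) {
            decoded.transferTo(out);
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] original = "payload that could be 100 MB in practice".getBytes("UTF-8");
        byte[] encoded = Base64.getEncoder().encode(original);

        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        decodeStream(new ByteArrayInputStream(encoded), sink);
        System.out.println(new String(sink.toByteArray(), "UTF-8"));
    }
}
```

In the real application the sink would be a FileOutputStream, so the decoded bytes go to disk as they arrive.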

Why do I get an OutOfMemoryError when inserting 50,000 objects into HashMap?

Submitted by 点点圈 on 2019-12-03 07:22:37
Question: I am trying to insert about 50,000 objects (and therefore 50,000 keys) into a java.util.HashMap<java.awt.Point, Segment>. However, I keep getting an OutOfMemoryError. (Segment is my own class - very lightweight - one String field and 3 int fields.)

    Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
        at java.util.HashMap.resize(HashMap.java:508)
        at java.util.HashMap.addEntry(HashMap.java:799)
        at java.util.HashMap.put(HashMap.java:431)
        at bus.tools.UpdateMap
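The stack trace dies inside HashMap.resize(), which briefly keeps the old and new bucket tables reachable at the same time; pre-sizing the map avoids that spike (though for 50,000 small entries the usual root cause is simply a heap that is too small, fixed by raising -Xmx). A sketch, with String standing in for the asker's Segment class:

```java
import java.awt.Point;
import java.util.HashMap;
import java.util.Map;

public class PresizedMap {
    // Choose the capacity up front (past n / loadFactor) so the map never
    // has to resize while it is being filled.
    static Map<Point, String> build(int n) {
        Map<Point, String> map = new HashMap<>((int) (n / 0.75f) + 1);
        for (int i = 0; i < n; i++) {
            // 50,000 distinct Points; the value stands in for Segment
            map.put(new Point(i % 1000, i / 1000), "segment-" + i);
        }
        return map;
    }

    public static void main(String[] args) {
        System.out.println(build(50_000).size()); // prints 50000
    }
}
```

Note also that each java.awt.Point key is a full object; if memory stays tight, packing the two coordinates into a single long key is a common slimming trick.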

Using rest-client to download a file to disk without loading it all in memory first

Submitted by 拜拜、爱过 on 2019-12-03 07:05:24
I am using rest-client to download a large page (around 1.5 GB in size). The retrieved value is stored in memory and then saved into a file. As a result my program crashes with failed to allocate memory (NoMemoryError). But it is not necessary to keep this data in memory; it could even be saved directly to disk. I found "You can: (...) manually handle the response (e.g. to operate on it as a stream rather than reading it all into memory) See RestClient::Request's documentation for more information." on https://github.com/rest-client/rest-client Unfortunately after reading http://www.rubydoc.info/gems/rest

View Pager with Universal Image Loader Out of Memory Error

Submitted by 老子叫甜甜 on 2019-12-03 06:10:09
Question: I am not really sure whether a ViewPager with Universal Image Loader can/should be used as an alternative to a gallery-like interface, since I have run into an Out of Memory error while loading images from the SD card and viewing them in full-screen mode. No matter what the number, it all works fine with a GridView, but when viewing the images in the ViewPager, each bitmap keeps eating up a lot of memory, and after 10 or so images it gives the out of memory error. I have seen almost all the questions

Android pinch zoom large image, memory efficient without losing detail

Submitted by 时间秒杀一切 on 2019-12-03 06:07:12
My app has to display a number of high-resolution images (about 1900x2200 px) and support pinch zoom. To avoid the Out of Memory error I plan to decode each image for full-screen display using options.inSampleSize = scale (scale calculated as a power of 2, as the documentation recommends). (The view I use is TouchImageView, which extends ImageView.) This lets me load images quickly and swipe smoothly between screens (images). However, when I pinch-zoom, my app loses detail because of the scaled image. If I load the full image, I can't load quickly or swipe and drag smoothly after pinch zoom. Then I try to only load the full image when the user begin
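The power-of-two scale the asker mentions is typically computed as below (the method follows the pattern from Android's bitmap-loading guidance; the class name is illustrative). For recovering detail on zoom, a common complement is BitmapRegionDecoder, which decodes only the currently visible rectangle at full resolution instead of the whole image:

```java
public class SampleSize {
    // Keep halving until the decoded image would drop below the
    // requested on-screen dimensions; the result is a power of two.
    static int calculateInSampleSize(int width, int height, int reqWidth, int reqHeight) {
        int inSampleSize = 1;
        if (height > reqHeight || width > reqWidth) {
            int halfHeight = height / 2;
            int halfWidth = width / 2;
            while ((halfHeight / inSampleSize) >= reqHeight
                    && (halfWidth / inSampleSize) >= reqWidth) {
                inSampleSize *= 2;
            }
        }
        return inSampleSize;
    }

    public static void main(String[] args) {
        // A 1900x2200 source decoded for a ~950x1100 on-screen target
        System.out.println(calculateInSampleSize(1900, 2200, 950, 1100)); // prints 2
    }
}
```

With inSampleSize = 2 the decoded bitmap holds a quarter of the pixels, which is why swiping stays smooth but zoomed-in detail is lost.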

How to avoid OutOfMemoryException when running Hadoop?

Submitted by 走远了吗. on 2019-12-03 05:59:24
Question: I'm running a Hadoop job over 1.5 TB of data that does a lot of pattern matching. I have several machines with 16 GB RAM each, and I always get an OutOfMemoryException on this job with this data (I'm using Hive). I would like to know how to optimally set the HADOOP_HEAPSIZE option in hadoop-env.sh so my job will not fail. Is it even possible to set this option so my jobs won't fail? When I set HADOOP_HEAPSIZE to 1.5 GB and removed half of the pattern matching from the query, the job ran successfully. So
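For reference, heap in Hadoop 1.x is split across two settings that are easy to conflate: HADOOP_HEAPSIZE sizes the Hadoop daemons themselves, while the per-task JVMs that actually execute a Hive query's map/reduce work are sized via mapred.child.java.opts. A hedged sketch (the 4096/2048 values are illustrative, not recommendations, and must leave headroom for the tasks running concurrently on each 16 GB node):

```
# hadoop-env.sh -- heap in MB for the Hadoop daemons (NameNode,
# JobTracker, TaskTracker...), NOT for the task JVMs
export HADOOP_HEAPSIZE=4096

# mapred-site.xml -- heap for each spawned task JVM, which is where
# the query's pattern matching runs and where the OOM occurs
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx2048m</value>
</property>
```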

OutOfMemoryException with gcAllowVeryLargeObjects

Submitted by 主宰稳场 on 2019-12-03 05:26:21
I'm using a BinarySerializer with a pretty big (although not very deep) graph of items. I have 8 GB of RAM backed by 12 GB of swap, and I'm getting an OutOfMemoryException when serializing, which is expected (it's possible the graph could go near or over 2 GB). However, when I use gcAllowVeryLargeObjects it's no better; I still get the same exception, and I'm definitely working on something that should fit in memory (at least with the swap). Is there anything I could do to support serializing this / a way to get the same feature set but get the result in chunks maybe? There's nothing special