out-of-memory

iOS Download & Parsing Large JSON responses is causing CFData (store) leaks

微笑、不失礼 submitted on 2019-12-01 14:32:53
The first time a user opens my app I need to download lots of data, all of which comes from the server as JSON. Depending on the user, these JSON files can be anywhere from 10 kB to 30 MB each, and there are 10+ of them. I have no problem when a JSON has no more than 500 or so records, but as I said, some have 10,000+ records and can be up to 30 MB in size. When downloading the larger JSONs, my app allocates a ton of memory until I eventually get memory warnings and the app blows up. It seems the CFData must be the NSMutableData that I am building in didReceiveData. As I
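The usual fix for this class of problem is to stream the response to disk in fixed-size chunks instead of accumulating it in a growing in-memory buffer. Here is an analogous sketch in Python (not the asker's Objective-C); the function name and chunk size are illustrative, and an in-memory stream stands in for the HTTP response body:

```python
import io
import json
import shutil
import tempfile

def stream_to_file(source, dest_path, chunk_size=64 * 1024):
    """Copy a response-like stream to disk in fixed-size chunks so the
    whole payload never has to live in memory at once."""
    with open(dest_path, "wb") as out:
        shutil.copyfileobj(source, out, chunk_size)

# Demo with an in-memory stand-in for the HTTP response body.
payload = json.dumps({"records": list(range(1000))}).encode("utf-8")
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    path = tmp.name
stream_to_file(io.BytesIO(payload), path)

# Parse from disk once the download is complete.
with open(path, "rb") as f:
    data = json.load(f)
print(len(data["records"]))  # 1000
```

The same pattern applies on iOS: write each `didReceiveData` chunk to a file handle rather than appending to an `NSMutableData`, then parse from the file.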

Leaked unreferenced byte[] originally from bitmap but recycled() causing memory-leak (until activity stopped)

冷暖自知 submitted on 2019-12-01 14:19:55
I have a memory leak of bitmaps causing out-of-memory errors. I ran the tests on Android 5.0 (Samsung S5) and investigated the issue using Android Studio (1.5.1 + 2.0.0 Preview 7). The HPROF memory dump shows multiple byte[] arrays that correspond exactly to a particular huge bitmap I use temporarily. If I keep references to the bitmap, Android Studio shows me a Bitmap with 11 MB dominating size and a byte[] with 11 MB shallow size. If I don't keep references to the bitmaps, some of the bitmaps become garbage collected and some end up as byte[] without incoming

OutOfMemoryError as a result of multiple searches

你离开我真会死。 submitted on 2019-12-01 13:32:12
Question: I have a classic Java EE system: a Web tier with JSF, EJB 3 for the business logic, and Hibernate 3 doing the data access to a DB2 database. I am struggling with the following scenario: a user initiates a process which involves retrieving a large data set from the database. The retrieval takes some time, so the user does not receive an immediate response, gets impatient, opens a new browser, and initiates the retrieval again, sometimes multiple times. The EJB container is obviously unaware
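One common mitigation for this scenario is request coalescing: if an identical retrieval is already in flight, later callers wait for its result instead of launching the expensive query again. A minimal sketch of the idea in Python (not the asker's Java EE stack; `fetch_once` and the key scheme are illustrative):

```python
import threading
import time
from concurrent.futures import Future

_inflight = {}
_lock = threading.Lock()

def fetch_once(key, loader):
    """Coalesce concurrent identical requests: the first caller runs the
    expensive loader, later overlapping callers wait on its Future."""
    with _lock:
        fut = _inflight.get(key)
        owner = fut is None
        if owner:
            fut = Future()
            _inflight[key] = fut
    if owner:
        try:
            result = loader(key)
        except Exception as exc:
            fut.set_exception(exc)
            raise
        else:
            fut.set_result(result)
        finally:
            with _lock:
                _inflight.pop(key, None)
    return fut.result()

# Demo: several threads asking for the same "search" at once.
calls = []
def slow_search(key):
    calls.append(key)
    time.sleep(0.3)  # pretend this is the expensive DB retrieval
    return key.upper()

results = []
threads = [threading.Thread(target=lambda: results.append(fetch_once("report-42", slow_search)))
           for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)
```

In a Java EE setting the equivalent would key in-flight queries per user/parameters (e.g. in a singleton bean or cache) so duplicate submissions share one result.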

Android - Many OutOfMemoryError exceptions only on single Activity with MapView

微笑、不失礼 submitted on 2019-12-01 12:34:59
Question: I'm getting quite a few OutOfMemoryError reports from my users, and every single report is from the same Activity, which contains a MapView. I'm thinking that it's an isolated exception in just this one place in my app, and I can't figure out what the problem is. Can anybody give me some pointers as to why this is happening? I've removed some unneeded code for this question, so if somebody thinks the issue could potentially be in that, I'll post it. Stack Traces Stack Trace #1 java.lang

How to process large video in Matlab with for loop and without memory error

怎甘沉沦 submitted on 2019-12-01 12:20:29
I'm new to Matlab processing, and I would like to read and process a large video (more than 200k frames) inside a for loop (or without one). In particular, I would like to: read the video with VideoReader; subdivide the video into n epochs of 1000 frames each; process every epoch of 1000 frames by reading the first frame of the epoch, skipping two, reading a frame, skipping two, and so on (for example i=1:3:nFrames); for every epoch, convert every RGB frame read with im2bw; after the conversion, compute the corr2 2D cross-correlation against the first video frame ("mov
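The key to avoiding memory errors here is to never hold a whole epoch in memory: plan which frame indices to read, then process them one at a time. A language-agnostic sketch of that index plan in Python (not Matlab; the function name and sizes mirror the question's numbers):

```python
def epochs(n_frames, epoch_size=1000, stride=3):
    """Yield, for each epoch of `epoch_size` frames, the indices of the
    frames to actually read: the first frame, then every third one."""
    for start in range(0, n_frames, epoch_size):
        end = min(start + epoch_size, n_frames)
        yield list(range(start, end, stride))

# Example: 2500 frames in epochs of 1000, sampling every 3rd frame.
plan = list(epochs(2500))
print(len(plan))    # 3 epochs
print(plan[0][:4])  # [0, 3, 6, 9]
```

In Matlab the same loop would call `read(v, i)` for each planned index, convert that single frame with `im2bw`, run `corr2`, and discard the frame before reading the next one.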

Free memory in Python

断了今生、忘了曾经 submitted on 2019-12-01 11:40:54
How can I free part of a list's memory in Python? Can I do it in the following manner: del list[0:j] or, for a single list node: del list[j] Note: my script analyzes huge lists and creates huge output, which is why I need immediate memory deallocation. You cannot really free memory manually in Python. Using del decreases the reference count of an object. Once that reference count reaches zero, the object will be freed when the garbage collector is run. So the best you can do is run gc.collect() manually after del-ing a bunch of objects. In these cases the best advice is usually to try and change
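The del-then-collect pattern described in the answer looks like this in practice (sizes are illustrative):

```python
import gc

big = [bytearray(1024) for _ in range(1000)]  # ~1 MB of small buffers

del big[0:500]   # drops the list's references to the first 500 chunks
n_left = len(big)
print(n_left)    # 500

# del only removes references; run a collection pass right away so
# any now-unreachable objects (including reference cycles) are freed.
unreachable = gc.collect()
```

Note that `del big[0:j]` frees the referenced objects immediately only if nothing else references them; `gc.collect()` mainly helps when cycles are involved.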

Memory error only in Spyder IDE

匆匆过客 submitted on 2019-12-01 11:29:16
Doing the following causes a MemoryError in my Spyder Python IDE: >>> from numpy import * >>> a_flt = ones((7000,7000), dtype=float64)+4 >>> b_flt = ones((7000,7000), dtype=float64)+1 Traceback (most recent call last): File "<stdin>", line 1, in <module> MemoryError >>> This is weird, since the memory usage in the status bar of Spyder shows that only approx. 25% of my memory is used. Furthermore, when generating an even higher number of these large 7000*7000 arrays in the standard Python IDE GUI, everything works fine. >>> from numpy import * >>> a_flt = ones((7000,7000), dtype=float64)+4 >>> b
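A quick back-of-envelope calculation shows how heavy these arrays are, which is often the real issue when an IDE's Python process is 32-bit or has a smaller address space than the system interpreter:

```python
# Each 7000x7000 float64 array needs n*n*8 bytes.
n = 7000
array_bytes = n * n * 8
array_mb = array_bytes / 1024**2
print(round(array_mb))  # ~374 MB per array

# Note: ones((7000,7000)) + 4 also materializes a temporary array,
# so peak usage while evaluating one such line is roughly double that.
```

Two arrays plus temporaries can approach 1.5 GB of contiguous allocations, which can fail in a 32-bit process even when the machine has plenty of free RAM.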

Call GC.Collect Before Throwing OutOfMemoryException

烂漫一生 submitted on 2019-12-01 10:49:39
Is there any way to get GC.Collect() called before an OutOfMemoryException is thrown? I suppose I'm looking for the following code flow: try to allocate memory; on success, return; on failure, call GC.Collect() and try to allocate again; only if that also fails, throw a new OutOfMemoryException. I'm writing a caching implementation and am currently running into memory exceptions, which I work around with: If GC.GetTotalMemory(False) >= cache.CacheMemoryLimit + (100 * 1024 * 1024) Then ' When total memory exceeds CacheMemoryLimit + 100 MB GC.Collect() End If Maybe I'm not understanding your
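The allocate/collect/retry flow the asker describes can be sketched as a small wrapper. This is an analogous sketch in Python rather than .NET (where the runtime's own collection behavior applies); `allocate_with_retry` and the demo factory are illustrative names:

```python
import gc

def allocate_with_retry(factory):
    """Attempt an allocation; on MemoryError, force a full collection
    and retry once before letting the error propagate."""
    try:
        return factory()
    except MemoryError:
        gc.collect()
        return factory()

# Demo with a stand-in allocation that fails on its first attempt.
attempts = []
def flaky_alloc():
    attempts.append(1)
    if len(attempts) == 1:
        raise MemoryError("simulated allocation failure")
    return bytearray(16)

buf = allocate_with_retry(flaky_alloc)
print(len(attempts))  # 2
```

For a cache, the cleaner design is usually to evict entries under memory pressure rather than to force collections, since a full collection is expensive and may not release what the cache itself still references.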
