out-of-memory

C# Visual Studio 2017 CPU profiling out of memory exception

落爺英雄遲暮 submitted on 2019-12-10 17:12:15
Question: I've got a long-running C# application that I'm profiling in Visual Studio 2017 Community Edition. It runs for about 2-3 hours before I stop the profiling (and the application) to see the CPU usage. However, I can see that while it's building the reports the memory usage increases by 3 GB and then an out of memory exception is thrown. The profiler output is:
    Profiling started.
    Profiling process ID 7312 (test).
    Starting data collection.
    The output file is C:\Users\jamie\Source\Repos\test(1).vspx
    Profiler stopping.
    Stopping data collection.

Out of memory exception while parsing 5MB JSON response in Android

。_饼干妹妹 submitted on 2019-12-10 17:09:52
Question: I am getting a 5 MB JSON response, which I download and save into a StringBuffer using a 1024-byte array. To parse this response, I have to create a JSONObject whose constructor takes a String. While converting the response into a String, I get an out of memory exception (at stringBufferVar.toString()). From the service I receive at most 5 attachments, each with up to 5 MB of Base64-encoded data. The following is the response from the service: {"result":[{"attachment":{"name"
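
One common way to avoid building the whole body as a single String is to stream-parse the response. Below is a minimal Java sketch (not from the original post) using android.util.JsonReader; the field names ("result", "attachment", "name") follow the truncated excerpt above, and the class and method names are illustrative placeholders.

    // Sketch: stream-parse the response with android.util.JsonReader instead of
    // buffering the whole body in a StringBuffer. Only one attachment entry is
    // held in memory at a time.
    import android.util.JsonReader;

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;

    public final class AttachmentStreamParser {

        public static void parse(InputStream body) throws IOException {
            JsonReader reader = new JsonReader(new InputStreamReader(body, "UTF-8"));
            try {
                reader.beginObject();                      // {
                while (reader.hasNext()) {
                    if ("result".equals(reader.nextName())) {
                        reader.beginArray();               // "result":[
                        while (reader.hasNext()) {
                            readResultEntry(reader);       // one attachment wrapper
                        }
                        reader.endArray();
                    } else {
                        reader.skipValue();
                    }
                }
                reader.endObject();
            } finally {
                reader.close();
            }
        }

        private static void readResultEntry(JsonReader reader) throws IOException {
            reader.beginObject();
            while (reader.hasNext()) {
                if ("attachment".equals(reader.nextName())) {
                    readAttachment(reader);
                } else {
                    reader.skipValue();
                }
            }
            reader.endObject();
        }

        private static void readAttachment(JsonReader reader) throws IOException {
            reader.beginObject();
            while (reader.hasNext()) {
                String field = reader.nextName();
                if ("name".equals(field)) {
                    String name = reader.nextString();
                    // ... handle the attachment name ...
                } else {
                    // Base64 payload and any other fields: skip them, or stream
                    // them to disk here instead of accumulating them in memory.
                    reader.skipValue();
                }
            }
            reader.endObject();
        }
    }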

Stream#filter Runs out of Memory for 1,000,000 items

柔情痞子 submitted on 2019-12-10 16:35:21
Question: Let's say I have a Stream of length 1,000,000 with all 1's.
    scala> val million = Stream.fill(100000000)(1)
    million: scala.collection.immutable.Stream[Int] = Stream(1, ?)
    scala> million filter (x => x % 2 == 0)
    Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
I get an out of memory exception. Then, I tried the same filter call with List:
    scala> val y = List.fill(1000000)(1)
    y: List[Int] = List(1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ...
    scala> y.filter(x => x %

Out of memory exception on 64bit

↘锁芯ラ submitted on 2019-12-10 16:33:30
Question: I'm trying to create the following array:
    int numOfArrays = 50000;
    int lengthOfArrays = 13500;
    long[,] unsortedNumbers = new long[numOfArrays, lengthOfArrays];
but I keep getting the out of memory exception. I'm targeting x64 and I believe I've set the large-address-aware flag (see pic), yet I'm still getting the error. The odd thing is, I have a list in the same program which consumes 16 GB of RAM without any issues. System: 64 GB RAM, 100 GB free on the hard drive.
Answer 1: There's a 2 GB limit in the .NET Framework in 4

Avoiding Out Of Memory Exception in applying filters to Images (Android)

a 夏天 submitted on 2019-12-10 15:43:56
Question: I am trying to apply some filters to an image. To apply the filter, I first have to create an array:
    int[] arr = new int[image.width * image.height]; // to store each pixel
and then I can pass it to the function which will apply the filter. Problem: if I have an image greater than about 500 KB, an OOME is there saying hello to me. What I tried: dividing the full image into four parts, applying the filter to each part, and then joining them, but again I got an OOME on the same line, i.e. when creating the int
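
One common workaround (not from the original post) is to avoid allocating one width*height int[] at all and instead process the bitmap in horizontal strips with Bitmap.getPixels/setPixels. A minimal Java sketch, assuming a mutable Bitmap and a per-pixel filter; applyFilter is a hypothetical placeholder for the actual filter logic.

    // Sketch: run a per-pixel filter over a Bitmap in horizontal strips so the
    // pixel buffer holds only stripHeight * width ints, not width * height.
    import android.graphics.Bitmap;

    public final class StripFilter {

        public static void filterInStrips(Bitmap bitmap, int stripHeight) {
            int width = bitmap.getWidth();
            int height = bitmap.getHeight();
            int[] strip = new int[width * stripHeight];   // reused for every strip

            for (int y = 0; y < height; y += stripHeight) {
                int rows = Math.min(stripHeight, height - y);
                // Copy one strip of pixels out of the bitmap...
                bitmap.getPixels(strip, 0, width, 0, y, width, rows);
                for (int i = 0; i < width * rows; i++) {
                    strip[i] = applyFilter(strip[i]);
                }
                // ...and write the filtered strip back (the bitmap must be mutable).
                bitmap.setPixels(strip, 0, width, 0, y, width, rows);
            }
        }

        // Placeholder per-pixel filter: here, a simple invert of the RGB channels.
        private static int applyFilter(int argb) {
            return (argb & 0xFF000000) | (~argb & 0x00FFFFFF);
        }
    }

This only works for filters that operate pixel by pixel (or row by row); filters that need the whole neighbourhood at once would need overlapping strips.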

Fatal error: Out of memory (allocated 1979711488) (tried to allocate 131072 bytes) error occur while writing xlsx file using phpexcel

五迷三道 submitted on 2019-12-10 15:41:55
Question: I am writing an xlsx file from a database using PHPExcel. I want to write 300,000 records to the xlsx file, but it still throws Fatal error: Out of memory (allocated 1979711488) (tried to allocate 131072 bytes). My PHP version is 5.3.28. I have also set php.ini options and the cell cache, see my code below:
    ini_set('max_execution_time', -1);
    ini_set('memory_limit', '-1');
    $cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_in_memory_gzip;
    $cacheSettings = array('memoryCacheSize' => '-1');
    PHPExcel

How to expire state of dropDuplicates in structured streaming to avoid OOM?

我是研究僧i submitted on 2019-12-10 15:17:58
Question: I want to count the unique accesses for each day using Spark Structured Streaming, so I use the following code:
    .dropDuplicates("uuid")
On the next day, the state maintained for today should be dropped so that I can get the right count of unique accesses for the next day and avoid an OOM. The Spark documentation indicates using dropDuplicates with a watermark, for example:
    .withWatermark("timestamp", "1 day")
    .dropDuplicates("uuid", "timestamp")
but the watermark column must be specified in dropDuplicates
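
For reference, the watermark-plus-dropDuplicates pattern from the excerpt looks roughly like this as a complete unit in the Spark Java API (a sketch, not the poster's code); the column names uuid and timestamp follow the excerpt, and the streaming Dataset "events" is assumed to exist.

    // Sketch of the watermark + dropDuplicates pattern referenced above, using
    // the Spark Structured Streaming Java API. "events" is an assumed streaming
    // Dataset<Row> with "uuid" and "timestamp" (event-time) columns.
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;

    public final class DedupExample {

        public static Dataset<Row> dedupWithWatermark(Dataset<Row> events) {
            // State for keys older than the watermark (1 day behind the maximum
            // event time seen so far) becomes eligible to be dropped, which is
            // what bounds the deduplication state and avoids the OOM.
            return events
                    .withWatermark("timestamp", "1 day")
                    .dropDuplicates(new String[]{"uuid", "timestamp"});
        }
    }

Note that because "timestamp" is part of the dedup key here, rows are only treated as duplicates when both uuid and timestamp match, which is exactly the limitation the question goes on to raise.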

Python script terminated by SIGKILL rather than throwing MemoryError

吃可爱长大的小学妹 submitted on 2019-12-10 14:56:04
Question: Update Again: I have tried to create some simple way to reproduce this, but have not been successful. So far, I have tried various simple array allocations and manipulations, but they all throw a MemoryError rather than just crashing with SIGKILL. For example:
    x = np.asarray(range(999999999))
or:
    x = np.empty([100, 100, 100, 100, 7])
just throw MemoryErrors as they should. I hope to have a simple way to recreate this at some point. End Update. I have a Python script running numpy/scipy and some custom C

function doesn't throw bad_alloc exception

落花浮王杯 submitted on 2019-12-10 14:47:21
Question: I'm trying to do an exercise from Stroustrup's C++PL4 book. The task is: Allocate so much memory using new that bad_alloc is thrown. Report how much memory was allocated and how much time it took. Do this twice: once not writing to the allocated memory and once writing to each element. The following code doesn't throw a std::bad_alloc exception; after I execute the program I get the message "Killed" in the terminal. Also, the following code exits in ~4 seconds, but when I uncomment memory usage

OutOfMemory exception appears while scrolling the list of images

随声附和 submitted on 2019-12-10 13:43:33
Question: I have a list of 70 text items with image icons (which are stored in the drawables folder). If I launch the application for the first time and scroll the list slowly, the exception doesn't occur. When the application is launched for the first time and I scroll the list with a 'fling' action, the following exception occurs:
    java.lang.OutOfMemoryError: bitmap size exceeds VM budget
    at android.graphics.BitmapFactory.nativeDecodeAsset(Native Method)
    at android.graphics.BitmapFactory.decodeStream(BitmapFactory.java
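
A common mitigation for this kind of list-scrolling OOM (not taken from the original thread) is to keep decoded icons in a memory-bounded cache so a fling does not decode the same drawables over and over; downsampling large icons with BitmapFactory.Options.inSampleSize helps as well. A minimal Java sketch using android.util.LruCache; the cache sizing and the resource-id key are illustrative assumptions.

    // Sketch: memory-bounded icon cache for a list adapter, so flinging the list
    // does not repeatedly decode every drawable. The cache is sized to 1/8 of
    // the app's maximum heap, measured in kilobytes.
    import android.content.res.Resources;
    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;
    import android.util.LruCache;

    public final class IconCache {

        private final LruCache<Integer, Bitmap> cache;
        private final Resources resources;

        public IconCache(Resources resources) {
            this.resources = resources;
            int maxKb = (int) (Runtime.getRuntime().maxMemory() / 1024);
            this.cache = new LruCache<Integer, Bitmap>(maxKb / 8) {
                @Override
                protected int sizeOf(Integer key, Bitmap value) {
                    // Measure entries in KB so they compare against maxKb / 8.
                    return value.getByteCount() / 1024;
                }
            };
        }

        // Returns a cached bitmap for the drawable resource, decoding it at most
        // once; least-recently-used icons are evicted when the budget is hit.
        public Bitmap get(int resId) {
            Bitmap bitmap = cache.get(resId);
            if (bitmap == null) {
                bitmap = BitmapFactory.decodeResource(resources, resId);
                if (bitmap != null) {
                    cache.put(resId, bitmap);
                }
            }
            return bitmap;
        }
    }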