out-of-memory

ASP.NET MVC: Returning large amounts of data from FileResult

血红的双手。 Submitted on 2019-12-05 05:36:28
I have a file browser application in MVC4 that allows you to download a selected file from a controller. Currently, the FileResult returns the Stream of the file along with the other response headers. While this works fine for smaller files, larger files generate an OutOfMemoryException. What I'd like to do is transmit the file from the controller without buffering it in memory, in a fashion similar to HttpResponse.TransmitFile in WebForms. How can this be accomplished? Answer: You can disable the response buffer before you return the file result: Response.BufferOutput = false; return File

java.lang.OutOfMemoryError in android while saving picture taken from camera

廉价感情. Submitted on 2019-12-05 04:30:55
Question: I have an app in which I need to save my images to the SD card after taking them from the camera. Here is the code: camera.takePicture(myShutterCallback, myPictureCallback_RAW, myPictureCallback_JPG); PictureCallback myPictureCallback_JPG = new PictureCallback() { @Override public void onPictureTaken(byte[] arg0, Camera arg1) { Bitmap bitmapPicture = BitmapFactory.decodeByteArray(arg0, 0, arg0.length); FileOutputStream outStream = null; try { outStream = new FileOutputStream(UploadedFilename); }
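The likely culprit in the snippet above is decoding the JPEG byte[] into a full Bitmap before saving: decodeByteArray allocates the uncompressed image (roughly width × height × 4 bytes), which is what typically triggers the OutOfMemoryError on a large camera picture. If the goal is only to save the photo, the bytes handed to onPictureTaken are already JPEG-encoded and can be written straight to disk. A minimal sketch in plain Java (method and file names are illustrative, not from the original post):

```java
import java.io.FileOutputStream;
import java.io.IOException;

public class SaveJpeg {
    // Write the camera's JPEG bytes straight to disk, without ever
    // decoding them into a Bitmap. The byte[] from onPictureTaken is
    // already a compressed JPEG; decoding it first only costs memory.
    public static void saveJpegBytes(byte[] jpegData, String path) throws IOException {
        try (FileOutputStream out = new FileOutputStream(path)) {
            out.write(jpegData);
        }
    }

    public static void main(String[] args) throws IOException {
        // Stand-in payload: JPEG start-of-image and end-of-image markers.
        byte[] fake = {(byte) 0xFF, (byte) 0xD8, (byte) 0xFF, (byte) 0xD9};
        String path = System.getProperty("java.io.tmpdir") + "/photo_test.jpg";
        saveJpegBytes(fake, path);
        System.out.println(new java.io.File(path).length()); // 4
    }
}
```

If a Bitmap really is needed later (e.g. for a thumbnail), decoding with BitmapFactory.Options.inSampleSize set keeps the allocation small.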

Java Heap Space - How does -Xmx work exactly?

徘徊边缘 Submitted on 2019-12-05 04:13:55
I have encountered the infamous OutOfMemoryError in my application, and instead of simply increasing the amount of heap space available I tried to look into what the problem was, in case there was some sort of leak in my application. I added the JVM parameter -XX:+HeapDumpOnOutOfMemoryError, which creates a heap dump when the OutOfMemoryError is encountered. I then analysed the dump file produced with different profiling tools, and started playing around with the -Xmx parameter and observing patterns. What puzzled me is the following. Why is it that on analyzing the dump I found
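One quick way to see what -Xmx actually bounds is to query the Runtime API at startup. Note that maxMemory() typically reports somewhat less than the -Xmx value, because on HotSpot one survivor space of the young generation is excluded from the figure, which may account for part of the puzzle when comparing dumps against -Xmx. A small sketch:

```java
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();     // upper bound the heap can grow to (set by -Xmx)
        long total = rt.totalMemory(); // memory currently reserved from the OS
        long free = rt.freeMemory();   // unused portion of totalMemory()
        System.out.println("max:   " + max / (1024 * 1024) + " MB");
        System.out.println("total: " + total / (1024 * 1024) + " MB");
        System.out.println("used:  " + (total - free) / (1024 * 1024) + " MB");
    }
}
```

Running it as `java -Xmx256m HeapInfo` shows how the reported maximum relates to the flag you passed.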

Safe maximum amount of nodes in the DOM? [closed]

偶尔善良 Submitted on 2019-12-05 03:51:26
For a web application, given the available memory in a target mobile device [1] running a target mobile browser [2], how can one estimate the maximum number of DOM nodes, including text nodes, that can be generated via HTML or DHTML? How can one calculate the estimate before failure, a crash, or significant degradation in response? Also, is there a hard per-tab limit in any browser? Regarding the prior closure: this is not like the other questions in the comments below. It is asking a very specific question seeking a method for estimation. There is nothing duplicated, broad, or opinion

Can't set memory settings for `sbt start`

本秂侑毒 Submitted on 2019-12-05 03:36:28
I'm trying to run sbt start in a Play Framework application written in Scala, on a machine that is an EC2 t2.micro instance on AWS. But I can't, because "There is insufficient memory for the Java Runtime Environment to continue." The machine has 1 GB of memory, in practice 930 MB free with the remaining OS processes running. It is Ubuntu Server 14.04 LTS. The app is small, cute. Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000d5550000, 715849728, 0) failed; error='Cannot allocate memory' (errno=12) # # There is insufficient memory for the Java
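The log shows the JVM trying to commit about 715 MB up front, which does not fit alongside everything else in 930 MB. One common remedy, sketched here with illustrative sizes, is to cap the heap of the JVM that sbt launches via SBT_OPTS (the sbt launcher reads this variable; the flags themselves are standard HotSpot options):

```shell
# Cap sbt's JVM so it fits in a t2.micro's ~930 MB of free RAM.
# The sizes below are illustrative; tune them to your app.
export SBT_OPTS="-Xms128m -Xmx512m -XX:MaxMetaspaceSize=128m"
sbt start
```

Adding a small swap file is a frequent companion fix on t2.micro instances, since 1 GB leaves little headroom for compilation spikes.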

Apache Spark MLlib - Running KMeans with TF-IDF vectors - Java heap space

随声附和 Submitted on 2019-12-05 03:24:14
Question: I'm trying to run a KMeans on MLlib from a (large) collection of text documents (TF-IDF vectors). Documents are sent through a Lucene English analyzer, and sparse vectors are created by the HashingTF.transform() function. Whatever the degree of parallelism I'm using (through the coalesce function), KMeans.train always returns the OutOfMemory exception below. Any thoughts on how to tackle this issue? Exception in thread "main" java.lang.OutOfMemoryError: Java heap space at scala.reflect
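Since the stack trace is in the "main" thread, the driver heap is the first suspect. Raising driver and executor memory at submit time is a standard first step; the flags below are real spark-submit options, while the class name, jar name, and sizes are purely illustrative:

```shell
# Illustrative spark-submit invocation with larger heaps for a KMeans job.
spark-submit \
  --class com.example.KMeansJob \
  --driver-memory 4g \
  --executor-memory 4g \
  kmeans-job.jar
```

A second lever specific to this workload: KMeans cluster centers are stored dense even when the input vectors are sparse, so memory grows with numFeatures × k. Lowering HashingTF's feature dimension from its large default often helps more than raw heap.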

Python Pandas Merge Causing Memory Overflow

老子叫甜甜 Submitted on 2019-12-05 01:45:05
I'm new to Pandas and am trying to merge a few subsets of data. I'm giving a specific case where this happens, but the question is general: how and why is it happening, and how can I work around it? The data I load is around 85 MB or so, but I often watch my Python session run up to close to 10 GB of memory usage and then give a memory error. I have no idea why this happens, but it's killing me, as I can't even get started looking at the data the way I want to. Here's what I've done. Importing the main data: import requests, zipfile, StringIO import numpy as np import pandas as pd STAR2013url="http:/

How to solve memory segmentation and force FastMM to release memory to OS?

我是研究僧i Submitted on 2019-12-05 00:30:35
Question: Note: this is a 32-bit application, which is not planned to be migrated to 64-bit. I'm working with a very memory-consuming application and have pretty much optimized all the relevant paths with respect to memory allocation/de-allocation. (There are no memory leaks, no handle leaks, and no other kinds of leaks in the application itself, AFAIK and as tested. Third-party libs which I cannot touch are of course candidates, but unlikely in my scenario.) The application will frequently allocate large single and bi

How to maximise the largest contiguous block of memory in the Large Object Heap

前提是你 Submitted on 2019-12-05 00:28:13
Question: The situation is that I am making a WCF call to a remote server which returns an XML document as a string. Most of the time this return value is a few K, sometimes a few dozen K, very occasionally a few hundred K, but very rarely it could be several megabytes (the first problem is that there is no way for me to know). It's these rare occasions that are causing grief. I get a stack trace that starts: System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown. at
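A multi-megabyte string needs one contiguous allocation (and in .NET anything over ~85,000 bytes lands on the Large Object Heap), so a fragmented LOH can fail the allocation even when plenty of total memory is free. The general remedy is to consume the payload as a stream in small fixed-size chunks rather than one contiguous buffer. The pattern is language-neutral; here is a sketch in Java for illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedCopy {
    // Copy an arbitrarily large stream using a small fixed buffer, so no
    // single allocation ever needs a multi-megabyte contiguous block.
    public static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100_000]; // stand-in for a large XML payload
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println(copied); // 100000
    }
}
```

In the WCF case the equivalent move is to switch the contract from returning a string to a streamed transfer, so the document is never materialised as one object.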

Errno::ENOMEM: Cannot allocate memory - cat

最后都变了- Submitted on 2019-12-04 23:39:43
I have a job running in production which processes XML files. There are around 4k XML files, 8 to 9 GB altogether. After processing, we get CSV files as output. I have a cat command which merges all the CSV files into a single file, and I'm getting Errno::ENOMEM: Cannot allocate memory on the cat (backtick) command. Below are a few details: System memory - 4 GB. Swap - 2 GB. Ruby: 1.9.3p286. Files are processed using nokogiri and saxbuilder-0.0.8. There is a block of code which processes the 4,000 XML files, and the output is saved in CSV (1 per XML) (sorry, I'm not supposed to share it because of
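Backticks in Ruby fork the interpreter, and fork can need to reserve the parent's whole address space; when the parent process has ballooned while parsing 8-9 GB of XML, that reservation can fail with ENOMEM even though cat itself is tiny. Merging the files in-process, one at a time, avoids the fork entirely. A sketch of that pattern (written in Java for illustration; file names are made up):

```java
import java.io.BufferedOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class MergeCsv {
    // Append each CSV to the output one at a time. No subprocess is
    // forked, and no file is ever read into memory whole.
    public static void merge(List<Path> inputs, Path dest) throws IOException {
        try (OutputStream out = new BufferedOutputStream(Files.newOutputStream(dest))) {
            for (Path p : inputs) {
                Files.copy(p, out);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("csv");
        Path a = Files.write(dir.resolve("a.csv"), "1,2\n".getBytes());
        Path b = Files.write(dir.resolve("b.csv"), "3,4\n".getBytes());
        Path merged = dir.resolve("merged.csv");
        merge(List.of(a, b), merged);
        System.out.print(new String(Files.readAllBytes(merged))); // prints the two rows in order
    }
}
```

The same idea in the original Ruby would be a loop appending each file with IO.copy_stream, rather than shelling out to cat.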