out-of-memory

Clojure Leiningen REPL OutOfMemoryError Java heap space

Submitted by 孤人 on 2019-12-04 07:18:49
I am trying to parse a fairly small (< 100 MB) XML file with:

```clojure
(require '[clojure.data.xml :as xml]
         '[clojure.java.io :as io])
(xml/parse (io/reader "data/small-sample.xml"))
```

and I am getting an error:

```
OutOfMemoryError Java heap space
clojure.lang.Numbers.byte_array (Numbers.java:1216)
clojure.tools.nrepl.bencode/read-bytes (bencode.clj:101)
clojure.tools.nrepl.bencode/read-netstring* (bencode.clj:153)
clojure.tools.nrepl.bencode/read-token (bencode.clj:244)
clojure.tools.nrepl.bencode/read-bencode (bencode.clj:254)
clojure.tools.nrepl.bencode/token-seq/fn--3178 (bencode.clj:295)
clojure.core …
```
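Note that the stack trace points into nREPL's bencode layer, i.e. the failure happens while the REPL serializes the fully realized parse tree back to the client, not during parsing itself. One general way to avoid holding a whole document in memory is event-based (streaming) parsing. A minimal Java StAX sketch of that idea (counting elements instead of building a tree); this is an illustration written for this digest, not code from the question:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;

public class StreamingXml {
    // Walk the document as a stream of events; nothing but the current
    // event is held in memory, regardless of file size.
    static int countElements(String xml) throws Exception {
        XMLStreamReader r = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        int count = 0;
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT) {
                count++;
            }
        }
        r.close();
        return count;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(countElements("<root><a/><b><c/></b></root>")); // 4
    }
}
```

For a file, pass a `FileReader` (or an `InputStream`) instead of the `StringReader`; the constant-memory property is the point, not the element count.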

.NET System.OutOfMemoryException on String.Split() of 120 MB CSV file

Submitted by 一世执手 on 2019-12-04 06:39:40
I am using C# to read a ~120 MB plain-text CSV file. Initially I parsed it line by line, but recently determined that reading the entire file into memory first was several times faster. The parsing is already quite slow because the CSV has commas embedded inside quotes, which forces me to use a regex split. This is the only one I have found that works reliably:

```csharp
// from http://regexlib.com/REDetails.aspx?regexp_id=621
string[] fields = Regex.Split(line,
    @",(?!(?<=(?:^|,)\s*\x22(?:[^\x22]|\x22\x22|\\\x22)*,)(?:[^\x22]|\x22\x22|\\\x22)*\x22\s*(?:,|$))");
```

In order …
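A lookbehind-heavy regex like the one above backtracks heavily on long lines, and it is rarely necessary: quoted-comma CSV can be split in a single linear scan with a two-state machine. A minimal Java sketch of that scan (the same idea transfers directly to C#); this is an illustration, not code from the question:

```java
import java.util.ArrayList;
import java.util.List;

public class CsvSplit {
    // Split one CSV line on commas, treating "..." sections as literal.
    // A doubled quote ("") inside a quoted field becomes a single quote.
    static List<String> split(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inQuotes = false;
        for (int i = 0; i < line.length(); i++) {
            char c = line.charAt(i);
            if (c == '"') {
                if (inQuotes && i + 1 < line.length() && line.charAt(i + 1) == '"') {
                    cur.append('"'); // escaped quote inside a quoted field
                    i++;
                } else {
                    inQuotes = !inQuotes; // enter or leave quoted section
                }
            } else if (c == ',' && !inQuotes) {
                fields.add(cur.toString());
                cur.setLength(0);
            } else {
                cur.append(c);
            }
        }
        fields.add(cur.toString());
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(split("a,\"b,c\",d")); // [a, b,c, d]
    }
}
```

Because this is O(n) per line with no backtracking, it also removes the incentive to load the whole 120 MB file into memory for speed.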

Why is the connection terminating?

Submitted by 自古美人都是妖i on 2019-12-04 06:06:17
I'm trying to fit a random forest classification model using the H2O library from R, on a training set with 70 million rows and 25 numeric features. The training file is 5.6 GB; the validation file is 1 GB. My system has 16 GB RAM and an 8-core CPU. H2O successfully reads both files into H2O frames. Then I issue the following command to build the model:

```r
model <- h2o.randomForest(x = c(1:18, 20:25), y = 19,
                          training_frame = traindata,
                          validation_frame = testdata,
                          ntrees = 150, mtries = 6)
```

But after a few minutes (without building any tree), I get the following error: "Error …
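A back-of-envelope check suggests why 16 GB is tight here: 70 million rows of 25 numeric features, stored naively as 8-byte doubles, approach the machine's entire RAM before any tree state is allocated. A tiny Java sketch of the arithmetic (assuming uncompressed 8-byte values; H2O compresses columns, so actual usage is lower, but the order of magnitude is the point):

```java
public class HeapEstimate {
    public static void main(String[] args) {
        long rows = 70_000_000L;
        long cols = 25;
        long bytesPerValue = 8; // assumption: uncompressed doubles
        double gib = rows * cols * bytesPerValue / (1024.0 * 1024 * 1024);
        System.out.printf("~%.1f GiB just for the raw matrix%n", gib);
    }
}
```

That comes to roughly 13 GiB for the raw matrix alone, leaving almost nothing for the OS, R, and the forest itself on a 16 GB machine.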

Using Stanford CoreNLP - Java heap space

Submitted by 穿精又带淫゛_ on 2019-12-04 06:00:22
I am trying to use the coreference module of the Stanford CoreNLP pipeline, but I end up getting an OutOfMemoryError in Java. I already increased the heap size (via Run -> Run Configurations -> VM Arguments in Eclipse) and set it to -Xmx3g -Xms1g. I even tried -Xmx12g -Xms4g, but that didn't help either. I'm using Eclipse Juno on OS X 10.8.5 with Java 1.6 on a 64-bit machine. Does anyone have an idea what else I could try? I'm using the example code from the website ( http://nlp.stanford.edu/software/corenlp.shtml ):

```java
Properties props = new Properties();
props.put("annotators", "tokenize, ssplit …
```
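When raising -Xmx appears to have no effect at all, a common Eclipse pitfall is that the VM arguments were set on a different run configuration than the one actually launched. A quick generic check (not from the question) is to print the heap ceiling the running JVM itself reports:

```java
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        // Should be close to the configured -Xmx (e.g. ~3072 MiB for -Xmx3g).
        // If it prints a small default instead, the -Xmx never reached this JVM.
        System.out.printf("max heap: %d MiB%n", maxBytes / (1024 * 1024));
    }
}
```

Running this one-liner inside the same run configuration as the CoreNLP code separates "the flag isn't applied" from "coref genuinely needs more memory".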

Objects not being finalized and Finalizer thread not doing anything

Submitted by 我与影子孤独终老i on 2019-12-04 05:28:46
On our server, we started to have problems with OutOfMemoryError. We analyzed the heap dumps using Eclipse Memory Analyzer and found that many objects were being held for finalization (about 2/3 of the heap). We suspected that some finalize() method might be blocking. I found several bug reports of this problem (here and here), and in those it always manifested itself in the Finalizer thread's stack, which was blocked somewhere. But in our case, this thread was WAITING:

```
"Finalizer" daemon prio=10 tid=0x43e1e000 nid=0x3ff in Object.wait() [0x43dfe000]
   java.lang.Thread.State: WAITING (on object monitor)
```
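Objects with a finalize() method need at least two GC cycles to be reclaimed, and every finalizer in the JVM runs on that single Finalizer thread, so allocation can easily outpace finalization even when the thread is merely WAITING between batches. The usual remedy is explicit, deterministic cleanup instead of finalization, e.g. with java.lang.ref.Cleaner (Java 9+). A minimal sketch of the pattern, not taken from the question:

```java
import java.lang.ref.Cleaner;
import java.util.concurrent.atomic.AtomicBoolean;

public class CleanerDemo {
    private static final Cleaner CLEANER = Cleaner.create();

    // A resource whose cleanup does not depend on the shared Finalizer thread.
    static class Resource implements AutoCloseable {
        private final Cleaner.Cleanable cleanable;
        final AtomicBoolean released = new AtomicBoolean(false);

        Resource() {
            AtomicBoolean flag = released;
            // The cleanup action must not capture `this`, or the Resource
            // would stay reachable and never be cleaned automatically.
            this.cleanable = CLEANER.register(this, () -> flag.set(true));
        }

        @Override
        public void close() {
            cleanable.clean(); // deterministic release; safe to call once
        }
    }

    public static void main(String[] args) {
        Resource r = new Resource();
        r.close();
        System.out.println("released = " + r.released.get()); // released = true
    }
}
```

With close() called explicitly (or via try-with-resources), the Cleaner is only a safety net, and nothing accumulates behind a finalizer queue.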

Ubuntu - elasticsearch - Error: Cannot allocate memory

Submitted by 走远了吗. on 2019-12-04 05:03:00
I'm trying to install Elasticsearch on my local Ubuntu machine following the guide at https://www.elastic.co/guide/en/elasticsearch/reference/current/_installation.html , and when I try to run ./elasticsearch, I get the following error:

```
Java HotSpot(TM) 64-Bit Server VM warning: INFO:
os::commit_memory(0x00007f0e50cc0000, 64075595776, 0) failed;
error='Cannot allocate memory' (errno=12)
There is insufficient memory for the Java Runtime Environment to continue.
Native memory allocation (mmap) failed to map 64075595776 bytes for committing reserved memory
```

Here are the memory stats: total …
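The failed mmap is for 64075595776 bytes, i.e. roughly 60 GB, so the JVM is being asked for far more heap than a typical local machine has. Capping the heap before launch usually resolves this; a hedged config sketch for the older ES_HEAP_SIZE-style startup (the exact variable and mechanism differ across Elasticsearch versions, where newer releases use -Xms/-Xmx in config/jvm.options instead):

```shell
# Cap the Elasticsearch JVM heap; adjust the size to your machine.
export ES_HEAP_SIZE=512m
./bin/elasticsearch
```

If the huge value was not set deliberately, it is worth checking the environment and any previously edited config files for a stray heap setting.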

How to handle try/catch exceptions in Android

Submitted by 邮差的信 on 2019-12-04 04:35:30
I am using a getBitmap method to display images. If it returns a bitmap, I display the image; if it returns null, I catch an exception. But if the URL entered is wrong, it should also handle the FileNotFoundException. How do I handle the two exceptions and report them in the UI?

```java
public Bitmap getBitmap(final String src) {
    try {
        URL url = new URL(src);
        URLConnection connection = url.openConnection();
        InputStream input = connection.getInputStream();
        myBitmaps = BitmapFactory.decodeStream(input);
        return myBitmaps;
    } catch …
```

(The original snippet declared the url variable twice and an unused InputStream stream; those look like paste artifacts and are removed above.)

WP8 Out of Memory error while loading Images

Submitted by 早过忘川 on 2019-12-04 03:49:29
Question: I am working on a Windows Phone 8 app. I am implementing a coverflow feature and trying to load 600 items, but it always throws an Out of Memory error. Code:

```xml
<DataTemplate x:Key="DataTemplate1">
    <Grid VerticalAlignment="Center" HorizontalAlignment="Center">
        <Grid.RowDefinitions>
            <RowDefinition/>
        </Grid.RowDefinitions>
        <Border Grid.Row="0" Height="400" Width="400" CornerRadius="30,30,30,30">
            <Border.Background>
                <ImageBrush ImageSource="Images/sample.png" />
            </Border.Background>
        </Border>
        <Grid Grid.Row= …
```

Uploading files to a server by POST causes OutOfMemory

Submitted by 走远了吗. on 2019-12-04 03:47:45
Question: I'm developing a remote backup app, and sometimes I need to upload big files, for example 15 MB. Testing on some phones, I get an out-of-memory error. Is there a way to make this function use less memory?

```java
public int uploadFile(String sourceFileUri) {
    String fileName = sourceFileUri;
    HttpURLConnection conn = null;
    DataOutputStream dos = null;
    String lineEnd = "\r\n";
    String twoHyphens = "--";
    String boundary = "*****";
    int bytesRead, bytesAvailable, bufferSize;
    byte[] buffer;
    int …
```
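With upload code in this style, the usual OOM cause is buffering: either the file is read fully into a byte array, or HttpURLConnection buffers the entire request body internally to compute Content-Length. Two things help: enable streaming on the connection (conn.setFixedLengthStreamingMode(file.length()) when the size is known, or conn.setChunkedStreamingMode(0) otherwise), and copy with a small fixed buffer. A minimal sketch of the constant-memory copy loop (stream-to-stream, not tied to the question's exact code):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedCopy {
    // Copy input to output with a fixed 8 KiB buffer, so memory use
    // stays constant no matter how large the file is.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100_000]; // stand-in for a large file
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println(copied); // 100000
    }
}
```

In the upload function, `in` would be the FileInputStream and `out` the connection's output stream; with streaming mode enabled, a 15 MB file then needs only the 8 KiB buffer.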

Newtonsoft.Json - Out of memory exception while deserializing big object

Submitted by 雨燕双飞 on 2019-12-04 03:16:50
Question: I have a problem deserializing a JSON file of about 1 GB. When I run the following code I get an out-of-memory exception:

```csharp
using (FileStream sr = new FileStream("myFile.json", FileMode.Open, FileAccess.Read))
{
    using (StreamReader reader = new StreamReader(sr))
    {
        using (JsonReader jsReader = new JsonTextReader(reader))
        {
            JsonSerializer serializer = new JsonSerializer();
            dataObject = serializer.Deserialize<T>(jsReader);
        }
    }
}
```

The exception is thrown by Newtonsoft.Json.Linq.JTokenWriter …