out-of-memory

WebAPI response causes “System.OutOfMemoryException”

删除回忆录丶 submitted on 2019-12-11 07:25:26
Question: I developed a WebAPI service that returns an array of complex custom objects in its response. Some of their fields have many-to-many relations. For example:

[Table("OEReferences", Schema = "public")]
public class OEReference {
    [NotMapped]
    public IList<IAReference> IAReferences { get; set; }
}

[Table("IAReferences", Schema = "public")]
public class IAReference {
    [NotMapped]
    public IList<OEReference> OEReferences { get; set; }
}

Each OEReference object has a list of IAReferences, which at the
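A common cause of this exception is the serializer chasing the OEReference ↔ IAReference cycle and producing an unbounded object graph. A hedged sketch, assuming the service uses Json.NET (the default formatter in ASP.NET Web API 2): configure the serializer to ignore reference loops, or to emit $ref-style references instead of re-serializing each object.

```csharp
// Sketch only -- assumes the default JsonMediaTypeFormatter (Json.NET).
using Newtonsoft.Json;
using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        var settings = config.Formatters.JsonFormatter.SerializerSettings;

        // Stop the serializer from looping over OEReference <-> IAReference.
        settings.ReferenceLoopHandling = ReferenceLoopHandling.Ignore;

        // Alternative: keep the links but serialize each object only once,
        // using Json.NET's $id/$ref notation.
        // settings.PreserveReferencesHandling = PreserveReferencesHandling.Objects;
    }
}
```

For large responses, mapping the entities to flat DTOs (without the back-references) and paginating the array usually helps more than serializer settings alone.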

Large images in WebView cause Out Of Memory

家住魔仙堡 submitted on 2019-12-11 07:01:32
Question: I have an activity that parses an XML news feed and loads the parsed data (text and an image URL) into WebViews, which are inside a Gallery widget. Something like this:

mimeType = "text/html";
encoding = "utf-8";
String html = "<img src=\"" + newsImageUrl.get(position)
        + "\" style=\"max-width:200px; max-height:200px;\" align=\"left\"/>"
        + newsDescription.get(position);
data.loadDataWithBaseURL("", html, mimeType, encoding, "");

Everything works fine, but sometimes inside the news feed there is this
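On Android, the usual fix is to downsample large images before they occupy memory, via BitmapFactory.Options.inSampleSize (decode bounds first with inJustDecodeBounds, then decode with a sample size). The power-of-two sample-size calculation is plain arithmetic and can be sketched without the Android classes; reqWidth/reqHeight are assumed target bounds, e.g. the 200 px slot used in the HTML above.

```java
public class SampleSize {
    // Largest power-of-two inSampleSize that keeps the decoded image at or
    // above the requested bounds (the calculation from the Android
    // "Loading Large Bitmaps Efficiently" guide).
    public static int calculateInSampleSize(int width, int height,
                                            int reqWidth, int reqHeight) {
        int inSampleSize = 1;
        if (height > reqHeight || width > reqWidth) {
            final int halfHeight = height / 2;
            final int halfWidth = width / 2;
            while ((halfHeight / inSampleSize) >= reqHeight
                    && (halfWidth / inSampleSize) >= reqWidth) {
                inSampleSize *= 2;
            }
        }
        return inSampleSize;
    }

    public static void main(String[] args) {
        // A 2048x1536 photo decoded for a 200x200 slot.
        System.out.println(calculateInSampleSize(2048, 1536, 200, 200)); // 4
    }
}
```

For images loaded by the WebView itself (as in the `<img src=...>` above), the CSS max-width only scales the rendering; the full bitmap is still decoded, so decoding a downsampled copy yourself (or letting an image-loading library do it) is what actually saves memory.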

Spring WS + Wss4jSecurityInterceptor + MTOM memory issues

╄→гoц情女王★ submitted on 2019-12-11 06:58:36
Question: This seems to be a somewhat longstanding question with no definitive solution so far: situations where your incoming MTOM messages get inlined into the SOAP message, crashing the application due to memory usage. I'm creating a file-upload web service with Spring WS (2.1) using Apache Axiom (1.2.13), because the files I receive are big:

<bean id="messageFactory" class="org.springframework.ws.soap.axiom.AxiomSoapMessageFactory">
    <property name="payloadCaching" value="true"/>
    <property name=
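One commonly suggested direction (a sketch, not verified against Spring WS 2.1 specifically): turn off payload caching so Axiom streams the message instead of buffering it, and let attachments above a threshold spill to temp files instead of the heap.

```xml
<bean id="messageFactory"
      class="org.springframework.ws.soap.axiom.AxiomSoapMessageFactory">
    <!-- Stream the payload instead of caching the whole SOAP body in memory.
         Caveat: with caching off, the endpoint can read the payload only once. -->
    <property name="payloadCaching" value="false"/>
    <!-- Spill MTOM attachments to temporary files rather than holding them in RAM. -->
    <property name="attachmentCaching" value="true"/>
    <property name="attachmentCacheThreshold" value="4096"/>
</bean>
```

The harder constraint in this setup is the Wss4jSecurityInterceptor: WS-Security signature and encryption processing generally needs the full message in memory, which tends to defeat streaming and is usually the real source of the memory blow-up.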

Boost Bimap takes too much memory in debug build

橙三吉。 submitted on 2019-12-11 06:51:01
Question: I am using quite a few containers of the form

boost::bimap<boost::bimaps::multiset_of<std::string>, boost::bimaps::set_of<AnEnum>>

I define them in a header file that is included in quite a few .cpp files (this is after I limited the header's exposure as much as possible). The .a files created in the debug build grow to over 1 GB, so compilation stops midway with a 'no space on device' error, and compile time has naturally increased dramatically. The

Mysql: what to do when memory tables reach max_heap_table_size?

ⅰ亾dé卋堺 submitted on 2019-12-11 06:45:08
Question: I'm using a MySQL MEMORY table as a way to cache data rows that are read several times. I chose this alternative because I'm not able to use xcache or memcache in my solution. After reading the MySQL manual and forums on this topic, I have concluded that an error is raised when the table reaches its maximum memory size. I want to know if there is a way to catch this error in order to truncate the table and free the memory. I don't want to raise the limit on the memory that can be used; I
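MySQL reports a full MEMORY table as error 1114 (ER_RECORD_FILE_FULL, "The table ... is full"). Inside a stored procedure you can declare a handler for that error code, truncate the cache, and retry the insert; a hedged sketch (table and column names are made up):

```sql
DELIMITER //
CREATE PROCEDURE cache_put(IN p_id INT, IN p_payload VARCHAR(255))
BEGIN
    -- 1114 = ER_RECORD_FILE_FULL: raised when the MEMORY table hits
    -- max_heap_table_size. On that error, empty the cache and retry once.
    DECLARE EXIT HANDLER FOR 1114
    BEGIN
        TRUNCATE TABLE row_cache;
        INSERT INTO row_cache (id, payload) VALUES (p_id, p_payload);
    END;

    INSERT INTO row_cache (id, payload) VALUES (p_id, p_payload);
END //
DELIMITER ;
```

The same error can also be caught application-side: most connectors expose the vendor error code (1114) on the exception, so a try/catch around the insert followed by TRUNCATE TABLE achieves the same thing without a stored procedure.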

android: cleaning up memory on app destroy

断了今生、忘了曾经 submitted on 2019-12-11 06:43:57
Question: I am developing an app that instantiates a bunch of bitmap objects (e.g. buttons that keep cached bitmaps so they don't have to be rendered again and again). Now, I realised that when I repeatedly run and restart the app on my Huawei device, I get an OutOfMemoryException at the point where the app tries to allocate memory for the bitmaps. So I guess it's the bitmaps that cause the trouble. I do know there is a Bitmap.recycle() method, though. Now my question: what is best practice
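A common pattern is to keep every cached bitmap in one registry and release them all from the activity's onDestroy(): call Bitmap.recycle() and drop the references so the heap can be reclaimed before the next launch. The registry itself is plain Java and can be sketched without Android classes; Releasable here is a hypothetical stand-in for Bitmap.

```java
import java.util.HashMap;
import java.util.Map;

public class BitmapRegistry {
    // Stand-in for android.graphics.Bitmap; on Android, release() would
    // call Bitmap.recycle().
    public interface Releasable { void release(); }

    private final Map<String, Releasable> cache = new HashMap<>();

    public void put(String key, Releasable bitmap) { cache.put(key, bitmap); }
    public Releasable get(String key) { return cache.get(key); }

    // Call from Activity.onDestroy(): free each bitmap's pixel memory and
    // clear the map so the garbage collector can reclaim the objects.
    public int releaseAll() {
        int released = 0;
        for (Releasable r : cache.values()) {
            r.release();
            released++;
        }
        cache.clear();
        return released;
    }
}
```

Where bitmap pixel data lives has changed across Android versions (native memory on old and very new releases, the Dalvik heap in between), so recycle() matters most on older devices; dropping the references is necessary in every case, and a size-bounded LruCache is generally preferable to an unbounded map.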

out of memory error when using diag function in matlab

房东的猫 submitted on 2019-12-11 06:38:07
Question: I have an array of doubles M where size(M) = 15000. I need to convert this array to a diagonal matrix with the command diag(M), but I get the famous 'out of memory' error. I run MATLAB with the -nojvm option to gain memory, and with the /3GB switch on Windows. I also tried converting my array to double precision, but the problem persists. Any other ideas?

Answer 1: There are much better ways to do whatever you're probably trying to do than generating the full diagonal matrix (which will be extremely
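The key observation behind that answer: you almost never need the dense 15000×15000 matrix, because multiplying by diag(M) is just an element-wise product (in MATLAB, M(:) .* v, or a sparse diagonal via sparse(1:n, 1:n, M) / spdiags, which stores only n nonzeros). A tiny pure-Python check of the identity diag(m) * v == m .* v:

```python
def diag_matvec(m, v):
    """Multiply diag(m) @ v without building the n-by-n matrix:
    the result is simply the element-wise product m[i] * v[i]."""
    assert len(m) == len(v)
    return [mi * vi for mi, vi in zip(m, v)]

def dense_diag_matvec(m, v):
    """Reference version that materializes the full diagonal matrix --
    the approach that runs out of memory at n = 15000 (1.8 GB of doubles)."""
    n = len(m)
    D = [[m[i] if i == j else 0.0 for j in range(n)] for i in range(n)]
    return [sum(D[i][j] * v[j] for j in range(n)) for i in range(n)]

m = [2.0, 3.0, 5.0]
v = [1.0, 4.0, 6.0]
print(diag_matvec(m, v))                              # [2.0, 12.0, 30.0]
print(diag_matvec(m, v) == dense_diag_matvec(m, v))   # True
```

The same reasoning applies to solves: D \ v is just v ./ M, again with no matrix in sight.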

How to catch tf.errors.ResourceExhaustedError in tensorflow?

亡梦爱人 submitted on 2019-12-11 06:37:19
Question: I am using Bayesian regression for hyperparameter optimization, so for some parameter settings I run out of memory and an error is thrown. The error may be raised after many iterations, or not at all. I am looking to catch this error and then continue running the code normally, but I am not sure where to place the try/except block in TensorFlow. I guess the last method throwing the error is in session.py. Here is the error: ---------------
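The pattern is to wrap one trial (the part that calls Session.run, which is what actually raises tf.errors.ResourceExhaustedError) in try/except and report a sentinel "worst possible" score to the optimizer so the search continues. A hedged sketch with MemoryError standing in for the TensorFlow exception, since no real graph is built here:

```python
def run_trial(params):
    """Stand-in for one training run; raises for oversized configurations,
    the way Session.run raises tf.errors.ResourceExhaustedError."""
    if params["batch_size"] > 512:
        raise MemoryError("out of memory")  # stand-in for the TF OOM error
    return 1.0 / params["batch_size"]

def safe_objective(params):
    # With real TensorFlow this clause would be:
    #     except tf.errors.ResourceExhaustedError:
    try:
        return run_trial(params)
    except MemoryError:
        # Sentinel score: the Bayesian optimizer records this point as bad
        # and steers away from the region, and the loop keeps running.
        return float("inf")

scores = [safe_objective({"batch_size": b}) for b in (128, 1024, 256)]
print(scores)  # [0.0078125, inf, 0.00390625]
```

One caveat: after a genuine GPU OOM the session and graph can be left in a bad state, so it is safer to build and close a fresh session (or subprocess) per trial rather than reusing one across the except clause.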

Comparing 2 very large arraylists in java

六月ゝ 毕业季﹏ submitted on 2019-12-11 06:27:30
Question: What would be the correct approach when you need to compare two very large ArrayLists with each other? These ArrayLists are both 100,000 items in size, and the program definitely crashes when simply comparing item by item:

for (CItem c : cItems) {
    for (CItem r : rItems) {
        if (c.getID().equals(r.getID())) {
            Mismatch m = compareItems(c, r);
            if (m != null) {
                mismatches.add(m);
            }
        }
    }
}

Now I'm not 100% sure how the garbage collection works in this situation, but the errors we get are: java.lang
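The nested loop is O(n·m): 10 billion getID() comparisons for two 100,000-item lists. Indexing one list by ID in a HashMap first makes it O(n + m) with only one extra map of references. A sketch with a minimal CItem; the real compareItems logic is not shown in the question, so a trivial value comparison stands in for it:

```java
import java.util.*;

public class CompareLists {
    static class CItem {
        final String id; final String value;
        CItem(String id, String value) { this.id = id; this.value = value; }
        String getID() { return id; }
    }

    // Returns IDs present in both lists whose values differ -- a stand-in
    // for collecting Mismatch objects via compareItems(c, r).
    static List<String> findMismatches(List<CItem> cItems, List<CItem> rItems) {
        // One pass to index rItems by ID: O(m).
        Map<String, CItem> byId = new HashMap<>();
        for (CItem r : rItems) byId.put(r.getID(), r);
        // One pass over cItems with O(1) lookups: O(n).
        List<String> mismatches = new ArrayList<>();
        for (CItem c : cItems) {
            CItem r = byId.get(c.getID());
            if (r != null && !c.value.equals(r.value)) mismatches.add(c.getID());
        }
        return mismatches;
    }

    public static void main(String[] args) {
        List<CItem> a = Arrays.asList(new CItem("1", "x"), new CItem("2", "y"));
        List<CItem> b = Arrays.asList(new CItem("1", "x"), new CItem("2", "z"));
        System.out.println(findMismatches(a, b));  // [2]
    }
}
```

If the error is heap exhaustion rather than runtime, the map holds only references to existing items, so the extra footprint for 100,000 entries is modest; streaming one list from its source instead of holding both fully in memory reduces it further.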

c# combination process time too slow and out of memory exception

只愿长相守 submitted on 2019-12-11 06:22:17
Question: I have n elements of this kind:

public class Ricerca : IComparable {
    public int id;
    public double Altezza;
    public double lunghezza;
}

I need to calculate all the possible combinations of these elements. My problem is that the method I am using is too slow when I have many elements, and sometimes I get an OutOfMemoryException. Any suggestions? Thanks for your help. This is how I calculate the combinations; I found this method in a C# forum, I didn't write it myself:

public void allMyCombination(List
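The out-of-memory part usually comes from materializing every combination in one giant list before using any of them. A lazy enumerator (yield return) produces one combination at a time, so memory stays O(k) instead of O(C(n, k)). A hedged sketch, independent of the forum method in the question (which is not shown in full):

```csharp
using System.Collections.Generic;

public static class Combinations
{
    // Lazily enumerates all k-element combinations of items in index order.
    // Only the current index vector is kept between iterations.
    public static IEnumerable<List<T>> Choose<T>(IList<T> items, int k)
    {
        int n = items.Count;
        if (k > n) yield break;
        var idx = new int[k];
        for (int i = 0; i < k; i++) idx[i] = i;
        while (true)
        {
            var combo = new List<T>(k);
            foreach (int i in idx) combo.Add(items[i]);
            yield return combo;

            // Advance the index vector to the next combination.
            int pos = k - 1;
            while (pos >= 0 && idx[pos] == n - k + pos) pos--;
            if (pos < 0) yield break;
            idx[pos]++;
            for (int j = pos + 1; j < k; j++) idx[j] = idx[j - 1] + 1;
        }
    }
}
```

Consumed as `foreach (var combo in Combinations.Choose(ricerche, 2)) { ... }`, each combination is processed and discarded immediately. Note that no enumeration trick fixes the combinatorics: the number of subsets grows as 2^n, so for large n the workable strategies are pruning combinations early or restricting k, not generating everything.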