Question
I own a Samsung Galaxy S3, which is capable of capturing pictures of size ~3000 x 2000, and I am currently developing an application that requires capturing pictures. I use my phone for debugging, and I set the picture size to the best the device offers.
However, with this setting the onPictureTaken callback throws an out of memory error on its very first line, in the BitmapFactory.decodeByteArray call where I try to decode the captured bytes into a bitmap. If I set BitmapFactory.Options.inSampleSize = 2, no out of memory error occurs.
I want the application to capture at the best quality the device offers. The device manages this in its own camera application, but I can't in mine, and I don't understand why. How can I overcome this problem?
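For reference, the callback described above looks roughly like this (a minimal sketch; the field name and the fixed inSampleSize = 2 are only illustrative):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.hardware.Camera;

// inside the Activity/Fragment that owns the Camera
private final Camera.PictureCallback pictureCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        BitmapFactory.Options options = new BitmapFactory.Options();
        // Without sampling, a ~3000 x 2000 ARGB_8888 bitmap needs ~24 MB of heap,
        // which exceeds the per-app limit on many devices and causes the OOM here.
        options.inSampleSize = 2;
        Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length, options);
        // ... use the bitmap, then recycle it when it is no longer needed
    }
};
```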
Answer 1:
Have a look at this video: http://www.youtube.com/watch?v=_CruQY55HOk (Android custom view Bitmap memory leak). It covers the MAT analyzer, which should help you track the allocations. Also, recycle bitmaps when they are no longer in use.
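Recycling looks like this (a minimal sketch; `bitmap` stands for whatever reference you decoded earlier):

```java
// Free the pixel memory as soon as the bitmap is no longer displayed.
// On pre-Honeycomb devices the pixels live in native memory, so waiting
// for the garbage collector is often too late.
if (bitmap != null && !bitmap.isRecycled()) {
    bitmap.recycle();
    bitmap = null;
}
```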
Answer 2:
An Android application has to work within roughly 16 MB of heap memory. If you keep inSampleSize at 1 to preserve the image quality as it is, or if you don't perform any sampling at all, you are likely to get the out of memory exception.
Check out this link and the sample application: Displaying Bitmaps Efficiently
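The pattern that guide recommends is to read only the image bounds first, compute a power-of-two inSampleSize that fits the size you actually need, and then decode for real. A sketch adapted for the byte array delivered to onPictureTaken (reqWidth/reqHeight are whatever your UI requires, not values from the original post):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public final class BitmapDecoder {

    /** Decodes the JPEG bytes from onPictureTaken, scaled down to roughly reqWidth x reqHeight. */
    public static Bitmap decodeSampledBitmap(byte[] data, int reqWidth, int reqHeight) {
        // First pass: read only the dimensions, no pixel memory is allocated.
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;
        BitmapFactory.decodeByteArray(data, 0, data.length, options);

        // Second pass: decode with a sample size that fits the target.
        options.inSampleSize = calculateInSampleSize(options, reqWidth, reqHeight);
        options.inJustDecodeBounds = false;
        return BitmapFactory.decodeByteArray(data, 0, data.length, options);
    }

    /** Largest power of two that keeps both dimensions at or above the requested size. */
    private static int calculateInSampleSize(BitmapFactory.Options options,
                                             int reqWidth, int reqHeight) {
        final int height = options.outHeight;
        final int width = options.outWidth;
        int inSampleSize = 1;
        if (height > reqHeight || width > reqWidth) {
            final int halfHeight = height / 2;
            final int halfWidth = width / 2;
            while ((halfHeight / inSampleSize) >= reqHeight
                    && (halfWidth / inSampleSize) >= reqWidth) {
                inSampleSize *= 2;
            }
        }
        return inSampleSize;
    }
}
```

The two-pass approach matters because inJustDecodeBounds lets you learn the picture's dimensions without allocating any pixel data, so the expensive allocation only happens once, at the reduced size.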
Source: https://stackoverflow.com/questions/13725082/android-onpicturetaken-callback-throws-out-of-memory-exception-in-bitmap-decodeb