CGBitmapContextCreate on the iPhone/iPad


I have a method that needs to parse through a bunch of large PNG images pixel by pixel (the PNGs are 600x600 pixels each). It seems to work great on the Simulator, but on th…

8 Answers
  •  忘掉有多难
    2021-01-12 17:52

    Running 12 times and then crashing sounds like an out-of-memory problem. It might be that the CGContext is internally creating some large autoreleased structures; since you're doing this in a loop, they never get freed, so you run out of memory and die.

    I'm not sure how Core Foundation deals with temporary objects though. I don't think CF objects have the equivalent of autorelease, and a Core Graphics context is almost certainly dealing with CF objects rather than NSObjects.

    To reduce the memory churn in your code, I would suggest refactoring it to create an offscreen CGContext once, before you start processing, reuse it for every image, and release it only when you are done (see the sketch at the end of this answer). That will be faster in any case, since you aren't allocating huge data structures on each pass through the loop.

    I'll wager that will eliminate your crash problem, and I bet it also makes your code much, much faster. Memory allocation is very slow compared to other operations, and you're slinging around some pretty big data structures to handle 600x600 pixel RGBA images.
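
    A minimal sketch of that refactor in Objective-C, assuming 600x600 RGBA images in an NSArray called images and a hypothetical processPixels() routine standing in for your per-pixel parsing:

        // Requires UIKit / CoreGraphics.
        static const size_t kWidth  = 600;
        static const size_t kHeight = 600;
        static const size_t kBytesPerPixel = 4;

        // Create one offscreen RGBA context up front and reuse it for every image.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        void *bitmapData = calloc(kHeight, kWidth * kBytesPerPixel);
        CGContextRef context = CGBitmapContextCreate(bitmapData,
                                                     kWidth, kHeight,
                                                     8,                        // bits per component
                                                     kWidth * kBytesPerPixel,  // bytes per row
                                                     colorSpace,
                                                     (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
        CGColorSpaceRelease(colorSpace);

        for (UIImage *image in images) {
            // Draw the current PNG into the shared context, overwriting the previous contents.
            CGContextDrawImage(context, CGRectMake(0, 0, kWidth, kHeight), image.CGImage);

            // bitmapData now holds kWidth * kHeight RGBA pixels for this image.
            processPixels((uint8_t *)bitmapData, kWidth, kHeight);  // hypothetical per-pixel routine
        }

        // Release the context and buffer once, after all images are processed.
        CGContextRelease(context);
        free(bitmapData);

    The point is that the big allocations (the pixel buffer and the context) happen exactly once, instead of once per image.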
