Detect black pixel in image iOS

Asked by 一生所求, 2020-12-16 08:51

As of now I am searching every pixel one by one, checking the color and seeing if it's black... if it isn't, I move on to the next pixel. This is taking forever, as I can only check…

3 Answers
  • 2020-12-16 09:18

    Why are you using a timer at all? Why not just have a double for loop in your function that loops over all possible x- and y-coordinates in the image? Surely that would be waaaay faster than only checking at most 100 pixels per second. You would want to have the x (width) coordinates in the outer loop and the y (height) coordinates in the inner loop so that you are effectively scanning one column of pixels at a time from left to right, since you are trying to find the leftmost black pixel.

    Also, are you sure that each pixel in your image has a 4-byte (UInt32) representation? A standard bitmap would have 3 bytes per pixel. To check whether a pixel is close to black, you would just examine each byte in the pixel separately and make sure they are all less than some threshold.

    EDIT: OK, since you are using UIGetScreenImage, I'm going to assume that it is 4-bytes per pixel.

    const UInt8 *pixels = CFDataGetBytePtr(imageData);
    UInt8 blackThreshold = 10; // or some value close to 0
    int bytesPerPixel = 4;
    // width1/height1 are the image's dimensions in pixels;
    // scanning x in the outer loop walks one column at a time, left to right
    for(int x = 0; x < width1; x++) {
      for(int y = 0; y < height1; y++) {
        int pixelStartIndex = (x + (y * width1)) * bytesPerPixel;
        UInt8 alphaVal = pixels[pixelStartIndex]; // can probably ignore this value
        UInt8 redVal = pixels[pixelStartIndex + 1];
        UInt8 greenVal = pixels[pixelStartIndex + 2];
        UInt8 blueVal = pixels[pixelStartIndex + 3];
        if(redVal < blackThreshold && greenVal < blackThreshold && blueVal < blackThreshold) {
          // This pixel is close to black... do something with it
        }
      }
    }
    

    If it turns out that bytesPerPixel is 3, then change that value accordingly, remove the alphaVal from the for loop, and subtract 1 from the indices of the red, green, and blue values.

    Also, my current understanding is that UIGetScreenImage is a private API, and Apple may reject your app for using it.

  • 2020-12-16 09:19

    I'm not an expert on pixel-level image processing, but my first thought is: why are you using a timer to do this? That incurs lots of overhead and makes the code less clear to read. (I think it also renders it thread-unsafe.) The overhead is not just from the timer itself but because you are doing all the data setup each time through.

    How about using a loop instead to iterate over the pixels?

    Also, you are leaking imageData (since you create it with a "Copy" method and never release it). Currently you are doing this once per timer fire (and your imageData is probably pretty big if you are working on all but the tiniest images), so you are probably leaking tons of memory.
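    To sketch the fix: under Core Foundation's Create Rule, anything obtained from a *Copy* or *Create* call must be balanced with a CFRelease. Something along these lines (a non-runnable sketch assuming Apple's CoreGraphics/CoreFoundation APIs; whether UIGetScreenImage itself returns a retained image was never documented, so the CGImageRelease is an assumption):

```c
// In an ordinary function, not a timer callback:
CGImageRef screenImage = UIGetScreenImage();  // private API
CFDataRef imageData =
    CGDataProviderCopyData(CGImageGetDataProvider(screenImage));
const UInt8 *pixels = CFDataGetBytePtr(imageData);

// ... scan all pixels in a single double loop, right here ...

CFRelease(imageData);        // balances CGDataProviderCopyData
CGImageRelease(screenImage); // assumption: balances UIGetScreenImage
```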

  • 2020-12-16 09:27

    There is no way you should be doing this with a timer (or no reason I can think of anyway!).

    How big are your images? It should be viable to process the entire image in a single loop reasonably quickly.
