How would you detect touches only on non-transparent pixels of a UIImageView, efficiently?
Consider an image like the one below, displayed with a UIImageView.
Well, if you need to do it really fast, you need to precalculate the mask.
Here's how to extract it:
UIImage *image = [UIImage imageNamed:@"some_image.png"];
// CGDataProviderCopyData follows the Create rule, so hand ownership to ARC
NSData *data = (NSData *)CFBridgingRelease(CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage)));
const unsigned char *pixels = (const unsigned char *)data.bytes;
// One BOOL per pixel; the raw buffer holds 4 bytes (RGBA) per pixel
BOOL *mask = (BOOL *)malloc(data.length / 4);
for (NSUInteger i = 0; i < data.length; i += 4) {
    // Non-transparent means alpha > 0. The alpha byte's position depends on
    // CGImageGetAlphaInfo; an RGBA (alpha-last) layout is assumed here.
    mask[i >> 2] = pixels[i + 3] != 0;
}
// TODO: save mask somewhere
Alternatively, you could use the 1x1 bitmap context solution (drawing each pixel into a one-pixel context and reading back its alpha) to precalculate the mask. Having a mask means you can check any point at the cost of one indexed memory access.
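For illustration, here's a sketch of that lookup; the MaskHit name, and the idea that you saved the pixel width and height alongside the mask, are assumptions on my part:

#import <Foundation/Foundation.h>

// One-memory-access hit test against the precalculated mask.
// width/height are the image's pixel dimensions, saved with the mask.
static BOOL MaskHit(const BOOL *mask, size_t width, size_t height, int x, int y) {
    if (x < 0 || y < 0 || (size_t)x >= width || (size_t)y >= height) return NO;
    return mask[(size_t)y * width + (size_t)x];
}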
As for checking an area bigger than one pixel, I would sample pixels on a circle centered at the touch point; about 16 points on the circle should be enough.
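A possible sketch of that sampling, reusing the hypothetical MaskHit helper from above (the 16-point count and the radius are tuning parameters, not fixed values):

#import <CoreGraphics/CoreGraphics.h>

// Returns YES if the touch point itself, or any of 16 points on a circle
// of the given radius around it, lands on a non-transparent pixel.
static BOOL MaskHitInCircle(const BOOL *mask, size_t width, size_t height,
                            CGPoint center, CGFloat radius) {
    if (MaskHit(mask, width, height, (int)lround(center.x), (int)lround(center.y))) return YES;
    for (int i = 0; i < 16; i++) {
        CGFloat angle = i * (2.0 * M_PI / 16.0);
        int x = (int)lround(center.x + radius * cos(angle));
        int y = (int)lround(center.y + radius * sin(angle));
        if (MaskHit(mask, width, height, x, y)) return YES;
    }
    return NO;
}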
To also detect touches on inner (transparent) pixels, you need another precalculation step: find the convex hull of the mask. You can do that with the "Graham scan" algorithm http://softsurfer.com/Archive/algorithm_0109/algorithm_0109.htm Then either fill that area in the mask, or save the hull polygon and use a point-in-polygon test instead.
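If you take the polygon route, a standard ray-casting (even-odd) test is enough; a sketch, assuming the hull is stored as a plain CGPoint array:

#import <CoreGraphics/CoreGraphics.h>

// Classic even-odd point-in-polygon test: cast a horizontal ray from p and
// count edge crossings; an odd count means the point is inside the hull.
static BOOL PointInPolygon(CGPoint p, const CGPoint *poly, size_t count) {
    BOOL inside = NO;
    for (size_t i = 0, j = count - 1; i < count; j = i++) {
        if (((poly[i].y > p.y) != (poly[j].y > p.y)) &&
            (p.x < (poly[j].x - poly[i].x) * (p.y - poly[i].y) /
                       (poly[j].y - poly[i].y) + poly[i].x)) {
            inside = !inside;
        }
    }
    return inside;
}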
And finally, if the image view has a transform applied, you need to convert the point coordinates from screen space to image space, and then you can just check the precalculated mask.
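As a sketch of that conversion, assuming the simple case of UIViewContentModeScaleToFill (aspect-fit/fill modes need extra offset math, not shown here). Note that -[UITouch locationInView:] already accounts for the view's transform, so only the view-to-pixel scaling remains:

#import <UIKit/UIKit.h>

// Map a point in the image view's coordinate space to pixel coordinates of
// the mask. Assumes the image is stretched to fill the view's bounds.
static CGPoint PixelPointForViewPoint(CGPoint viewPoint, UIImageView *imageView) {
    CGFloat pixelWidth  = (CGFloat)CGImageGetWidth(imageView.image.CGImage);
    CGFloat pixelHeight = (CGFloat)CGImageGetHeight(imageView.image.CGImage);
    CGFloat sx = pixelWidth  / imageView.bounds.size.width;
    CGFloat sy = pixelHeight / imageView.bounds.size.height;
    return CGPointMake(viewPoint.x * sx, viewPoint.y * sy);
}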