Question
I'm comparing the "color distance" between two images of the same width and height to see how similar they are. The measure of similarity is simply comparing them pixel by pixel and seeing how far apart each of their color channels is.
- (NSNumber *)calculateFitness:(NSImage *)currentImage
           andDestinationImage:(NSImage *)destinationImage {
    NSData *tiffData = [currentImage TIFFRepresentation];
    NSBitmapImageRep *currentImageRep = [NSBitmapImageRep imageRepWithData:tiffData];
    NSData *destinationImageTiffData = [destinationImage TIFFRepresentation];
    NSBitmapImageRep *destinationImageRep = [NSBitmapImageRep imageRepWithData:destinationImageTiffData];

    long fitnessScore = 0;
    for (int width = 0; width < currentImageRep.size.width; width++) {
        for (int height = 0; height < currentImageRep.size.height; height++) {
            NSColor *destinationColor = [destinationImageRep colorAtX:width y:height];
            NSColor *currentColor = [currentImageRep colorAtX:width y:height];

            CGFloat deltaRed = (currentColor.redComponent - destinationColor.redComponent) * 255;
            CGFloat deltaGreen = (currentColor.greenComponent - destinationColor.greenComponent) * 255;
            CGFloat deltaBlue = (currentColor.blueComponent - destinationColor.blueComponent) * 255;

            fitnessScore += (deltaRed * deltaRed) +
                            (deltaGreen * deltaGreen) +
                            (deltaBlue * deltaBlue);
        }
    }
    return @(fitnessScore);
}
I call this method many, many times in my program to compare the fitness of thousands of images to one another. What I'm noticing in Instruments is that the number of living NSCalibratedRGBColor objects keeps growing, and it's due to the destinationColor and currentColor objects created by -[NSBitmapImageRep colorAtX:y:] above. Eventually, my entire system memory is consumed.
So - is there a reason why this happens? What am I doing wrong? Is there a more efficient way to get the raw bitmap data for my images?
Thanks
Mustafa
Answer 1:
You might get better performance by using the raw bitmap data. NSBitmapImageRep's -colorAtX:y: (and -getPixel:atX:y:) are quite slow if you're going through all the image data. Also, all the NSColor objects allocated will be held in the autorelease pool until your app returns to the main run loop.
// Assumes meshed (non-planar) data with 4 bytes per pixel (e.g. RGBA).
unsigned char *currentData = [currentImageRep bitmapData];
unsigned char *destinationData = [destinationImageRep bitmapData];

NSInteger width = [currentImageRep pixelsWide];
NSInteger height = [currentImageRep pixelsHigh];
NSInteger currentBytesPerRow = [currentImageRep bytesPerRow];
NSInteger destBytesPerRow = [destinationImageRep bytesPerRow];

long fitnessScore = 0;
for (NSInteger y = 0; y < height; y++) {
    for (NSInteger x = 0; x < width; x++) {
        unsigned char *srcPixel = currentData + ((x * 4) + (y * currentBytesPerRow));
        unsigned char *destPixel = destinationData + ((x * 4) + (y * destBytesPerRow));

        // Use int, not char: component values run 0-255, and a signed
        // char would wrap anything above 127.
        int sr = srcPixel[0], sg = srcPixel[1], sb = srcPixel[2];
        int dr = destPixel[0], dg = destPixel[1], db = destPixel[2];

        int deltaRed = sr - dr;
        int deltaGreen = sg - dg;
        int deltaBlue = sb - db;

        fitnessScore += (deltaRed * deltaRed) +
                        (deltaGreen * deltaGreen) +
                        (deltaBlue * deltaBlue);
    }
}
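One caveat: indexing with x * 4 assumes both reps are 32-bit meshed pixels. A quick sanity check before the loop, sketched with NSBitmapImageRep's standard accessors, might look like:

// Bail out (or convert to a known format first) if the layout differs.
if ([currentImageRep isPlanar] || [currentImageRep bitsPerPixel] != 32 ||
    [destinationImageRep isPlanar] || [destinationImageRep bitsPerPixel] != 32) {
    return nil;
}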
I wrote a piece on doing fast colour analysis, https://medium.com/@iainx/fast-colour-analysis-d8b6422c1135, and this was one of the things I discovered.
Source: https://stackoverflow.com/questions/24565499/nsbitmapimagerepcoloratxy-returned-nscolor-keeps-growing-on-the-heap