maximum image size in CIFilter / CIKernel?


Question


Does anyone know what the limitations are on image size with custom CIFilters? I've created a filter that performs as expected when images are up to 2 megapixels, but produces very strange results when the images are larger. I've tested this both in my Cocoa app and in Quartz Composer. The filter I've developed is a geometry-distortion filter that (I think) requires an ROI and a DOD spanning the entire input image. I created this filter for remapping panoramic images, so I'd like it to work on very large (50-100 megapixel) images.

As a simple test, consider the following kernel (it can be used in Quartz Composer), which simply translates the image so that its lower-left corner ends up at the center (I know this could be done with an affine transform, but I need to perform such an operation inside a more complex filter). This filter works as expected when the image is 2000x1000 but produces odd results when the input image is 4000x2000 pixels: either the translation does not move the corner exactly to the center, or the image output disappears entirely. I've noticed other odd problems with more complicated filters on large images, but I think this simple filter illustrates my issue and can be replicated in Quartz Composer.

kernel vec4 equidistantProjection(sampler src, __color color)
{
     // Coordinate of the pixel being computed, in sampler space.
     vec2 coordinate = samplerCoord(src);
     vec2 result;
     vec4 outputImage;

     // Offset by half the sampler size so the lower-left corner lands at the center.
     result.x = (coordinate.x - samplerSize(src).x / 2.0);
     result.y = (coordinate.y - samplerSize(src).y / 2.0);

     outputImage = unpremultiply(sample(src, result));

     return premultiply(outputImage);
}

The same odd behavior appears when using the working (destination) coordinates instead of the sampler coordinates, but in this case the error already occurs for images of size 2000x1000, while images of size 1000x500 work fine:

kernel vec4 equidistantProjection(sampler src, __color color, vec2 destinationDimensions)
{
     // Coordinate of the pixel being computed, in working (destination) space.
     vec2 coordinate = destCoord();
     vec2 result;
     vec4 outputImage;

     result.x = (coordinate.x - destinationDimensions.x / 2.0);
     result.y = (coordinate.y - destinationDimensions.y / 2.0);

     // Sampling with the raw working-space coordinate was also tried;
     // the samplerTransform variant below is the value actually returned.
     outputImage = unpremultiply(sample(src, result));
     outputImage = unpremultiply(sample(src, samplerTransform(src, result)));

     return premultiply(outputImage);
}

For reference, I have added the following to the Objective-C portion of my filter's - (CIImage *)outputImage method to set the DOD to the entire input image.

- (CIImage *)outputImage
{
    CISampler *src = [CISampler samplerWithImage:inputImage];

    // DOD: the full extent of the input image, as [x, y, width, height].
    NSArray *outputExtent = [NSArray arrayWithObjects:
            [NSNumber numberWithInt:0],
            [NSNumber numberWithInt:0],
            [NSNumber numberWithFloat:[inputImage extent].size.width],
            [NSNumber numberWithFloat:[inputImage extent].size.height], nil];

    return [self apply:filterKernel, src, inputColor, zoom, viewBounds, inputOrigin,
            kCIApplyOptionDefinition, [src definition], kCIApplyOptionExtent, outputExtent, nil];
}

Additionally, I added the following method to set the ROI, which I register in my - (id)init method with [filterKernel setROISelector:@selector(regionOf:destRect:userInfo:)]:

- (CGRect) regionOf:(int)samplerIndex destRect:(CGRect)r userInfo:obj
{
     // The ROI is simply the destination rect.
     return r;
}
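
For completeness, a sketch of the -init that loads the kernel and registers this ROI selector could look roughly like the following (the kernel file name here is just a placeholder; filterKernel is the same ivar used in the apply: call above):

     - (id)init
     {
         if (self = [super init])
         {
             if (filterKernel == nil)
             {
                 // Load the kernel source from the filter's bundle (file name is hypothetical).
                 NSBundle *bundle = [NSBundle bundleForClass:[self class]];
                 NSString *code = [NSString stringWithContentsOfFile:
                         [bundle pathForResource:@"equidistantProjection" ofType:@"cikernel"]
                         encoding:NSUTF8StringEncoding error:nil];
                 filterKernel = [[[CIKernel kernelsWithString:code] objectAtIndex:0] retain];
             }
             // Ask Core Image to call back for the region of interest.
             [filterKernel setROISelector:@selector(regionOf:destRect:userInfo:)];
         }
         return self;
     }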

Any help or advice on this issue would be greatly appreciated. I'm sure that CIFilters can work with larger images as I've used the CIBumpDistortion with greater than 50 megapixel images so I must be doing something wrong. Any ideas?


Answer 1:


Working with Core Image I discovered that it cuts big images into tiles. For example, in your case a 4000 x 2000 image can be split into four 2000 x 1000 tiles and rendered separately. Unfortunately, this optimization trick affects samplerCoord, and some coordinate-dependent filters work incorrectly on big images.

My solution was to use destCoord instead of samplerCoord. Of course, you should keep in mind that an image can be rendered with a non-zero origin, so destCoord values are not necessarily relative to (0, 0). I wrote my own filter, so I was able to pass the whole extent as a vec4 parameter.

Example: try generating an image with a CIFilter kernel, something like this:

float gray = (samplerCoord(src).x / samplerSize(src).x) * (samplerCoord(src).y / samplerSize(src).y);

This should give us black at (0, 0) and white at the opposite corner, i.e. a single smooth gradient, right? However, for big images you'll see a few separate quads instead of one gradient. This happens because of the tiled rendering done by the Core Image engine; I haven't found a way to bypass it, but you can rewrite the kernel this way:

float gray = ((destCoord().x - rect.x) / rect.z) * ((destCoord().y - rect.y) / rect.w);

Here rect is the real extent of the sampler, which you must pass in yourself as a vec4 holding (x, y, width, height). I used [inputImage extent] for this purpose, but it depends on the filter and could be something else in your case.
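
Applied to the translation kernel from the question, this could look roughly like the sketch below; the vec4 extent parameter is an addition here, filled on the Objective-C side from [inputImage extent], for example as a CIVector created with [CIVector vectorWithX:Y:Z:W:] and passed along with the other kernel arguments.

     kernel vec4 equidistantProjection(sampler src, __color color, vec4 extent)
     {
          // extent is assumed to hold (x, y, width, height) of the real input image,
          // so the offset no longer depends on which tile Core Image happens to render.
          vec2 coordinate = destCoord();
          vec2 result;

          result.x = coordinate.x - extent.z / 2.0;
          result.y = coordinate.y - extent.w / 2.0;

          // Map the working-space coordinate into the sampler's space before sampling.
          return premultiply(unpremultiply(sample(src, samplerTransform(src, result))));
     }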

Hope this explanation makes it clear. By the way, the built-in system kernels seem to work just fine even with big images, so you only need to worry about this in your custom kernels.



Source: https://stackoverflow.com/questions/3874833/maximum-image-size-in-cifilter-cikernel
