As I noticed, when CIGaussianBlur is applied to an image, the image's corners get blurred so that it looks smaller than the original. So I figured out that I need to crop it.
To get a nice blurred version of an image with hard edges, you first need to apply a CIAffineClamp to the source image so that its edge pixels are extended outward, and then make sure you use the input image's extent when generating the output image.
The code is as follows:
// Create a Core Image context and load the source image.
CIContext *context = [CIContext contextWithOptions:nil];
UIImage *image = [UIImage imageNamed:@"Flower"];
CIImage *inputImage = [[CIImage alloc] initWithImage:image];

// Clamp the image so its edge pixels extend outward indefinitely.
// This gives the blur real pixels to sample beyond the original borders.
CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[clampFilter setDefaults];
[clampFilter setValue:inputImage forKey:kCIInputImageKey];

// Blur the clamped image.
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setValue:clampFilter.outputImage forKey:kCIInputImageKey];
[blurFilter setValue:@10.0f forKey:kCIInputRadiusKey];

// Render the blurred output, cropping back to the original image's extent
// so the result keeps the source's size and hard edges.
CIImage *result = [blurFilter valueForKey:kCIOutputImageKey];
CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
UIImage *blurredImage = [[UIImage alloc] initWithCGImage:cgImage scale:image.scale orientation:UIImageOrientationUp];
CGImageRelease(cgImage);
Note this code was tested on iOS. It should be similar for OS X (substituting NSImage for UIImage).
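For reference, here is a minimal, untested sketch of what the OS X version might look like, assuming the NSImage is bridged to a CIImage via its TIFFRepresentation; the helper name blurredFlower and the "Flower" asset name are just placeholders:

#import <Cocoa/Cocoa.h>
#import <CoreImage/CoreImage.h>

// Hypothetical helper returning a blurred copy of a bundled image.
NSImage *blurredFlower(void) {
    CIContext *context = [CIContext contextWithOptions:nil];

    // Bridge NSImage -> CIImage via its TIFF representation.
    NSImage *image = [NSImage imageNamed:@"Flower"];
    CIImage *inputImage = [[CIImage alloc] initWithData:[image TIFFRepresentation]];

    // Same clamp-then-blur pipeline as the iOS version above.
    CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
    [clampFilter setDefaults];
    [clampFilter setValue:inputImage forKey:kCIInputImageKey];

    CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blurFilter setValue:clampFilter.outputImage forKey:kCIInputImageKey];
    [blurFilter setValue:@10.0f forKey:kCIInputRadiusKey];

    // Crop back to the original extent and wrap the result in an NSImage.
    CIImage *result = blurFilter.outputImage;
    CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
    NSImage *output = [[NSImage alloc] initWithCGImage:cgImage
                                                  size:NSMakeSize(CGImageGetWidth(cgImage),
                                                                  CGImageGetHeight(cgImage))];
    CGImageRelease(cgImage);
    return output;
}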