Using CIEdgeWork Filters in iOS


Question


I am using Core Image filters and trying to apply the CIEdgeWork filter. When the filter is applied, the image turns black. Am I initializing the CIFilter correctly?

 CIFilter *edgeWork = [CIFilter filterWithName:@"CIEdgeWork"
                                 keysAndValues:kCIInputImageKey, filterPreviewImage,
                                               @"inputRadius", [NSNumber numberWithFloat:3.0],
                                               nil];

Answer 1:


CIEdgeWork is not available in Core Image on iOS as of iOS 5.x, so it's no surprise that you're seeing a black image when trying to use it.

However, you can use the GPUImageSketchFilter or GPUImageThresholdEdgeDetection from my GPUImage framework to pull off this same effect. You can see the result of the first filter in this answer. The latter filter might be closer to the actual effect that Apple supplies via CIEdgeWork, given that they seem to binarize the resulting edge detected image.
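
If you want to confirm at runtime which filters a given iOS release actually ships, a minimal sketch that simply logs the built-in filter names (only the @"CIEdgeWork" check below is specific to this question) could look like this:

    #import <CoreImage/CoreImage.h>

    // Log every built-in Core Image filter registered on this OS version,
    // so you can see whether @"CIEdgeWork" is present at all.
    NSArray *filterNames = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
    for (NSString *name in filterNames) {
        NSLog(@"%@", name);
    }

    // filterWithName: returns nil for unregistered filter names,
    // which is another quick availability check.
    if ([CIFilter filterWithName:@"CIEdgeWork"] == nil) {
        NSLog(@"CIEdgeWork is not available on this iOS version");
    }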




Answer 2:


CIEdgeWork and CILineOverlay are now available as of iOS 9:

CIEdgeWork
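
For reference, a minimal sketch of applying CIEdgeWork on iOS 9 or later (sourceImage and the radius value are placeholders for illustration):

    CIImage *inputImage = [CIImage imageWithCGImage:sourceImage.CGImage];

    CIFilter *edgeWork = [CIFilter filterWithName:@"CIEdgeWork"];
    [edgeWork setValue:inputImage forKey:kCIInputImageKey];
    [edgeWork setValue:@3.0 forKey:@"inputRadius"];

    // Render the output; in production code, create the CIContext once and reuse it.
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *outputImage = edgeWork.outputImage;
    CGImageRef cgImage = [context createCGImage:outputImage fromRect:outputImage.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);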

You can also use a Core Image Sobel sketch filter based on GPUImageSketchFilter: FWKSketchFilter.

Its kernel:

kernel vec4 sketch(sampler image, float strength)
{
    vec2 d = destCoord();

    // Sampler coordinates of the eight neighbouring pixels.
    vec2 bottomLeftTextureCoordinate = samplerTransform(image, d + vec2(-1.0, -1.0));
    vec2 topRightTextureCoordinate = samplerTransform(image, d + vec2(1.0, 1.0));
    vec2 topLeftTextureCoordinate = samplerTransform(image, d + vec2(-1.0, 1.0));
    vec2 bottomRightTextureCoordinate = samplerTransform(image, d + vec2(1.0, -1.0));

    vec2 leftTextureCoordinate = samplerTransform(image, d + vec2(-1.0, 0.0));
    vec2 rightTextureCoordinate = samplerTransform(image, d + vec2(1.0, 0.0));
    vec2 bottomTextureCoordinate = samplerTransform(image, d + vec2(0.0, -1.0));
    vec2 topTextureCoordinate = samplerTransform(image, d + vec2(0.0, 1.0));

    // Intensity of each neighbour, approximated by its red channel.
    float bottomLeftIntensity = sample(image, bottomLeftTextureCoordinate).r;
    float topRightIntensity = sample(image, topRightTextureCoordinate).r;
    float topLeftIntensity = sample(image, topLeftTextureCoordinate).r;
    float bottomRightIntensity = sample(image, bottomRightTextureCoordinate).r;

    float leftIntensity = sample(image, leftTextureCoordinate).r;
    float rightIntensity = sample(image, rightTextureCoordinate).r;
    float bottomIntensity = sample(image, bottomTextureCoordinate).r;
    float topIntensity = sample(image, topTextureCoordinate).r;

    // Sobel gradient components.
    float h = -topLeftIntensity - 2.0 * topIntensity - topRightIntensity + bottomLeftIntensity + 2.0 * bottomIntensity + bottomRightIntensity;
    float v = -bottomLeftIntensity - 2.0 * leftIntensity - topLeftIntensity + bottomRightIntensity + 2.0 * rightIntensity + topRightIntensity;

    // Invert the edge magnitude so edges come out dark on a light background.
    float mag = 1.0 - (length(vec2(h, v)) * strength);

    return vec4(vec3(mag), 1.0);
}
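
As a rough sketch, a kernel string like this can be wrapped with the pre-iOS 11 CIKernel string API (kernelSource and inputImage are assumed placeholders for the source above and the image to filter):

    // Build the kernel from the CIKernel Language source shown above.
    CIKernel *sketchKernel = [CIKernel kernelWithString:kernelSource];

    CIImage *outputImage =
        [sketchKernel applyWithExtent:inputImage.extent
                          roiCallback:^CGRect(int index, CGRect destRect) {
                              // Each output pixel reads a 3x3 neighbourhood,
                              // so the region of interest grows by one pixel.
                              return CGRectInset(destRect, -1.0, -1.0);
                          }
                            arguments:@[inputImage, @1.0]];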


Source: https://stackoverflow.com/questions/11820520/using-ciedgework-filters-in-ios
