CoreImage coordinate system


Question


I have a CVPixelBufferRef from an AVAsset. I'm trying to apply a CIFilter to it, using these lines:

CVPixelBufferRef pixelBuffer = ...
CVPixelBufferRef newPixelBuffer = // empty pixel buffer to fill
CIContext *context = // CIContext created from EAGLContext
CGAffineTransform preferredTransform = // AVAsset track preferred transform
CIImage *phase1 = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIImage *phase2 = [phase1 imageByApplyingTransform:preferredTransform];
CIImage *phase3 = [self applyFiltersToImage:phase2];

[context render:phase3 toCVPixelBuffer:newPixelBuffer bounds:phase3.extent colorSpace:CGColorSpaceCreateDeviceRGB()];

Unfortunately, the result I get has an incorrect orientation. For example, a video captured in portrait mode comes out upside down. I suspect the problem is in the conversion from the AVAsset coordinate system to the Core Image coordinate system (previewing phase2 in Xcode also shows an incorrect result). How can I fix it?


Answer 1:


I solved it by doing this; it should orient everything correctly in the Core Image coordinate space:

// Flip the rotation part (b and c) of the track's preferred transform so it
// matches Core Image's bottom-left-origin coordinate system.
var preferredTransform = inst.preferredTransform
preferredTransform.b *= -1
preferredTransform.c *= -1

var outputImage = CIImage(cvPixelBuffer: videoFrameBuffer)
    .applying(preferredTransform)

// Translate the image back so its extent starts at (0, 0).
outputImage = outputImage.applying(CGAffineTransform(translationX: -outputImage.extent.origin.x,
                                                     y: -outputImage.extent.origin.y))
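
For reference, the same adjustment can be adapted to the Objective-C pipeline from the question. This is a minimal sketch under the question's setup; the variable names (videoTrack, pixelBuffer, newPixelBuffer, context, applyFiltersToImage:) are placeholders matching the question, not a complete implementation:

// Adjust the track's preferred transform for Core Image's bottom-left-origin space.
CGAffineTransform transform = videoTrack.preferredTransform; // AVAssetTrack of the asset
transform.b *= -1;
transform.c *= -1;

CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
image = [image imageByApplyingTransform:transform];

// Translate the result so its extent's origin is back at (0, 0) before filtering/rendering.
CGAffineTransform recenter = CGAffineTransformMakeTranslation(-image.extent.origin.x,
                                                              -image.extent.origin.y);
image = [image imageByApplyingTransform:recenter];

CIImage *filtered = [self applyFiltersToImage:image];
[context render:filtered
 toCVPixelBuffer:newPixelBuffer
          bounds:filtered.extent
      colorSpace:CGColorSpaceCreateDeviceRGB()];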


Source: https://stackoverflow.com/questions/29967700/coreimage-coordinate-system
