How to apply iOS VNImageHomographicAlignmentObservation warpTransform?


Question


I'm testing Apple's Vision alignment API and have questions regarding VNHomographicImageRegistrationRequest. Has anyone gotten it to work? I can get the warpTransform out of it, but I have yet to see a matrix that makes sense, meaning I'm unable to get a result that warps the image back onto the source image. I'm using OpenCV's warpPerspective to handle the warping.

I'm calling this to get the transform:

import Vision
import CoreGraphics
import simd

class func homography(_ cgImage0: CGImage, _ cgImage1: CGImage, _ orientation: CGImagePropertyOrientation, completion: (matrix_float3x3?) -> ())
{
    let registrationSequenceReqHandler = VNSequenceRequestHandler()

    // cgImage1 is the floating image to be aligned onto cgImage0, the reference.
    let requestHomography = VNHomographicImageRegistrationRequest(targetedCGImage: cgImage1, orientation: orientation)
    let requestTranslation = VNTranslationalImageRegistrationRequest(targetedCGImage: cgImage1, orientation: orientation)

    do
    {
        // Run both registration requests against the reference image.
        try registrationSequenceReqHandler.perform([requestHomography, requestTranslation], on: cgImage0)

        if let resultH = requestHomography.results?.first as? VNImageHomographicAlignmentObservation
        {
            completion(resultH.warpTransform)
        }

        if let resultT = requestTranslation.results?.first as? VNImageTranslationAlignmentObservation
        {
            print("translation : \(resultT.alignmentTransform.tx) : \(resultT.alignmentTransform.ty)")
        }
    }
    catch
    {
        completion(nil)
        print("bad")
    }
}
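
For reference, a call site might look like the sketch below. ImageAligner, leftImage, and rightImage are placeholder names (not from the original question); the two CGImages are assumed to show the same scene from slightly different viewpoints:

// Hypothetical call site; names are placeholders.
ImageAligner.homography(leftImage, rightImage, .up) { matrix in
    guard let warp = matrix else {
        print("registration failed")
        return
    }
    // Per the question, the intent is to feed this 3x3 matrix to OpenCV's
    // warpPerspective to map cgImage1 back onto cgImage0.
    print("warpTransform: \(warp)")
}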

This works and outputs a homography matrix, but its results are drastically different from what I get when I do SIFT + OpenCV findHomography (https://docs.opencv.org/3.0-beta/doc/tutorials/features2d/feature_homography/feature_homography.html).

Regardless of the image pairs I use, I'm unable to get reasonable homography results out of Apple's Vision request.

Thanks in advance,


Answer 1:


For future reference, I was able to reconcile Apple's homography matrix with OpenCV's. The key difference is that Core Image's origin is the bottom-left corner of the image, while OpenCV's origin is the top-left corner. To convert Core Image's homography matrix into OpenCV coordinates, apply the following transform:

H_opencv = Q * H_core_image * Q

where Q = [1 0 0; 0 -1 image.height; 0 0 1]
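
In Swift, that conversion can be sketched with simd as follows. This is a minimal sketch, not from the original answer: convertToOpenCVCoordinates and imageHeight are hypothetical names, and it assumes the matrices act on column vectors (x, y, 1). Since Q is its own inverse, the same conjugation converts in either direction:

import simd

// Minimal sketch of the answer's formula: H_opencv = Q * H_core_image * Q.
// `imageHeight` is assumed to be the pixel height of the image the
// homography was computed on.
func convertToOpenCVCoordinates(_ h: matrix_float3x3, imageHeight: Float) -> matrix_float3x3 {
    // Q flips the y axis: (x, y, 1) -> (x, imageHeight - y, 1).
    // Row-major it is [1 0 0; 0 -1 imageHeight; 0 0 1]; simd stores columns.
    let q = matrix_float3x3(columns: (
        SIMD3<Float>(1,  0, 0),
        SIMD3<Float>(0, -1, 0),
        SIMD3<Float>(0, imageHeight, 1)
    ))
    // Conjugating by Q moves the homography between the bottom-left-origin
    // (Core Image) and top-left-origin (OpenCV) coordinate systems.
    return q * h * q
}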


Source: https://stackoverflow.com/questions/52805400/how-to-apply-ios-vnimagehomographicalignmentobservation-warptransform
