How do I use the Microsoft Scene Understanding SDK and HoloLens 2 to align the Unity scene to the player's physical room? [closed]

Submitted by 一曲冷凌霜 on 2020-12-12 12:30:21

Question


When the player loads into a Unity scene on the HoloLens 2, the Unity floor plane does not match the physical floor. With the HoloLens 2 and MRTK, the Unity scene origin is locked to the player's head at (0, 0, 0).

I am trying to use the Microsoft Scene Understanding SDK to set the Unity scene environment's Y position to the floor of the physical room. I can currently access the floor scene object (see the sketch below), but when I get to the SpatialCoordinateSystem portion, I am unable to use the .ToUnity() method to convert the 4x4 matrix. I had to change the Vector3 .ToUnity() call to .ToUnityVector3(), but I cannot find a similar method for the matrix.
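For reference, here is a minimal sketch of the kind of query I am running to get the floor object; the search radius, query settings, and the "largest quad" selection are placeholders rather than anything specific to my project:

using System.Linq;
using System.Threading.Tasks;
using Microsoft.MixedReality.SceneUnderstanding;

public static class FloorQuerySketch
{
    // Queries the Scene Understanding runtime and returns the largest floor SceneObject, if any.
    public static async Task<SceneObject> GetFloorAsync(float searchRadiusMeters = 5.0f)
    {
        var querySettings = new SceneQuerySettings
        {
            EnableSceneObjectQuads = true,          // planar (quad) representation of surfaces
            EnableSceneObjectMeshes = false,
            EnableOnlyObservedSceneObjects = false,
            EnableWorldMesh = false
        };

        Scene scene = await SceneObserver.ComputeAsync(querySettings, searchRadiusMeters);

        // Pick the floor with the largest quad, in case the scene contains several floor objects.
        return scene.SceneObjects
            .Where(so => so.Kind == SceneObjectKind.Floor)
            .OrderByDescending(so => so.Quads.Count > 0 ? so.Quads[0].Extents.X * so.Quads[0].Extents.Y : 0f)
            .FirstOrDefault();
    }
}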

Are the .ToUnity() methods mentioned in this documentation deprecated? Am I missing a reference to something? (see images for references)

I greatly appreciate any assistance here, either with this specific issue or with the overall challenge of aligning a Unity scene to a HoloLens 2 user's physical environment.

I'm following the information provided here: https://docs.microsoft.com/en-us/windows/mixed-reality/develop/platform-capabilities-and-apis/scene-understanding-sdk


Answer 1:


Compared with the documentation, you missed the Value property and invoked ToUnity() directly on the sceneToWorld object, which is likely the cause of the issue.

var sceneToWorldUnity = sceneToWorld.ToUnity();

=>

var sceneToWorldUnity = sceneToWorld.Value.ToUnity();
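For context, TryGetTransformTo returns a nullable System.Numerics.Matrix4x4?, so the underlying matrix has to be unwrapped through Value (ideally after checking HasValue). A minimal sketch, with the coordinate-system variable names assumed rather than taken from your code:

using Windows.Perception.Spatial;

// sceneOriginCoordinateSystem: coordinate system of the Scene Understanding scene origin
// unityCoordinateSystem: coordinate system backing Unity's world origin
System.Numerics.Matrix4x4? sceneToWorld =
    sceneOriginCoordinateSystem.TryGetTransformTo(unityCoordinateSystem);

if (sceneToWorld.HasValue)
{
    // Value unwraps the nullable; ToUnity() is the extension method shown in the documentation.
    UnityEngine.Matrix4x4 sceneToWorldUnity = sceneToWorld.Value.ToUnity();
}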



Answer 2:


Hernando - MSFT's suggestion fixed the missing ToUnity() references for me. The challenge of aligning all the scene holograms to the floor still remains, but this is progress! Thank you so much, Hernando!

I basically just had to add .Value to access the matrix data and then add this conversion extension class:

using UnityEngine; // needed for the unqualified Vector4 used in the matrix conversion

namespace NumericsConversion
{
    // Converts between System.Numerics (right-handed) and UnityEngine (left-handed) types
    // by negating the Z-related components.
    public static class NumericsConversionExtensions
    {
        public static UnityEngine.Vector3 ToUnity(this System.Numerics.Vector3 v) => new UnityEngine.Vector3(v.X, v.Y, -v.Z);
        public static UnityEngine.Quaternion ToUnity(this System.Numerics.Quaternion q) => new UnityEngine.Quaternion(-q.X, -q.Y, q.Z, q.W);
        public static UnityEngine.Matrix4x4 ToUnity(this System.Numerics.Matrix4x4 m) => new UnityEngine.Matrix4x4(
            new Vector4(m.M11, m.M12, -m.M13, m.M14),
            new Vector4(m.M21, m.M22, -m.M23, m.M24),
            new Vector4(-m.M31, -m.M32, m.M33, -m.M34),
            new Vector4(m.M41, m.M42, -m.M43, m.M44));

        public static System.Numerics.Vector3 ToSystem(this UnityEngine.Vector3 v) => new System.Numerics.Vector3(v.x, v.y, -v.z);
        public static System.Numerics.Quaternion ToSystem(this UnityEngine.Quaternion q) => new System.Numerics.Quaternion(-q.x, -q.y, q.z, q.w);
        public static System.Numerics.Matrix4x4 ToSystem(this UnityEngine.Matrix4x4 m) => new System.Numerics.Matrix4x4(
            m.m00, m.m10, -m.m20, m.m30,
            m.m01, m.m11, -m.m21, m.m31,
           -m.m02, -m.m12, m.m22, -m.m32,
            m.m03, m.m13, -m.m23, m.m33);
    }
}
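With that in place, the converted transform can be used to read the floor's height in Unity world space and shift an environment root so that Unity's Y = 0 sits on the physical floor. A rough sketch, where floorObject, sceneToWorldUnity, and environmentRoot are placeholder names rather than anything from my actual project:

// floorObject: the SceneUnderstanding SceneObject of kind Floor
// sceneToWorldUnity: the UnityEngine.Matrix4x4 obtained via sceneToWorld.Value.ToUnity()
// environmentRoot: a GameObject that parents all the scene holograms
UnityEngine.Vector3 floorPositionUnity =
    sceneToWorldUnity.MultiplyPoint(floorObject.Position.ToUnity());

// Lower (or raise) the whole environment so its floor plane meets the physical floor.
UnityEngine.Vector3 rootPosition = environmentRoot.transform.position;
environmentRoot.transform.position =
    new UnityEngine.Vector3(rootPosition.x, floorPositionUnity.y, rootPosition.z);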

I also tried dropping a cube with a Rigidbody and collider at runtime to find the Y value of the spatial-mapping floor. That actually works relatively well, but it's definitely not the most precise solution.
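For anyone curious, that workaround looks roughly like this; it's only a sketch and assumes the spatial mesh has colliders so the cube has something to land on:

using System.Collections;
using UnityEngine;

public class FloorProbe : MonoBehaviour
{
    // Drops a small physics cube in front of the camera and reports the Y where it comes to rest.
    public IEnumerator ProbeFloorHeight(System.Action<float> onFloorY)
    {
        GameObject probe = GameObject.CreatePrimitive(PrimitiveType.Cube);
        probe.transform.localScale = Vector3.one * 0.05f;
        probe.transform.position = Camera.main.transform.position + Camera.main.transform.forward * 0.5f;
        Rigidbody body = probe.AddComponent<Rigidbody>();

        // Wait until the physics engine puts the cube to sleep, i.e. it has settled on the spatial mesh.
        yield return new WaitUntil(() => body.IsSleeping());

        onFloorY(probe.transform.position.y);
        Destroy(probe);
    }
}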



Source: https://stackoverflow.com/questions/64937974/how-do-i-use-the-microsoft-scene-understanding-sdk-and-hololens2-to-align-the-un
