Android AR orientation


Question


I'm making a program that shows map objects on the camera view, and it works well except for a few degrees to the left and right of vertical orientation (roughly 80-110 and 260-280 degrees). In the remaining ~320 degrees it works correctly. I've tried TYPE_ROTATION_VECTOR as well as the accelerometer combined with the magnetometer, and both give the same result. Does anybody know a solution?

with TYPE_ROTATION_VECTOR:

if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
    // Convert the rotation vector into a rotation matrix, then extract
    // the orientation angles (azimuth, pitch, roll) in radians.
    float[] rotationMatrix = new float[16];
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);

    float[] orientationValues = new float[3];
    SensorManager.getOrientation(rotationMatrix, orientationValues);

    tvHeading.setText(String.format(
            "Orientation (rad): azimuth = %1$.2f, pitch = %2$.2f, roll = %3$.2f",
            orientationValues[0], orientationValues[1], orientationValues[2]));

    double azimuth = Math.toDegrees(orientationValues[0]);
    double pitch = Math.toDegrees(orientationValues[1]);
    double roll = Math.toDegrees(orientationValues[2]);

    tvOrientation.setText(String.format(
            "Orientation (deg): azimuth = %1$.2f, pitch = %2$.2f, roll = %3$.2f",
            azimuth, pitch, roll));
}
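
For context, here is a minimal sketch of how such a listener is typically registered; the field names and delay constant are illustrative, not taken from the original code:

// Registration sketch; mSensorManager and this listener are assumed to live
// in an Activity (or similar) that implements SensorEventListener.
SensorManager mSensorManager =
        (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor rotationVectorSensor =
        mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
if (rotationVectorSensor != null) {
    mSensorManager.registerListener(this, rotationVectorSensor,
            SensorManager.SENSOR_DELAY_GAME);
}
// Remember to unregister the listener in onPause() to save battery.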

with accelerometer + magnetometer:

if (event.sensor == mAccelerometer) {
    System.arraycopy(event.values, 0, mLastAccelerometer, 0, event.values.length);
    // Smooth the raw accelerometer samples and truncate to three decimal places.
    mLastAccelerometer = meanFilterAccelSmoothing.addSamples(mLastAccelerometer);
    mLastAccelerometer = medianFilterAccelSmoothing.addSamples(mLastAccelerometer);
    for (int i = 0; i < mLastAccelerometer.length; i++) {
        mLastAccelerometer[i] = (float) Math.floor(mLastAccelerometer[i] * 1000) / 1000;
    }
    mLastAccelerometerSet = true;
}
if (event.sensor == mMagnetometer) {
    System.arraycopy(event.values, 0, mLastMagnetometer, 0, event.values.length);
    // Same smoothing for the magnetometer samples.
    mLastMagnetometer = meanFilterMagneticSmoothing.addSamples(mLastMagnetometer);
    mLastMagnetometer = medianFilterMagneticSmoothing.addSamples(mLastMagnetometer);
    for (int i = 0; i < mLastMagnetometer.length; i++) {
        mLastMagnetometer[i] = (float) Math.floor(mLastMagnetometer[i] * 1000) / 1000;
    }
    mLastMagnetometerSet = true;
}

if (mLastAccelerometerSet && mLastMagnetometerSet) {
    // Combine accelerometer and magnetometer readings into a rotation matrix,
    // then extract azimuth, pitch and roll (in radians).
    SensorManager.getRotationMatrix(mR, null, mLastAccelerometer, mLastMagnetometer);
    SensorManager.getOrientation(mR, mOrientation);
    if (angeles.size() > 0) {
        for (int i = 0; i < mapObjects.size(); i++) {
            compassFunc(i, mOrientation[0], mOrientation[1], mOrientation[2]);
        }
    }
}

private void compassFunc(int number, float... values) {
    // Despite the names, these are orientation angles converted from radians
    // to degrees (truncated to two decimal places), not angular speeds.
    double angularXSpeed = Math.floor(values[0] * 180 / Math.PI * 100) / 100;
    double angularYSpeed = Math.floor(values[1] * 180 / Math.PI * 100) / 100;
    double angularZSpeed = Math.floor(values[2] * 180 / Math.PI * 100) / 100;

    // The screen-corner coordinates below are computed in code omitted here.
    tvOrientation.setText(String.format(
            "Screen: lt= %1$.2f : %2$.2f, rt= %3$.2f : %4$.2f, lb= %5$.2f : %6$.2f, rb= %7$.2f : %8$.2f",
            xLeftTop, yLeftTop, xRightTop, yRightTop, xLeftBottom, yLeftBottom, xRightBottom, yRightBottom));
}

Answer 1:


This sounds like a typical case of Gimbal Lock. Your description of how rotation around one axis acts up when another reaches +-90 degrees suggests that this is indeed the case.

This is a fundamental problem with Euler angles (yaw/azimuth, pitch, roll), which is why most such computations are done with rotation matrices or quaternions; Euler angles are usually only derived at the very end, when a particular orientation has to be displayed to a human (people are generally bad at interpreting rotation matrices and quaternions).

The ROTATION_VECTOR sensor outputs its data in a quaternion format (source), albeit with rearranged components, and the getRotationMatrixFromVector() method turns this into a rotation matrix. I would suggest using one of these representations for your internal calculations.
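
For illustration, here is a rough sketch of what working directly with the rotation matrix could look like for this use case; the direction vector and the helper that produces it are hypothetical, not taken from the question's code, and it assumes a 3x3 (float[9]) matrix from getRotationMatrixFromVector():

// worldDir: unit vector in world coordinates (X = East, Y = North, Z = Up)
// pointing from the device toward the map object; computing it from the two
// GPS positions is assumed to happen elsewhere (hypothetical helper).
float[] worldDir = computeWorldDirectionToObject();
float[] deviceDir = new float[3];

// Per the Android docs, the matrix from getRotationMatrix*() maps device
// coordinates to world coordinates (v_world = R * v_device), so the
// transpose maps world to device.  Row-major 3x3 indexing.
for (int j = 0; j < 3; j++) {
    deviceDir[j] = rotationMatrix[j]     * worldDir[0]
                 + rotationMatrix[j + 3] * worldDir[1]
                 + rotationMatrix[j + 6] * worldDir[2];
}
// deviceDir can now be projected onto the camera image plane; on most phones
// the rear camera looks roughly along the device's negative Z axis.

No azimuth/pitch/roll decomposition is involved, so there is no singularity near pitch = ±90 degrees.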

The answers to this similar question provide some concrete suggestions on how to solve the issue.
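
Not quoted from those answers, but one adjustment often suggested for camera-based AR, where the device is held upright with the camera pointing at the horizon, is remapping the coordinate system before calling getOrientation(). A sketch, assuming rotationMatrix is the float[16] matrix from the question's first snippet:

// Remap the axes so the orientation is expressed relative to the camera's
// viewing direction instead of the screen lying flat (device held upright).
float[] remapped = new float[16];
SensorManager.remapCoordinateSystem(rotationMatrix,
        SensorManager.AXIS_X, SensorManager.AXIS_Z, remapped);
float[] orientation = new float[3];
SensorManager.getOrientation(remapped, orientation);

Note that remapCoordinateSystem() expects the input and output arrays to have the same length (both 9 or both 16).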



Source: https://stackoverflow.com/questions/30696782/android-ar-orientation
