Acceleration from device's coordinate system into absolute coordinate system

梦谈多话 2020-12-02 06:32

From my Android device I can read an array of linear acceleration values (in the device's coordinate system) and an array of absolute orientation values (in Earth's coordinate system). How can I transform the linear acceleration values from the device's coordinate system into the absolute (Earth's) coordinate system?

4 Answers
  • 2020-12-02 06:52

    According to the documentation, you get the linear acceleration in the phone's coordinate system.

    You can transform any vector from the phone's coordinate system to the Earth's coordinate system by multiplying it with the rotation matrix. You can get the rotation matrix from getRotationMatrix().

    (Perhaps there is already a function that does this multiplication for you, but I don't do Android programming and I'm not familiar with its API.)

    A nice tutorial on the rotation matrix is the Direction Cosine Matrix IMU: Theory manuscript. Good luck!
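
    As a minimal sketch of that multiplication (assuming R is a row-major 3x3 rotation matrix, e.g. a float[9] filled by getRotationMatrix(), and v is a vector in the phone's coordinate system; the helper name is made up):

        // Multiply a row-major 3x3 rotation matrix by a phone-frame vector to get
        // the same vector expressed in Earth's coordinate system.
        static float[] toEarthFrame(float[] R, float[] v) {
            return new float[] {
                R[0] * v[0] + R[1] * v[1] + R[2] * v[2],
                R[3] * v[0] + R[4] * v[1] + R[5] * v[2],
                R[6] * v[0] + R[7] * v[1] + R[8] * v[2]
            };
        }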

  • 2020-12-02 06:53

    Based on @alex's answer, here is the code snippet:

        private float[] gravityValues = null;
        private float[] magneticValues = null;

        @Override
        public void onSensorChanged(SensorEvent event) {
            if ((gravityValues != null) && (magneticValues != null)
                    && (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)) {

                // Pad the reading to a 4-component vector so it can be used with
                // the 4x4 matrix routines in android.opengl.Matrix.
                float[] deviceRelativeAcceleration = new float[4];
                deviceRelativeAcceleration[0] = event.values[0];
                deviceRelativeAcceleration[1] = event.values[1];
                deviceRelativeAcceleration[2] = event.values[2];
                deviceRelativeAcceleration[3] = 0;

                // Change the device-relative acceleration values to Earth-relative values:
                // X axis -> East
                // Y axis -> North Pole
                // Z axis -> Sky

                float[] R = new float[16];
                float[] I = new float[16];
                float[] earthAcc = new float[4];

                SensorManager.getRotationMatrix(R, I, gravityValues, magneticValues);

                float[] inv = new float[16];
                android.opengl.Matrix.invertM(inv, 0, R, 0);
                android.opengl.Matrix.multiplyMV(earthAcc, 0, inv, 0, deviceRelativeAcceleration, 0);

                Log.d("Acceleration", "Values: (" + earthAcc[0] + ", " + earthAcc[1] + ", " + earthAcc[2] + ")");

            } else if (event.sensor.getType() == Sensor.TYPE_GRAVITY) {
                // Copy the array: the system may reuse the event's values buffer.
                gravityValues = event.values.clone();
            } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
                magneticValues = event.values.clone();
            }
        }
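
    To feed this callback, the three sensors have to be registered somewhere, for example in onCreate(). A minimal sketch, assuming the enclosing activity implements SensorEventListener (the variable names are illustrative):

        SensorManager sensorManager =
                (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY),
                SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                SensorManager.SENSOR_DELAY_GAME);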
    
  • 2020-12-02 07:00

    I finally managed to solve it! To get the acceleration vector in Earth's coordinate system you need to:

    1. get the rotation matrix (as a float[16], so it can be used later with the android.opengl.Matrix class) from SensorManager.getRotationMatrix() (using Sensor.TYPE_GRAVITY and Sensor.TYPE_MAGNETIC_FIELD sensor values as parameters),
    2. use android.opengl.Matrix.invertM() on the rotation matrix to invert it (not transpose!),
    3. use the Sensor.TYPE_LINEAR_ACCELERATION sensor to get the linear acceleration vector (in the device's coordinate system),
    4. use android.opengl.Matrix.multiplyMV() to multiply the inverted rotation matrix by the linear acceleration vector (see the condensed sketch after this list).
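
    A condensed sketch of these four steps (assuming gravity, geomagnetic and linearAcc are fields holding the latest readings from the three sensors, with linearAcc padded to a float[4]; the names are illustrative):

        float[] rotation = new float[16];   // step 1: rotation matrix as a float[16]
        float[] inverted = new float[16];   // step 2: its inverse
        float[] earthAcc = new float[4];    // step 4: acceleration in Earth's coordinates

        if (SensorManager.getRotationMatrix(rotation, null, gravity, geomagnetic)) {
            android.opengl.Matrix.invertM(inverted, 0, rotation, 0);                    // step 2
            android.opengl.Matrix.multiplyMV(earthAcc, 0, inverted, 0, linearAcc, 0);   // steps 3 and 4
        }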

    And there you have it! I hope this saves some precious time for others.

    Thanks to Edward Falk and Ali for the hints!

  • 2020-12-02 07:00

    OK, first of all, if you're trying to do actual inertial navigation on Android, you've got your work cut out for you. The cheap little sensors used in smartphones are just not precise enough. That said, there has been some interesting work done on inertial navigation over small distances, such as inside a building. There are probably papers on the subject you can dig up. Google "Motion Interface Developers Conference" and you might find something useful -- that's a conference that Invensense put on a couple of months ago.

    Second, no, linear acceleration is in device coordinates, not world coordinates. You'll have to convert it yourself, which means knowing the device's 3-D orientation.

    What you want to do is use a version of Android that supports the virtual sensors TYPE_GRAVITY and TYPE_LINEAR_ACCELERATION. You'll need a device with gyros to get reasonably accurate and precise readings.

    Internally, the system combines gyros, accelerometers, and magnetometers to come up with true values for the device orientation. This effectively splits the accelerometer signal into its gravity and linear-acceleration components.

    So what you want to do is set up sensor listeners for TYPE_GRAVITY, TYPE_LINEAR_ACCELERATION, and TYPE_MAGNETIC_FIELD. Use the gravity and magnetometer data as inputs to SensorManager.getRotationMatrix() to get the rotation matrix that will transform world coordinates into device coordinates or vice versa. In this case, you'll want the "versa" part: convert the linear acceleration readings to world coordinates by multiplying them by the transpose of the rotation matrix, as sketched below.
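
    A minimal sketch of that last step, assuming rotation is the float[16] returned by SensorManager.getRotationMatrix() and deviceAcc is the TYPE_LINEAR_ACCELERATION reading padded to a float[4] (both names are illustrative):

        float[] transposed = new float[16];
        float[] worldAcc = new float[4];
        // For a pure rotation matrix the transpose equals the inverse, so this maps
        // the device-frame acceleration into world coordinates.
        android.opengl.Matrix.transposeM(transposed, 0, rotation, 0);
        android.opengl.Matrix.multiplyMV(worldAcc, 0, transposed, 0, deviceAcc, 0);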
