Transforming accelerometer data from device coordinates to real-world coordinates


OK, I have worked this out mathematically myself, so please bear with me.

If you want to translate an acceleration vector accelerometervalues into an acceleration vector trueacceleration expressed in real-world coordinates, then once you have azimuth, pitch and roll stored in an orientationvalues vector, just do the following:

    // Per SensorManager.getOrientation: orientationvalues[0] = azimuth,
    // orientationvalues[1] = pitch, orientationvalues[2] = roll (all in radians).
    double cosAzimuth = Math.cos(orientationvalues[0]), sinAzimuth = Math.sin(orientationvalues[0]);
    double cosPitch = Math.cos(orientationvalues[1]), sinPitch = Math.sin(orientationvalues[1]);
    double cosRoll = Math.cos(orientationvalues[2]), sinRoll = Math.sin(orientationvalues[2]);

    trueacceleration[0] = (float) (accelerometervalues[0] * (cosRoll * cosAzimuth + sinRoll * sinPitch * sinAzimuth)
            + accelerometervalues[1] * (cosPitch * sinAzimuth)
            + accelerometervalues[2] * (-sinRoll * cosAzimuth + cosRoll * sinPitch * sinAzimuth));
    trueacceleration[1] = (float) (accelerometervalues[0] * (-cosRoll * sinAzimuth + sinRoll * sinPitch * cosAzimuth)
            + accelerometervalues[1] * (cosPitch * cosAzimuth)
            + accelerometervalues[2] * (sinRoll * sinAzimuth + cosRoll * sinPitch * cosAzimuth));
    trueacceleration[2] = (float) (accelerometervalues[0] * (sinRoll * cosPitch)
            + accelerometervalues[1] * (-sinPitch)
            + accelerometervalues[2] * (cosRoll * cosPitch));
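For reference, here is a minimal sketch (my assumption, not part of the original answer) of how the orientationvalues vector (azimuth, pitch, roll, in radians) can be obtained on Android, using the raw accelerometer reading as an approximation of gravity together with a magnetometer reading (the magneticvalues name is illustrative):

    // accelerometervalues and magneticvalues are assumed to hold the latest
    // TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD readings, respectively.
    float[] rotation = new float[9];
    float[] inclination = new float[9];
    float[] orientationvalues = new float[3];
    if (SensorManager.getRotationMatrix(rotation, inclination, accelerometervalues, magneticvalues)) {
        // orientationvalues[0] = azimuth, [1] = pitch, [2] = roll (radians)
        SensorManager.getOrientation(rotation, orientationvalues);
    }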

You need to know the reference coordinate system that also gives you the orientation of your device within 'real' world coordinates. Without that information, it looks impossible to transform your data into anything useful.

For example, does your device have a 'directional' sensor that would help make sense of the accelerometer data (a gyro and compass, for example)?
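If it helps, a quick sketch of checking for those sensors on Android (assuming a SensorManager obtained via getSystemService; the sensorManager name is illustrative):

    // getDefaultSensor returns null when the device has no such sensor.
    boolean hasGyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE) != null;
    boolean hasCompass = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD) != null;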

I am dealing with the same problem. What you can do is, since you already have the R[] rotation matrix, multiply it by your acceleration vector and voilà.

float[] resultVec = new float[4];
// R must be a 4x4 (length-16) rotation matrix for android.opengl.Matrix,
// e.g. as filled by SensorManager.getRotationMatrix with a length-16 output array.
android.opengl.Matrix.multiplyMV(resultVec, 0, R, 0, accelerometervalues, 0);

PS: accelerometervalues must be a 4-element vector; just set the last element to 0.
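Putting it together, a rough sketch under the assumption that R is requested as a length-16 (4x4) matrix so it can be used with android.opengl.Matrix (gravityReading, geomagneticReading and accX/accY/accZ are illustrative names, not from the original answer):

    float[] R = new float[16];
    float[] I = new float[16];
    // getRotationMatrix also accepts length-16 output arrays.
    SensorManager.getRotationMatrix(R, I, gravityReading, geomagneticReading);

    float[] accelerometervalues = {accX, accY, accZ, 0};  // padded to 4 elements
    float[] resultVec = new float[4];
    android.opengl.Matrix.multiplyMV(resultVec, 0, R, 0, accelerometervalues, 0);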

Try this, it's working for me:

private float[] gravityValues = null;
private float[] magneticValues = null;
private SensorManager mSensorManager = null;

private void registerSensorListener(Context context) {
        mSensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        mSensorManager.registerListener(this,
                mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_FASTEST);

        mSensorManager.registerListener(this,
                mSensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
                SensorManager.SENSOR_DELAY_FASTEST);

        mSensorManager.registerListener(this,
                mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                SensorManager.SENSOR_DELAY_FASTEST);

        mSensorManager.registerListener(this,
                mSensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY),
                SensorManager.SENSOR_DELAY_FASTEST);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if ((gravityValues != null) && (magneticValues != null)
                && (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)) {

            float[] deviceRelativeAcceleration = new float[4];
            deviceRelativeAcceleration[0] = event.values[0];
            deviceRelativeAcceleration[1] = event.values[1];
            deviceRelativeAcceleration[2] = event.values[2];
            deviceRelativeAcceleration[3] = 0;

            Log.d("Raw Acceleration::","Values: (" + event.values[0] + ", " + event.values[1] + ", " + event.values[2] + ")");

            // Change the device relative acceleration values to earth relative values
            // X axis -> East
            // Y axis -> North Pole
            // Z axis -> Sky

            float[] R = new float[16], I = new float[16], earthAcc = new float[16];

            SensorManager.getRotationMatrix(R, I, gravityValues, magneticValues);

            float[] inv = new float[16];

            android.opengl.Matrix.invertM(inv, 0, R, 0);
            android.opengl.Matrix.multiplyMV(earthAcc, 0, inv, 0, deviceRelativeAcceleration, 0);
            Log.d("Earth Acceleration", "Values: (" + earthAcc[0] + ", " + earthAcc[1] + ", " + earthAcc[2] + ")");

        } else if (event.sensor.getType() == Sensor.TYPE_GRAVITY) {
            // Copy the values; the framework may reuse the event's array.
            gravityValues = event.values.clone();
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            magneticValues = event.values.clone();
        }
    }
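For completeness, a rough sketch of how this listener could be wired into an Activity; apart from the SensorManager/SensorEventListener APIs, the class name and lifecycle choices below are my assumptions, not part of the original answer:

    // imports assumed: android.app.Activity, android.hardware.*, android.os.Bundle
    public class EarthAccelerationActivity extends Activity implements SensorEventListener {

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            registerSensorListener(this);
        }

        @Override
        protected void onPause() {
            super.onPause();
            // Stop listening when not visible, to save battery.
            mSensorManager.unregisterListener(this);
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Required by SensorEventListener; nothing to do here.
        }

        // registerSensorListener(...), onSensorChanged(...) and the fields are as shown above.
    }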

This is what I used to map accelerometer data from the local (mobile) frame of reference to the Earth frame of reference, to remove the dependency on device orientation. In the Earth frame the Z-axis points towards the sky and should read roughly 9.81 m/s^2. One phenomenon that I couldn't understand: when I put the phone on a revolving chair in any orientation and rotate it at constant speed, the XEarth and YEarth values show the rotation with a 90-degree phase shift and oscillate like sine/cosine waves, which I assume correspond to the North and East axes.

public void onSensorChanged(SensorEvent event) {

    switch (event.sensor.getType()) {

        case Sensor.TYPE_ACCELEROMETER:
            System.arraycopy(event.values, 0, accel, 0, 3);
            // Get the quaternion representation of the accelerometer data
            SensorManager.getQuaternionFromVector(quatA, event.values);
            q1.w = quatA[0]; q1.x = quatA[1]; q1.y = quatA[2]; q1.z = quatA[3];
            break;

        case Sensor.TYPE_ROTATION_VECTOR:
            SensorManager.getRotationMatrixFromVector(rotationMatrix1, event.values);
            System.arraycopy(event.values, 0, rotationVector, 0, 3);
            SensorManager.getQuaternionFromVector(quat, event.values);
            q2.w = quat[0]; q2.x = quat[1]; q2.y = quat[2]; q2.z = quat[3];
            rotationMatrix2 = getRotationMatrixFromQuaternion(q2);
            // You can use rotationMatrix1 or rotationMatrix2
            // Accelerometer data rotated into the Earth frame of reference:
            // rotationResult[0], rotationResult[1], rotationResult[2]
            rotationResult = matrixMultiplication(accel, rotationMatrix2);
            break;
    }
}

    private float[] getRotationMatrixFromQuaternion(Quaternion q22) {
        // Build a 3x3 row-major rotation matrix from a (unit) quaternion.
        float[] q = new float[4];
        float[] result = new float[9];
        q[0] = q22.w;
        q[1] = q22.x;
        q[2] = q22.y;
        q[3] = q22.z;

        result[0] = q[0]*q[0] + q[1]*q[1] - q[2]*q[2] - q[3]*q[3];
        result[1] = 2 * (q[1]*q[2] - q[0]*q[3]);
        result[2] = 2 * (q[1]*q[3] + q[0]*q[2]);

        result[3] = 2 * (q[1]*q[2] + q[0]*q[3]);
        result[4] = q[0]*q[0] - q[1]*q[1] + q[2]*q[2] - q[3]*q[3];
        result[5] = 2 * (q[2]*q[3] - q[0]*q[1]);

        result[6] = 2 * (q[1]*q[3] - q[0]*q[2]);
        result[7] = 2 * (q[2]*q[3] + q[0]*q[1]);
        result[8] = q[0]*q[0] - q[1]*q[1] - q[2]*q[2] + q[3]*q[3];

        return result;
    }

    private float[] matrixMultiplication(float[] A, float[] B) {
        // Multiply the 3x3 row-major matrix B by the 3-vector A (result = B * A).
        float[] result = new float[3];

        result[0] = A[0] * B[0] + A[1] * B[1] + A[2] * B[2];
        result[1] = A[0] * B[3] + A[1] * B[4] + A[2] * B[5];
        result[2] = A[0] * B[6] + A[1] * B[7] + A[2] * B[8];

        return result;
    }
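The snippet above references several fields and a Quaternion type that are not shown. A minimal sketch of the declarations it appears to assume (names come from the code above; the array sizes and the Quaternion holder are my assumptions):

    private final float[] accel = new float[3];
    private final float[] rotationVector = new float[3];
    private final float[] quat = new float[4];   // [w, x, y, z] from getQuaternionFromVector
    private final float[] quatA = new float[4];
    private float[] rotationMatrix1 = new float[9];
    private float[] rotationMatrix2 = new float[9];
    private float[] rotationResult = new float[3];
    private final Quaternion q1 = new Quaternion();
    private final Quaternion q2 = new Quaternion();

    // Minimal quaternion holder used by the snippet.
    private static class Quaternion {
        float w, x, y, z;
    }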