sensor

Android: Problems calculating the Orientation of the Device

Posted by 送分小仙女 on 2019-11-30 07:20:35
I'm trying to build a simple augmented reality app, so I started working with sensor data. Following this thread (Android compass example) and this example (http://www.codingforandroid.com/2011/01/using-orientation-sensors-simple.html), I calculate the orientation from Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD, but I can't get "good" values out of it. The azimuth values make no sense at all: if I just tilt the phone upward, the value changes drastically, and even when I simply rotate the phone, the values don't reflect the phone's orientation. Has…
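
A common cause of jumpy azimuth readings is feeding raw, noisy accelerometer and magnetometer samples straight into SensorManager.getRotationMatrix. A minimal sketch of one frequently suggested remedy, low-pass filtering both inputs before computing the orientation (the ALPHA value and field names are illustrative assumptions, not taken from the question):

private static final float ALPHA = 0.25f; // smoothing factor: smaller = smoother but laggier (assumed value)
private float[] gravity;      // filtered accelerometer reading
private float[] geomagnetic;  // filtered magnetometer reading

// Exponential low-pass filter: blends each new sample into the previous output.
private float[] lowPass(float[] input, float[] output) {
    if (output == null) return input.clone();
    for (int i = 0; i < input.length; i++) {
        output[i] += ALPHA * (input[i] - output[i]);
    }
    return output;
}

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        gravity = lowPass(event.values.clone(), gravity);
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        geomagnetic = lowPass(event.values.clone(), geomagnetic);
    }
    if (gravity != null && geomagnetic != null) {
        float[] r = new float[9];
        float[] angles = new float[3];
        if (SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) {
            SensorManager.getOrientation(r, angles); // angles[0..2] = azimuth, pitch, roll, in radians
        }
    }
}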

Using getRotationMatrix and getOrientation in Android 2.1

Posted by Deadly on 2019-11-30 07:14:59
I've been having issues with this for far too long. This code should output dx, dy, dz for the accelerometer and a running total of dx. It should also output azimuth, pitch, and roll, but it does not: it outputs 0.0, -0.0, -0.0 in the last three TextViews, respectively. I've used the information given here, but to no avail.

switch (event.sensor.getType()) {
    case Sensor.TYPE_ACCELEROMETER:
        accelerometerValues = event.values.clone();
    case Sensor.TYPE_MAGNETIC_FIELD:
        geomagneticMatrix = event.values.clone();
        sensorReady = true;
        break;
    default:…
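
The quoted switch has a classic fall-through: there is no break after the TYPE_ACCELEROMETER case, so an accelerometer event also overwrites geomagneticMatrix, leaving getRotationMatrix with two parallel vectors, for which it returns false and leaves the angles at zero — which likely explains the 0.0 outputs. A sketch of the fix plus the step the excerpt never reaches, reusing the field names above:

switch (event.sensor.getType()) {
    case Sensor.TYPE_ACCELEROMETER:
        accelerometerValues = event.values.clone();
        break; // without this break, execution falls through and clobbers geomagneticMatrix
    case Sensor.TYPE_MAGNETIC_FIELD:
        geomagneticMatrix = event.values.clone();
        sensorReady = true;
        break;
    default:
        break;
}

// Once both readings are available, derive azimuth/pitch/roll.
if (sensorReady && accelerometerValues != null && geomagneticMatrix != null) {
    float[] rotation = new float[9];
    float[] angles = new float[3];
    if (SensorManager.getRotationMatrix(rotation, null, accelerometerValues, geomagneticMatrix)) {
        SensorManager.getOrientation(rotation, angles); // radians: angles[0]=azimuth, [1]=pitch, [2]=roll
    }
    sensorReady = false;
}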

Get the device angle using the getOrientation() function

Posted by 不羁岁月 on 2019-11-30 07:13:26
I was using Sensor.TYPE_ORIENTATION to determine the current angle of the device, but TYPE_ORIENTATION is deprecated as of API version 8. The SensorManager manual refers to the getOrientation() function as the replacement for TYPE_ORIENTATION. Here is the manual. Here is my old code:

public void onSensorChanged(SensorEvent event) {
    Log.d("debug", "Sensor Changed");
    if (event.sensor.getType() == Sensor.TYPE_ORIENTATION) {
        Log.d("debug", Float.toString(event.values[0]));
        float mAzimuth = event.values[0];
        float mPitch = event.values[1];
        float mRoll = event.values[2];
        Log.d("debug", "mAzimuth :" + Float.toString(mAzimuth))…
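
The usual replacement registers TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD and combines them through getRotationMatrix()/getOrientation(), as in the first question above. One migration detail worth a sketch: TYPE_ORIENTATION reported degrees, while getOrientation() returns radians, so a direct port needs a conversion (variable names here are illustrative):

// In onResume: listen to the two sensors that together replace TYPE_ORIENTATION.
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_UI);
sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD), SensorManager.SENSOR_DELAY_UI);

// After SensorManager.getOrientation(rotationMatrix, angles):
float mAzimuth = (float) Math.toDegrees(angles[0]); // getOrientation yields radians; TYPE_ORIENTATION used degrees
if (mAzimuth < 0) mAzimuth += 360f;                 // map -180..180 back to the old 0..360 range
float mPitch = (float) Math.toDegrees(angles[1]);
float mRoll = (float) Math.toDegrees(angles[2]);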

How do I use the Google Maps API GPS sensor?

Posted by £可爱£侵袭症+ on 2019-11-30 05:59:48
All I've been able to find is how to specify the sensor parameter: http://code.google.com/apis/maps/documentation/v3/#SpecifyingSensor But nowhere does it say how to actually USE it. Isn't the whole point to be able to get the user's current lat/long coordinates through the device GPS, or am I mistaken?

That sensor parameter is only there to indicate to Google that you are using a GPS sensor to determine the user's location. Unless you are using the W3C Geolocation API in browsers that support it, it remains your responsibility to get the latitude and longitude from your GPS device to the user's…

Is it possible to calculate velocity by integrating accelerometer data over time?

Posted by 时光总嘲笑我的痴心妄想 on 2019-11-30 05:28:17
Question: I'm wondering if I can use the (linear) accelerometer and compass to calculate velocity without using location services. I want to do that by computing the acceleration components along the north/west/up axes and integrating them over time. Would that work?

Answer 1: In theory, you could do this; in practice, no. You would have to assume you know an initial velocity, but assuming you start at 0 velocity isn't such a bad assumption. You would also have to know the actual orientation of the device throughout the…
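
For illustration only, a minimal sketch of the integration the question describes, assuming the acceleration has already been rotated into world (north/west/up) coordinates; in practice, sensor noise and bias make the resulting velocity estimate drift badly within seconds, which is why the answer says no:

private final float[] velocity = new float[3]; // running velocity estimate in m/s (north, west, up)
private long lastTimestampNs = 0;

// Rectangular integration of world-frame linear acceleration between samples.
void integrate(float[] worldAccel, long timestampNs) {
    if (lastTimestampNs != 0) {
        float dt = (timestampNs - lastTimestampNs) * 1e-9f; // SensorEvent timestamps are in nanoseconds
        for (int i = 0; i < 3; i++) {
            velocity[i] += worldAccel[i] * dt; // any error in worldAccel accumulates into velocity
        }
    }
    lastTimestampNs = timestampNs;
}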

Android: Which thread calls .onSensorChanged?

Posted by 时光毁灭记忆、已成空白 on 2019-11-30 04:56:38
Question: I've read a few discussions about which thread calls various callback methods, for example those associated with sensors. Most claim that the UI thread calls the callbacks, even when a separate worker thread is involved. Are we CERTAIN about that? Consider this scenario: a separate class implements Runnable and SensorListener. The UI thread (during onCreate) starts the runnable and then goes back to its other business. The now-independent worker thread, in its own class, then registers the…
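
By default, SensorManager delivers onSensorChanged on the main Looper regardless of which thread registered the listener, which is why most answers say "the UI thread". If you want the callbacks on a worker thread, registerListener has an overload that takes a Handler; a sketch, assuming it runs inside an Activity and that listener is your SensorEventListener:

// Run sensor callbacks on a dedicated background thread via its Handler.
HandlerThread sensorThread = new HandlerThread("sensor-callbacks");
sensorThread.start();
Handler sensorHandler = new Handler(sensorThread.getLooper());

SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
// With this overload, onSensorChanged is invoked on sensorThread, not the UI thread.
sm.registerListener(listener, accel, SensorManager.SENSOR_DELAY_GAME, sensorHandler);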

Accessing the accelerometer (faster polling) via NativeActivity / NDK

Posted by 你说的曾经没有我的故事 on 2019-11-30 03:39:05
I've searched for a tutorial or an answer on polling the accelerometer faster with the NDK but haven't found a solution yet; I only found the Android developers documentation here. What I need is to poll acceleration at about 100 samples per second (100 Hz). By default, my device (Samsung Galaxy SL i9003 with Gingerbread 2.3.5) gets only about 60 samples per second (60 Hz) even with SENSOR_DELAY_FASTEST. Therefore I tried to access the sensor via NativeActivity with the NDK, generating .c files based on sensor.h and looper.h:

#include <jni.h>
#include <string.h>
#include <android/sensor.h>
#include …
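
/*
 * A sketch of where such a listing is usually headed (not the poster's actual
 * code): requesting a 100 Hz rate through ASensorEventQueue_setEventRate from
 * <android/sensor.h>. Note the rate is only a hint; the device's sensor HAL
 * may still cap delivery below 100 Hz.
 */
#include <android/looper.h>

#define LOOPER_ID_USER 3 /* arbitrary ident used when dispatching ALooper_pollAll */

static ASensorEventQueue* enable_accel_100hz(void) {
    ASensorManager* mgr = ASensorManager_getInstance();
    const ASensor* accel =
        ASensorManager_getDefaultSensor(mgr, ASENSOR_TYPE_ACCELEROMETER);
    ALooper* looper = ALooper_prepare(ALOOPER_PREPARE_ALLOW_NON_CALLBACKS);
    ASensorEventQueue* queue =
        ASensorManager_createEventQueue(mgr, looper, LOOPER_ID_USER, NULL, NULL);
    ASensorEventQueue_enableSensor(queue, accel);
    /* The event rate is given in microseconds between events: 10000 us = 100 Hz. */
    ASensorEventQueue_setEventRate(queue, accel, 10000);
    return queue;
}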

How to read serial port data from JavaScript

Posted by 烈酒焚心 on 2019-11-30 00:33:10
I connected an Arduino to my laptop over USB, and I can read the serial data using Processing. Is there any way to get this data in real time into a local web browser, for example a text field that shows the value from the serial port? It does not have to be connected to the internet. The JavaScript version of Processing does not support the following code, which would have been the ideal solution. The Processing code is:

myPort = new Serial(this, Serial.list()[0], 9600);
// read a byte from the serial port
int inByte = myPort.read();
// print it
println(inByte);
// now send this value…
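
Since browser-side JavaScript could not open the serial port directly here, one common workaround is a small local bridge: a native program reads the port and re-serves the latest value over HTTP for the page to poll. A sketch in Java using the third-party jSerialComm library plus the JDK's built-in HttpServer (the port index, baud rate, and URL are assumptions mirroring the Processing code above):

import com.fazecast.jSerialComm.SerialPort;
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class SerialBridge {
    private static volatile int lastByte = -1; // most recent byte seen on the port

    public static void main(String[] args) throws Exception {
        SerialPort port = SerialPort.getCommPorts()[0]; // first port, like Serial.list()[0]
        port.setBaudRate(9600);
        port.setComPortTimeouts(SerialPort.TIMEOUT_READ_BLOCKING, 0, 0);
        port.openPort();

        // Reader thread: keep lastByte updated with whatever the Arduino sends.
        new Thread(() -> {
            byte[] buf = new byte[1];
            while (true) {
                if (port.readBytes(buf, 1) == 1) lastByte = buf[0] & 0xFF;
            }
        }).start();

        // The page can poll http://localhost:8000/value for the latest byte.
        HttpServer server = HttpServer.create(new InetSocketAddress(8000), 0);
        server.createContext("/value", exchange -> {
            byte[] body = Integer.toString(lastByte).getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
    }
}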

Java-based library for sensor data collection

Posted by 孤街醉人 on 2019-11-29 21:56:37
I'm looking for an embeddable Java library suitable for collecting real-time streams of sensor data in a general-purpose way. I plan to use it to develop a "hub" application for reporting on multiple disparate sensor streams, running on a JVM-based server (I will also be using Clojure for this). Key things it needs to have:

- Interfaces for various common sensor types/APIs. I'm happy to build what I need myself, but it would be nice if some standard stuff came out of the box.
- Suitability for "soft real time" usage, i.e. fairly low latency and low overhead.
- The ability to monitor and manage…
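
No specific library is named in the excerpt, so purely as a sketch of the kind of abstraction being asked for (every name here is hypothetical, not an existing API):

// Hypothetical shape of a general-purpose sensor stream; illustrative only.
import java.util.function.Consumer;

interface SensorStream<T> {
    String id();                              // stable identifier, useful for monitoring/management
    void subscribe(Consumer<Sample<T>> sink); // push-based delivery keeps per-sample overhead low
    void close();
}

final class Sample<T> {
    final long timestampNanos; // capture time, for measuring end-to-end latency
    final T value;
    Sample(long timestampNanos, T value) {
        this.timestampNanos = timestampNanos;
        this.value = value;
    }
}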

Sensor fusion with a Kalman filter

Posted by 心不动则不痛 on 2019-11-29 21:04:35
I'm interested in how the dual input in a sensor-fusion setup is modeled in a Kalman filter. Say, for instance, that you have an accelerometer and a gyro and want to present the "horizon level", as in an airplane; there is a good demo of something like this here. How do you actually harvest the two sensors' positive properties and minimize the negative ones? Is this modeled in the observation model matrix (usually symbolized by capital H)?

Remark: this question was also asked, without any answers, at math.stackexchange.com.

Usually, the sensor fusion problem is derived from Bayes' theorem. Actually you have…
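
In the common formulation for exactly this accelerometer-plus-gyro case, the two sensors enter in different places: the gyro rate drives the prediction step as a control input to the process model, while the accelerometer-derived angle is the measurement, entering through the observation matrix H = [1, 0]. A self-contained one-axis sketch with state [angle, gyroBias] (the noise constants are illustrative assumptions that would need tuning):

public class HorizonKalman {
    private double angle = 0; // estimated tilt angle (rad)
    private double bias = 0;  // estimated gyro bias (rad/s)
    private final double[][] P = {{1, 0}, {0, 1}}; // estimate covariance

    private static final double Q_ANGLE = 0.001; // process noise of the angle (assumed)
    private static final double Q_BIAS = 0.003;  // process noise of the bias (assumed)
    private static final double R_ACCEL = 0.03;  // accel-angle measurement noise (assumed)

    /** gyroRate in rad/s, accelAngle in rad (e.g. atan2 of gravity components), dt in s. */
    public double update(double gyroRate, double accelAngle, double dt) {
        // Predict: integrate the bias-corrected gyro rate (the gyro is the control input).
        angle += dt * (gyroRate - bias);
        P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0] + Q_ANGLE);
        P[0][1] -= dt * P[1][1];
        P[1][0] -= dt * P[1][1];
        P[1][1] += Q_BIAS * dt;

        // Update: fold in the accelerometer angle via the observation model H = [1, 0].
        double innovation = accelAngle - angle;
        double s = P[0][0] + R_ACCEL; // innovation covariance
        double k0 = P[0][0] / s;      // Kalman gain
        double k1 = P[1][0] / s;
        angle += k0 * innovation;
        bias += k1 * innovation;
        double p00 = P[0][0], p01 = P[0][1];
        P[0][0] -= k0 * p00;
        P[0][1] -= k0 * p01;
        P[1][0] -= k1 * p00;
        P[1][1] -= k1 * p01;
        return angle; // the fused "horizon level" estimate
    }
}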