core-motion

Do the sensor fusion algorithms of Core Motion take advantage of the Kalman filter?

Submitted by 梦想与她 on 2019-12-06 23:05:56
Question: Do the sensor fusion algorithms of Core Motion take advantage of the Kalman filter? Answer 1: Update on June 22, 2016: According to the documentation provided by Apple, the processed device-motion data provided by Core Motion's sensor fusion algorithms gives the device's attitude, rotation rate, calibrated magnetic fields, the direction of gravity, and the acceleration the user is imparting to the device. That is, some sort of sensor fusion algorithm has been provided by now. I cannot tell from …
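Apple's documentation describes the fused outputs but says nothing about the filter used internally, so the Kalman question cannot be settled from the API alone. For reference, a minimal sketch (my own, not from the answer) of reading those fused quantities:

```swift
import CoreMotion

// Minimal sketch: consuming the fused device-motion output the answer quotes
// (attitude, rotation rate, gravity, user acceleration).
let motionManager = CMMotionManager()

func startFusedUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let m = motion else { return }
        print("attitude: roll \(m.attitude.roll), pitch \(m.attitude.pitch), yaw \(m.attitude.yaw)")
        print("rotationRate: \(m.rotationRate), gravity: \(m.gravity), userAcceleration: \(m.userAcceleration)")
    }
}
```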

iOS core motion detect forward / backward tilt

Submitted by 流过昼夜 on 2019-12-06 15:24:45
I am using the iOS Core Motion framework to detect whether the device is tilted forward or backward. See this image for details: http://i.stack.imgur.com/2Ojw5.jpg Using the pitch value I can detect this movement, but I cannot distinguish between forward AND backward. More details: I try to detect whether there is a movement (tilting forward or backward) into either the forward area OR the backward area (see the updated sketch). The problem with the pitch is that it starts at a value of about 1.6 when the device is in an upright position, and the value decreases by the same amount whichever way I tilt it towards a horizontal …
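One workaround (an assumption on my part, not an answer from the thread) is to disambiguate the symmetric pitch value with the sign of gravity.z, which flips depending on which side of vertical the screen is facing:

```swift
import CoreMotion

// Sketch (assumption, not from the thread): pitch is symmetric about the
// upright position, but the sign of gravity.z tells you which side of
// vertical the device is on.
let motionManager = CMMotionManager()

func startTiltDetection() {
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let m = motion else { return }
        // Upright: pitch ~ 1.6 (pi/2) and gravity.z ~ 0. Screen tipping up
        // gives gravity.z < 0, screen tipping down gives gravity.z > 0;
        // which label counts as "forward" depends on the question's sketch.
        let direction = m.gravity.z < 0 ? "tilting backward (screen up)"
                                        : "tilting forward (screen down)"
        print("pitch: \(m.attitude.pitch) -> \(direction)")
    }
}
```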

Detect physical movement of iPhone/Apple Watch

Submitted by 爱⌒轻易说出口 on 2019-12-06 13:37:01
I'm trying to detect the movement (to the right or left) performed by users. We assume that the user starts with their arm extended in front of them and then moves the arm to the right or to the left (about 90 degrees off center). I've integrated CMMotionManager and want to understand detecting direction via the startAccelerometerUpdatesToQueue and startDeviceMotionUpdatesToQueue methods. Can anyone suggest how to implement this logic on an iPhone and then on an Apple Watch? Apple provides the watchOS 3 SwingWatch sample code demonstrating how to use CMMotionManager() and startDeviceMotionUpdates(to:) to …
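An illustrative sketch of one way to approach this on the iPhone side (the SwingWatch sample uses its own logic; this is only a guess at an approach): capture a reference attitude while the arm is extended, then watch the yaw of each new attitude relative to it.

```swift
import CoreMotion

// Illustrative sketch: store a reference attitude when the arm is extended
// forward, then read the relative yaw; roughly +/-90 degrees indicates a
// swing to one side or the other.
let motionManager = CMMotionManager()
var referenceAttitude: CMAttitude?

func startSwingDetection() {
    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let m = motion else { return }
        guard let reference = referenceAttitude else {
            referenceAttitude = m.attitude   // arm extended: store reference
            return
        }
        let attitude = m.attitude
        attitude.multiply(byInverseOf: reference)   // rotation since reference
        let yawDegrees = attitude.yaw * 180.0 / .pi
        if yawDegrees > 60 { print("arm moved to one side") }        // sign convention
        else if yawDegrees < -60 { print("arm moved to the other") } // is an assumption
    }
}
```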

Madgwick's sensor fusion algorithm on iOS

Submitted by 喜欢而已 on 2019-12-06 09:33:00
I'm trying to run Madgwick's sensor fusion algorithm on iOS. Since the code is open source, I have already included it in my project and call its methods with the provided sensor values. But it seems that the algorithm expects the sensor measurements in a different coordinate system. The Apple Core Motion sensor system is given on the right side, Madgwick's on the left (see the picture of the different coordinate systems). Both systems follow the right-hand rule. To me it looks like there is a 90 degree rotation around the z axis, but this didn't work. I also tried to flip x and y (and invert …
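One way to make such experiments less error-prone is to isolate the frame conversion in a single helper so candidate mappings are easy to swap. The specific swap/negation below is only a guess and must be verified against both frame definitions; Vec3 and the bridged Madgwick call mentioned in the comment are assumptions, not part of the thread.

```swift
import CoreMotion

// Hypothetical remapping layer between Core Motion's frame and the frame
// Madgwick's code expects. The mapping below (swap x/y, negate z) is an
// UNVERIFIED guess -- adjust until the estimated attitude tracks the device.
struct Vec3 { var x, y, z: Double }

func toMadgwickFrame(_ v: Vec3) -> Vec3 {
    Vec3(x: v.y, y: v.x, z: -v.z)   // ASSUMPTION: candidate axis conversion
}

func feed(_ m: CMDeviceMotion) {
    let gyro = toMadgwickFrame(Vec3(x: m.rotationRate.x, y: m.rotationRate.y, z: m.rotationRate.z))
    let accel = toMadgwickFrame(Vec3(x: m.gravity.x + m.userAcceleration.x,
                                     y: m.gravity.y + m.userAcceleration.y,
                                     z: m.gravity.z + m.userAcceleration.z))
    // Pass gyro (rad/s) and accel (g units) to the bridged Madgwick update
    // function from the open-source release here.
    _ = (gyro, accel)
}
```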

Compute rate of change of orientation (angular measurement) along the y axis?

Submitted by 安稳与你 on 2019-12-06 06:14:24
I want to compute the rate of change of orientation of the iPhone along the y axis. 1. Initially I need to define the y axis as the reference; 2. then measure the rate of change of orientation (angular measurement) from the defined reference. Does CMAttitude provide reliable enough angular measurements to implement this? Or can I use the rotation matrix, or integrate the gyroscope data (I implemented this method, but it does not work because of gyroscope drift)? Please suggest a reliable method to get this done. Thank you in advance! First, just to clarify: rotation rates are not measured "from" any …
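A sketch of the CMAttitude route: capture a reference attitude, express each new attitude relative to it, and differentiate the roll angle (rotation about the device's longitudinal y axis). This leans on Core Motion's own drift compensation instead of hand-integrating the gyro; note also that rotationRate.y already is a bias-compensated angular rate about y, which may be all that is needed.

```swift
import CoreMotion

// Sketch: define the reference attitude once, express every new attitude
// relative to it, and differentiate roll (rotation about the device y axis).
let motionManager = CMMotionManager()
var referenceAttitude: CMAttitude?
var lastAngle = 0.0
var lastTimestamp: TimeInterval = 0

func startYRotationRate() {
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let m = motion else { return }
        guard let reference = referenceAttitude else {
            referenceAttitude = m.attitude          // step 1: define the reference
            lastTimestamp = m.timestamp
            return
        }
        let attitude = m.attitude
        attitude.multiply(byInverseOf: reference)   // step 2: relative orientation
        let angle = attitude.roll                   // rotation about the y axis
        let dt = m.timestamp - lastTimestamp
        if dt > 0 { print("rate about y: \((angle - lastAngle) / dt) rad/s") }
        lastAngle = angle
        lastTimestamp = m.timestamp
    }
}
```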

How to get magnetometer data using Swift?

Submitted by 谁说胖子不能爱 on 2019-12-06 05:54:29
Question: I am trying to get the magnetic field data from my iPhone 6 using Core Motion. I had no problems accessing the raw data with the following code: if available { motionMangager.magnetometerUpdateInterval = updateInterval; motionMangager.startMagnetometerUpdatesToQueue(queue, withHandler: { (data, error) -> Void in println("x: \(data.magneticField.x), y: \(data.magneticField.y), z: \(data.magneticField.z)") }) } BUT: I need the derived data from a device motion instance. So I did …
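A sketch of the device-motion route in current Swift (the reference frame choice is an assumption; any magnetometer-based frame should do): CMDeviceMotion.magneticField delivers the calibrated, bias-compensated field together with a calibration-accuracy flag.

```swift
import CoreMotion

// Sketch: derived (calibrated) magnetic field via a device-motion instance,
// using a magnetometer-based reference frame.
let motionMangager = CMMotionManager()   // keeping the question's spelling

func startCalibratedMagnetometerUpdates() {
    guard motionMangager.isDeviceMotionAvailable else { return }
    motionMangager.deviceMotionUpdateInterval = 0.1
    motionMangager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                            to: .main) { motion, _ in
        guard let field = motion?.magneticField else { return }
        guard field.accuracy != .uncalibrated else { return }  // still calibrating
        print("x: \(field.field.x), y: \(field.field.y), z: \(field.field.z)")
    }
}
```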

Push method for core motion and frequency of Accelerometer/Gyroscope Data

Submitted by 为君一笑 on 2019-12-06 04:35:56
When the push method is used to get accelerometer/gyroscope/device-motion data, the gyroscope and device motion maximum frequency unfortunately cannot exceed 72 Hz on average (in fact, the data is not periodic at all either). Even worse, if only the gyroscope data is recorded (without starting the device motion update service, i.e. using only [motionManager startGyroUpdatesToQueue:opQ withHandler:gyroHandler]), then the gyroscope data frequency drops to 52 Hz on average. Has anyone tried to obtain gyroscope data at 100 Hz using the pull method? Can we reach the maximum 100 Hz throughput with the "pull" method?
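A hedged sketch of the pull approach being asked about: start gyro updates with no handler and poll motionManager.gyroData from a timer, de-duplicating by timestamp. Whether this actually sustains 100 Hz is exactly the open question; the intervals below are illustrative only.

```swift
import Foundation
import CoreMotion

// "Pull" sketch: no queue/handler, just poll the latest sample and skip
// duplicates. Polling faster than the sample rate reduces missed samples.
let motionManager = CMMotionManager()
var lastGyroTimestamp: TimeInterval = 0

func startGyroPolling() {
    motionManager.gyroUpdateInterval = 1.0 / 100.0
    motionManager.startGyroUpdates()                 // pull mode
    Timer.scheduledTimer(withTimeInterval: 1.0 / 200.0, repeats: true) { _ in
        guard let sample = motionManager.gyroData,   // latest sample, if any
              sample.timestamp != lastGyroTimestamp else { return }
        lastGyroTimestamp = sample.timestamp
        print("gyro \(sample.rotationRate) @ \(sample.timestamp)")
    }
}
```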

iOS: Synchronizing frames from camera and motion data

Submitted by ￣綄美尐妖づ on 2019-12-06 04:10:32
Question: I'm trying to capture frames from the camera together with the associated motion data. For synchronization I'm using timestamps. Video and motion are written to a file and then processed; in that process I can calculate the motion-to-frame offset for every video. It turns out that motion data and video data for the same timestamp are offset from each other by a varying amount, from 0.2 s up to 0.3 s. This offset is constant within one video but varies from video to video. If it were the same offset every time, I would be able to subtract …
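One plausible common-clock setup (my assumption, not the thread's conclusion): CMDeviceMotion.timestamp counts seconds since boot, and the capture pipeline's presentation timestamps run on the host-time clock with the same epoch, so a per-frame offset can be computed directly.

```swift
import AVFoundation
import CoreMotion

// Sketch: put both streams on a common time base and measure the offset.
func frameTimeSeconds(_ sampleBuffer: CMSampleBuffer) -> Double {
    CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
}

func motionToFrameOffset(motion: CMDeviceMotion, frame: CMSampleBuffer) -> Double {
    // Positive result: the motion sample was taken after the frame.
    motion.timestamp - frameTimeSeconds(frame)
}
```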

Compare device 3D orientation with the sun position

Submitted by ε祈祈猫儿з on 2019-12-06 04:04:11
Question: I am working on an app that requires the user to aim their iPhone at the sun in order to trigger a special event. I can retrieve the device's 3D orientation quaternion from the gyroscope via the Core Motion framework, and from this I can get the yaw, roll, and pitch angles. I can also compute the sun's azimuth and zenith angle from the current date and time (GMT) and the latitude and longitude. What I am trying to figure out next is how to compare these two sets of values (phone orientation and sun position) …
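A hedged sketch of the comparison step: express both directions as unit vectors in the same frame and measure the angle between them. It assumes the .xTrueNorthZVertical reference frame (x = true north, z = up, hence y = west) and that the phone "aims" out of its back along the device -z axis; both conventions need verifying on a device, and sunAzimuth/sunElevation come from the app's own solar-position code.

```swift
import Foundation
import CoreMotion

// Sun direction as a unit vector in the (north, west, up) reference frame.
func sunVector(azimuth: Double, elevation: Double) -> (x: Double, y: Double, z: Double) {
    // Azimuth measured clockwise from true north, both angles in radians.
    (cos(azimuth) * cos(elevation),      // north component
     -sin(azimuth) * cos(elevation),     // west component
     sin(elevation))                     // up component
}

func angleToSun(attitude: CMAttitude, sunAzimuth: Double, sunElevation: Double) -> Double {
    let r = attitude.rotationMatrix
    // Device -z axis in reference coordinates: negated third column
    // (ASSUMPTION about the matrix convention -- verify on device).
    let aim = (x: -r.m13, y: -r.m23, z: -r.m33)
    let sun = sunVector(azimuth: sunAzimuth, elevation: sunElevation)
    let dot = aim.x * sun.x + aim.y * sun.y + aim.z * sun.z
    return acos(max(-1.0, min(1.0, dot)))   // radians; ~0 means aimed at the sun
}
```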

CMDeviceMotion userAcceleration drift

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-06 03:55:58
I'm getting the acceleration data using -[CMDeviceMotion userAcceleration]. I've noticed one interesting thing: I always get a small bias on the Z axis, about 0.0155 (with a variance of 0.002), while on the other axes the average values are near 0. I'm testing this with an iPod Touch 4G (and it is just lying on the table during the test). The question is: where does this bias come from, and is it device specific? I noticed similar values, even though Core Motion tries to eliminate bias. If you rotate your device so that x (or y) is parallel to gravity, you will probably see the bias in the x direction. Using …
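A simple mitigation sketch (my suggestion; the answer only diagnoses the bias): average userAcceleration while the device rests flat, keep the mean as a per-device bias, and subtract it from later readings.

```swift
import CoreMotion

// Sketch: estimate the resting bias of userAcceleration by averaging while
// the device lies still, then subtract it from subsequent samples.
let motionManager = CMMotionManager()
var sum = (x: 0.0, y: 0.0, z: 0.0)
var sampleCount = 0
var bias = (x: 0.0, y: 0.0, z: 0.0)

func calibrateBias(samples: Int = 500) {
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let a = motion?.userAcceleration else { return }
        sum.x += a.x; sum.y += a.y; sum.z += a.z
        sampleCount += 1
        if sampleCount == samples {
            let n = Double(samples)
            bias = (sum.x / n, sum.y / n, sum.z / n)   // e.g. ~0.0155 g on z
            motionManager.stopDeviceMotionUpdates()
            print("estimated bias: \(bias)")
        }
    }
}
```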