core-motion

Get pitch, roll and yaw relative to geographic north on iOS?

女生的网名这么多〃 submitted on 2019-12-19 09:19:11

Question: I see that I can retrieve CMAttitude from the device, and from it I can read the three values I need (pitch, roll and yaw). As I understand it, this CMAttitude object is managed by Core Motion, a sensor-fusion framework that combines the compass, gyroscope and accelerometer to compute corrected results (on Android the equivalent is the SensorManager class). So my questions are: are those values (pitch, roll and yaw) relative to magnetic north and gravity? If the above is correct, how can I modify it to give me…
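A minimal sketch of the usual approach, assuming iOS 5+ and that hardware-availability checks happen elsewhere: by default CMAttitude is relative to an arbitrary frame captured when updates start, but Core Motion can be asked for a magnetic-north-referenced (or, with location access, true-north-referenced) frame explicitly.

```swift
import CoreMotion

let motionManager = CMMotionManager()

// Request attitude relative to magnetic north: Z is vertical (gravity),
// X points toward magnetic north. Use .xTrueNorthZVertical (requires
// location services) if geographic north is needed.
if CMMotionManager.availableAttitudeReferenceFrames().contains(.xMagneticNorthZVertical) {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                           to: .main) { motion, error in
        guard let attitude = motion?.attitude else { return }
        // Yaw is now measured against magnetic north rather than the
        // arbitrary frame captured when updates started.
        print(attitude.pitch, attitude.roll, attitude.yaw)
    }
}
```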

NSTimeInterval to unix timestamp

假装没事ソ submitted on 2019-12-19 03:07:19

Question: I'm getting CMDeviceMotion objects from CMMotionManager. One of the properties of CMDeviceMotion is timestamp, which is expressed as an NSTimeInterval (double). This allows for "sub-millisecond" timestamp precision, according to the documentation. [motionManager startDeviceMotionUpdatesToQueue:motionQueue withHandler:^(CMDeviceMotion *motion, NSError *error) { NSLog(@"Sample: %d Timestamp: %f", counter, motion.timestamp); }]; Unfortunately, this NSTimeInterval is measured since the last device boot,…
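Since this timestamp shares its epoch with ProcessInfo.systemUptime (seconds since boot), one common conversion, sketched here, is to compute the boot date once and offset every sample; this assumes the wall clock is not adjusted while sampling.

```swift
import Foundation
import CoreMotion

// motion.timestamp and systemUptime run on the same since-boot clock,
// so the boot moment in Unix time is "now minus uptime".
let bootTime = Date().timeIntervalSince1970 - ProcessInfo.processInfo.systemUptime

func unixTimestamp(for motion: CMDeviceMotion) -> TimeInterval {
    return bootTime + motion.timestamp
}
```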

iPhone - What does the gyroscope measure? Can I get an absolute degree measurement on all axes?

房东的猫 submitted on 2019-12-18 17:55:10

Question: I am relatively new to iPhone development and I am playing with the gyroscope using Core Motion. After a few tests, this is my question: what information exactly is the gyroscope measuring? Absolute angles? I mean, suppose I hold my phone in portrait, at exactly 90 degrees, and start sampling. These may not be the correct values, but suppose that at this position the gyroscope gives me 0, 0 and 0 degrees for yaw, pitch and roll. Now I throw my iPhone in the air, and as it goes up it rolls at…
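For reference, the raw gyroscope reports angular velocity (radians per second), not absolute angles; an absolute-ish orientation comes from the fused device-motion attitude. A short sketch contrasting the two:

```swift
import CoreMotion

let manager = CMMotionManager()

// Raw gyroscope: rotation *rate* around each axis, in rad/s.
manager.startGyroUpdates(to: .main) { data, _ in
    guard let rate = data?.rotationRate else { return }
    print("rad/s:", rate.x, rate.y, rate.z)
}

// Fused device motion: attitude integrated from the gyro and corrected
// against gravity (and optionally the compass), as Euler angles in radians.
manager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let attitude = motion?.attitude else { return }
    print("radians:", attitude.pitch, attitude.roll, attitude.yaw)
}
```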

Aggregated CMPedometerData (iPhone + Watch total count)

只谈情不闲聊 submitted on 2019-12-18 09:47:27

Question: My app collects CMPedometerData on both the iPhone and the Watch. The iPhone's step counts (both in real time and in historical data) are significantly lower than those recorded by the Watch (which makes sense, since "on device" steps really are lower when you always wear the Watch but sometimes leave the phone on the desk). The thing is, the Watch seems to have the aggregated data, or at least, being the higher step count, the most significant data, so I'd like to have those aggregated /…
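Core Motion itself has no iPhone+Watch aggregate: CMPedometer only counts steps taken while that particular device was carried or worn. A per-device historical query looks like the sketch below; HealthKit is the usual route to a de-duplicated, cross-device total.

```swift
import CoreMotion

let pedometer = CMPedometer()

if CMPedometer.isStepCountingAvailable() {
    // Query this device's steps since midnight. Each device reports only
    // what it counted itself; totals must be merged elsewhere (e.g. HealthKit).
    let start = Calendar.current.startOfDay(for: Date())
    pedometer.queryPedometerData(from: start, to: Date()) { data, error in
        guard let steps = data?.numberOfSteps else { return }
        print("Steps on this device today:", steps)
    }
}
```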

Using a quaternion instead of roll, pitch and yaw to track device motion

点点圈 submitted on 2019-12-17 15:52:05

Question: Please bear with my long question; I am trying to make it as clear as possible. What I am trying to do is get the attitude (roll, pitch and yaw) when a picture is taken with the camera, and then save the attitude values to NSUserDefaults. After saving, the orientation is changed, and I then try to bring the phone back to the attitude at which the picture was taken by constantly comparing the attitude values (saved and current). As for the interface, the user interface has 3 dots (one for each attitude…
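The standard suggestion here is to avoid comparing raw Euler angles (gimbal lock, ±π wraparound) and instead rebase the current attitude onto the saved one with CMAttitude's multiply(byInverseOf:). A minimal sketch, assuming the reference attitude was captured when the photo was taken:

```swift
import CoreMotion

let manager = CMMotionManager()
var savedAttitude: CMAttitude?  // captured at photo time (CMAttitude is archivable)

manager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let current = motion?.attitude else { return }
    if let reference = savedAttitude {
        // Mutates `current` in place so it now expresses the rotation
        // *relative to* the saved attitude; near-zero pitch/roll/yaw
        // means the phone is back in the photo pose.
        current.multiply(byInverseOf: reference)
        print(current.pitch, current.roll, current.yaw)
    }
}
```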

Detecting if a user is moving in a car

别来无恙 submitted on 2019-12-17 15:22:49

Question: NOTICE: This question was originally posted before Apple introduced motion-detection hardware and the associated APIs in the iOS SDK. Answers to this question, however, remain relevant. I'm creating an iPhone iOS app that involves tracking a user's running and/or walking. It is very important that the recorded results of the user's runs and walks remain honest. I need a way to catch a user who may be cheating (or may accidentally have left the tracker on) while travelling in a car. To check if the user is…
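On devices with a motion coprocessor this is exactly what CMMotionActivityManager classifies (the question itself predates this API). A sketch of the modern approach:

```swift
import CoreMotion

let activityManager = CMMotionActivityManager()

if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        // `automotive` is set when the classifier thinks the user is
        // in a vehicle; gate on confidence to reduce false positives.
        if activity.automotive && activity.confidence != .low {
            print("User appears to be in a vehicle; flag or pause tracking")
        }
    }
}
```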

SceneKit view is rendered backwards

我们两清 submitted on 2019-12-14 04:13:58

Question: I'm attempting my first SceneKit app. My goal is to simulate a view from the surface of the Earth, point the device's camera in any direction, and overlay information over the camera view. To start, I'm simply trying to get the SceneKit camera view to match the device's orientation. To verify that it is working as desired, I am adding a bunch of spheres at specific latitude and longitude coordinates. Everything is working except for one important issue: the view is mirrored…
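A mirrored scene usually means the Core Motion attitude was fed to SceneKit without reconciling their differing axis conventions. The direct mapping below is a starting-point sketch only; if it renders mirrored or rotated, the usual fix is negating or swapping quaternion components, and exactly which ones depends on the chosen reference frame and interface orientation.

```swift
import SceneKit
import CoreMotion

let manager = CMMotionManager()
let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()

manager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                 to: .main) { motion, _ in
    guard let q = motion?.attitude.quaternion else { return }
    // Naive one-to-one mapping of the attitude quaternion onto the
    // camera; adjust component signs/order here if the view mirrors.
    cameraNode.orientation = SCNQuaternion(x: Float(q.x), y: Float(q.y),
                                           z: Float(q.z), w: Float(q.w))
}
```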

Why is this CMDeviceMotionHandler never called by CoreMotion?

社会主义新天地 submitted on 2019-12-14 03:56:00

Question: I've added the CoreMotion framework to my project and imported it in the header of my view controller: #import <CoreMotion/CoreMotion.h> Then in -viewDidLoad I have this simple test code, which I run on an iPhone 4 with iOS 4.3: - (void)viewDidLoad { [super viewDidLoad]; CMMotionManager *motionManager = [[CMMotionManager alloc] init]; [motionManager setDeviceMotionUpdateInterval:0.1]; CMDeviceMotionHandler motionHandler = ^(CMDeviceMotion *motion, NSError *error) {…
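One frequent cause of a handler that never fires (the question's code is truncated, so the actual culprit may differ): the CMMotionManager exists only as a local variable and is released when the method returns, which stops updates. A sketch of keeping it alive in a stored property, shown in Swift for brevity:

```swift
import UIKit
import CoreMotion

final class MotionViewController: UIViewController {
    // Stored property: the manager must outlive viewDidLoad, otherwise
    // it is deallocated before a single update is delivered.
    private let motionManager = CMMotionManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        motionManager.deviceMotionUpdateInterval = 0.1
        motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
            guard let motion = motion else { return }
            print("yaw:", motion.attitude.yaw)
        }
    }
}
```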

Retain cycle suspected in closure

余生长醉 submitted on 2019-12-14 02:26:34

Question: I suspect that the following function, which I use in my GameScene class to handle the accelerometer's input, is keeping my scene from deinitializing when I transition to another scene: class GameScene: SKScene { let motionManager = CMMotionManager() var xAcceleration = CGFloat(0) // Some stuff // override func didMove(to: .... func setupCoreMotion() { motionManager.accelerometerUpdateInterval = 0.2 let queue = OperationQueue() motionManager.startAccelerometerUpdates(to: queue,…
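The suspected cycle is real if the handler captures self strongly: the scene owns motionManager, the manager retains the update handler, and the handler's strong self closes the loop. A sketch of the standard fix, with a weak capture and updates stopped on the way out:

```swift
import SpriteKit
import CoreMotion

class GameScene: SKScene {
    let motionManager = CMMotionManager()
    var xAcceleration = CGFloat(0)

    func setupCoreMotion() {
        motionManager.accelerometerUpdateInterval = 0.2
        // [weak self] breaks scene -> manager -> handler -> scene.
        motionManager.startAccelerometerUpdates(to: OperationQueue()) { [weak self] data, _ in
            guard let self = self, let data = data else { return }
            self.xAcceleration = CGFloat(data.acceleration.x)
        }
    }

    override func willMove(from view: SKView) {
        // Stop updates when the scene is being replaced.
        motionManager.stopAccelerometerUpdates()
    }
}
```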