core-motion

iPhone collecting CoreMotion data in the background (longer than 10 mins)

柔情痞子 submitted on 2019-11-27 18:36:19
I am trying to collect CoreMotion acceleration data in the background for longer than 10 minutes. This must be possible, since apps like Sleep Cycle do it. I just want to make sure it is allowed, though, since it does not seem to fit any of these categories:

Apps that play audible content to the user while in the background, such as a music player app
Apps that record audio content while in the background
Apps that keep users informed of their location at all times, such as a navigation app
Apps that support Voice over Internet Protocol (VoIP)
Apps that need to download and process new content
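Whether this is allowed, and how to do it, has changed over the years. A minimal sketch of one supported route, assuming iOS 9 or later: CMSensorRecorder records accelerometer samples in the background without any of the background modes listed above, and the app reads them back later. The 20-minute window below is an arbitrary illustration.

```objc
// Sketch, assuming iOS 9+: CMSensorRecorder records accelerometer data with
// no background mode required; the app retrieves the samples afterwards.
#import <CoreMotion/CoreMotion.h>

CMSensorRecorder *recorder = [[CMSensorRecorder alloc] init];
if ([CMSensorRecorder isAccelerometerRecordingAvailable]) {
    // Record for 20 minutes (illustrative duration), even while suspended.
    [recorder recordAccelerometerForDuration:20.0 * 60.0];
}

// Later (e.g. on next launch), fetch whatever was recorded:
NSDate *now = [NSDate date];
CMSensorDataList *samples =
    [recorder accelerometerDataFromDate:[now dateByAddingTimeInterval:-20.0 * 60.0]
                                 toDate:now];
for (CMRecordedAccelerometerData *sample in samples) {
    NSLog(@"%@ -> (%.3f, %.3f, %.3f)", sample.startDate,
          sample.acceleration.x, sample.acceleration.y, sample.acceleration.z);
}
```

Apps like Sleep Cycle predate this API and typically relied on the audio background mode instead.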

Detecting if a user is moving in a car

安稳与你 submitted on 2019-11-27 17:24:11
NOTICE: This question was originally posted before Apple introduced motion-detection hardware and associated APIs in the iOS SDK. Answers to this question, however, remain relevant. I'm creating an iPhone iOS app which involves tracking a user's running and/or walking. It is very important that the recorded results of the users' runs and walks remain honest. I need a way to catch a user who may be cheating (or may have accidentally left the tracker on) when using a car. To check if the user is driving or riding in a car I first thought of these two checks, however neither can really determine if
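One common heuristic from the pre-M7 era is to compare GPS-reported speed against a running-speed ceiling. A sketch, where the threshold, the consecutive-sample debounce, and the suspiciousSampleCount / flagPossibleVehicleUse members are all hypothetical:

```objc
// Sketch: flag sustained GPS speeds that are implausible for a runner.
// The threshold and the counter/handler names are assumptions.
#import <CoreLocation/CoreLocation.h>

static const CLLocationSpeed kMaxPlausibleRunningSpeed = 12.0; // m/s (~43 km/h)

// In your CLLocationManagerDelegate:
- (void)locationManager:(CLLocationManager *)manager
     didUpdateLocations:(NSArray<CLLocation *> *)locations {
    for (CLLocation *location in locations) {
        if (location.speed > kMaxPlausibleRunningSpeed) {
            self.suspiciousSampleCount += 1;   // hypothetical property
        } else if (location.speed >= 0.0) {    // negative speed means invalid fix
            self.suspiciousSampleCount = 0;
        }
    }
    if (self.suspiciousSampleCount >= 10) {    // ~10 consecutive fast fixes
        [self flagPossibleVehicleUse];         // hypothetical handler
    }
}
```

On M7-era hardware, CMMotionActivity's automotive flag answers the same question more directly.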

iphone - core motion range of yaw, pitch and roll

泄露秘密 submitted on 2019-11-27 16:56:48
Question: I don't have an iPhone 4 with me right now, and I am trying to find documentation that shows the ranges of yaw, pitch, and roll and the corresponding positions of the device. I know that the accelerometer varies from -1 to +1, but my tests yesterday on my iPhone showed that roll varies from -M_PI to +M_PI, while the yaw and pitch ranges are half of that. Is this correct? Where do I find documentation about these ranges? I don't see it in Apple's vague docs. Thanks. Answer 1: This is not a full

In iOS, what is the difference between the Magnetic Field values from the Core Location and Core Motion frameworks?

心已入冬 submitted on 2019-11-27 16:52:52
I have two ways of getting the magnetic field (strength, x, y, and z) using the iOS device's magnetometer. 1) Core Location: used the CLHeading from the CLLocationManagerDelegate method locationManager:didUpdateHeading:. This is similar to Apple's Teslameter sample app. 2) Core Motion: used CMMagneticField from CMMotionManager's magnetometerData.magneticField. Questions: a) What is the difference between the two? I am getting different values from both; I was expecting that they would return the same values. The difference is most notable when I start the app from a resting position (face up in a
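The short answer is calibration: CLHeading's x/y/z values have the device's own magnetic bias removed, while CMMagnetometerData is the raw sensor reading. Core Motion also exposes a calibrated value of its own through CMDeviceMotion. A sketch contrasting the two Core Motion sources (the synchronous polling here is only for illustration; real code would use the update handlers, and reads immediately after starting may be zero):

```objc
// Sketch: raw vs. calibrated magnetic field from Core Motion.
#import <CoreMotion/CoreMotion.h>

CMMotionManager *manager = [[CMMotionManager alloc] init];

// 1) Raw magnetometer: includes device bias (speaker magnets, battery, ...).
[manager startMagnetometerUpdates];

// 2) Device motion with a magnetometer-using reference frame: bias-calibrated
//    field plus an accuracy flag.
[manager startDeviceMotionUpdatesUsingReferenceFrame:
             CMAttitudeReferenceFrameXArbitraryCorrectedZVertical];

CMMagneticField raw = manager.magnetometerData.magneticField;
CMCalibratedMagneticField calibrated = manager.deviceMotion.magneticField;
NSLog(@"raw=(%.1f, %.1f, %.1f) calibrated=(%.1f, %.1f, %.1f) accuracy=%d",
      raw.x, raw.y, raw.z,
      calibrated.field.x, calibrated.field.y, calibrated.field.z,
      (int)calibrated.accuracy);
```

CLHeading's x/y/z is closer to the calibrated value, which is why it disagrees with raw CMMagnetometerData.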

Drifting yaw angle after moving fast

隐身守侯 submitted on 2019-11-27 15:20:34
Question: In my current project I ran into trouble regarding the quaternion provided by Core Motion's CMAttitude. I put the iPhone 5 (iOS 6.0.1) at a well-defined start position. Then I start to move the device quickly around, like in a fast-paced game. When I return to the start position after 10-30 seconds, the reported yaw angle differs from the start position by 10-20 degrees (most of the time ≈11°). I used the old (and sadly no longer available) Core Motion Teapot sample to validate the effect. The
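Gyro-only attitude (the default X-arbitrary reference frame) accumulates integration drift, and yaw has no gravity reference to correct against, so 10-20° after fast motion is plausible. One mitigation is to request a magnetometer-corrected reference frame so Core Motion can slowly pull yaw back, at the cost of occasional small corrections while the compass calibrates. A sketch:

```objc
// Sketch: use a magnetometer-corrected reference frame to bound yaw drift.
#import <CoreMotion/CoreMotion.h>

CMMotionManager *manager = [[CMMotionManager alloc] init];
if ([CMMotionManager availableAttitudeReferenceFrames] &
    CMAttitudeReferenceFrameXArbitraryCorrectedZVertical) {
    [manager startDeviceMotionUpdatesUsingReferenceFrame:
                 CMAttitudeReferenceFrameXArbitraryCorrectedZVertical
                                                 toQueue:[NSOperationQueue mainQueue]
                                             withHandler:^(CMDeviceMotion *motion,
                                                           NSError *error) {
        if (motion) {
            NSLog(@"yaw = %f", motion.attitude.yaw);
        }
    }];
}
```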

Receive accelerometer updates in background using CoreMotion framework

ⅰ亾dé卋堺 submitted on 2019-11-27 14:33:16
I'm using the following code to get accelerometer data (using the CoreMotion framework):

CMMotionManager *motionManager = [[CMMotionManager alloc] init];
motionManager.accelerometerUpdateInterval = 1.0 / 60.0;
[motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue currentQueue]
                                    withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
    NSLog(@"ACCELEROMETER DATA = %@", accelerometerData);
}];

When the app is in the foreground I receive the log, but when it enters the background I receive the log only while the music in the app is playing. I've added the following to app
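The question is cut off, but the behavior described (callbacks only while audio plays) matches the common workaround of declaring the audio background mode, which keeps the app, and with it the Core Motion callbacks, running only while audio is actually playing. A typical Info.plist fragment (an assumption about the poster's setup):

```xml
<!-- Sketch: Info.plist fragment (assumed, not confirmed by the question).
     The "audio" background mode keeps the process alive only while audio
     is actively playing, which explains the observed behavior. -->
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```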

SceneKit – Mapping physical 360 rotation

亡梦爱人 submitted on 2019-11-27 11:17:40
Question: I am having a hard time mapping device motion (sensor fusion) to SceneKit node rotation. The premise of the problem is as follows: I have a sphere, and the camera is positioned inside the sphere such that the geometric center of the sphere and the camera node's center coincide. What I want to achieve is that when I physically rotate around a point, the motion is mapped accurately onto the camera as well. The implementation details are as follows: I have a node with a sphere as geometry and
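A minimal sketch of the usual wiring: feed CMDeviceMotion's attitude quaternion into the camera node's orientation on every update. The direct component copy below assumes portrait orientation and matching handedness; in practice an axis fix-up (swapping or negating components, or composing with a fixed 90° rotation) is usually needed, and the exact mapping is an assumption here.

```objc
// Sketch: drive a SceneKit camera from device motion. The quaternion mapping
// between Core Motion's and SceneKit's coordinate frames is an assumption
// and typically needs a per-orientation fix-up.
#import <CoreMotion/CoreMotion.h>
#import <SceneKit/SceneKit.h>

- (void)startTrackingCamera:(SCNNode *)cameraNode
              motionManager:(CMMotionManager *)manager {
    manager.deviceMotionUpdateInterval = 1.0 / 60.0;
    [manager startDeviceMotionUpdatesUsingReferenceFrame:
                 CMAttitudeReferenceFrameXArbitraryZVertical
                                                 toQueue:[NSOperationQueue mainQueue]
                                             withHandler:^(CMDeviceMotion *motion,
                                                           NSError *error) {
        if (!motion) { return; }
        CMQuaternion q = motion.attitude.quaternion;
        // SCNNode.orientation is a quaternion packed as SCNVector4 (x, y, z, w).
        cameraNode.orientation = SCNVector4Make(q.x, q.y, q.z, q.w);
    }];
}
```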

CoreMotion updates in background state

半腔热情 submitted on 2019-11-27 10:30:01
With the M7 chip in the latest iOS devices one can get programmatically notified as the user goes from stationary to running, walking, etc. using CMMotionActivityManager. Strava and Runkeeper have both used this to auto-pause GPS polling (shut off the GPS antenna) when the M7 detects the user isn't moving, and then re-enable GPS updates once they are moving again. It is able to do this while the app is in the background state, which is the key here. The issue I run into while duplicating this functionality is that if I turn off GPS updates while my app is in the background I stop
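The auto-pause half of the pattern can be sketched like this; the locationManager property and the lack of stationary debouncing are assumptions (a real implementation would wait a few seconds before trusting a stationary report):

```objc
// Sketch: M7 activity updates drive GPS on/off. The locationManager
// property is hypothetical.
#import <CoreMotion/CoreMotion.h>

if ([CMMotionActivityManager isActivityAvailable]) {
    CMMotionActivityManager *activityManager = [[CMMotionActivityManager alloc] init];
    [activityManager startActivityUpdatesToQueue:[NSOperationQueue mainQueue]
                                     withHandler:^(CMMotionActivity *activity) {
        if (activity.stationary) {
            [self.locationManager stopUpdatingLocation];  // pause GPS
        } else if (activity.walking || activity.running || activity.automotive) {
            [self.locationManager startUpdatingLocation]; // resume GPS
        }
    }];
}
```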

iPhone - understanding iPhone rotation

你说的曾经没有我的故事 submitted on 2019-11-27 10:17:47
Question: I am banging my head against the wall trying to understand this. See the next picture. Suppose I have an iPhone resting on a table. At this time the rotation readings through Core Motion are 0,0,0 for yaw, roll, and pitch (picture A). Then I roll it 90 degrees. Now it is sitting on the table on its left side, home button on the right. Now it reads 0,0,0 (picture B). Now I yaw it 180 degrees. It is now sitting with its right side on the table, home button on the left. It now reads 180,0,0 (picture C)
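Part of the confusion is that yaw, pitch, and roll are applied in a fixed order, so the three readings cannot be interpreted independently: the same physical attitude can correspond to non-obvious angle triples. Quaternions make the order-dependence explicit. A plain-C sketch showing that composing the question's two 90° rotations in opposite orders yields different attitudes:

```c
/* Sketch: rotation composition is not commutative. A 90-degree roll followed
 * by a 90-degree yaw is a different attitude than yaw-then-roll, which is why
 * the individual Euler readings in the question look inconsistent. */
#include <math.h>

typedef struct { double w, x, y, z; } Quat;

Quat quat_multiply(Quat a, Quat b) {
    Quat r = {
        a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
        a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
        a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
        a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w
    };
    return r;
}
```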

Finding normal vector to iOS device

纵饮孤独 submitted on 2019-11-27 10:01:24
Question: I would like to use CMAttitude to know the vector normal to the glass of the iPad/iPhone's screen (relative to the ground). As such, I would get vectors like the following: Notice that this is different from orientation, in that I don't care how the device is rotated about the z axis. So if I were holding the iPad above my head facing down, it would read (0,-1,0), and even as I spun it around above my head (like a helicopter), it would continue to read (0,-1,0): I feel like this might be
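The screen normal is just the device's z axis expressed in the reference frame, which can be read straight out of CMAttitude's rotation matrix with no yaw term involved, giving exactly the rotation-about-z invariance described. A plain-C sketch with a stand-in for CMRotationMatrix; whether the third column or third row applies depends on which way Apple's matrix maps (device-to-reference is assumed here), so verify on a device:

```c
/* Sketch: screen normal = device z axis in reference coordinates. Assuming
 * the rotation matrix maps device coordinates to reference coordinates, that
 * is R * (0, 0, 1), i.e. the third column; if the matrix maps the other way,
 * use the third row (the transpose) instead. */

typedef struct {
    double m11, m12, m13;
    double m21, m22, m23;
    double m31, m32, m33;
} RotationMatrix; /* stand-in for CMRotationMatrix */

typedef struct { double x, y, z; } Vec3;

Vec3 device_normal(RotationMatrix r) {
    Vec3 n = { r.m13, r.m23, r.m33 }; /* third column */
    return n;
}
```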