core-motion

iphone - core motion range of yaw, pitch and roll

穿精又带淫゛_ submitted on 2019-11-29 02:28:41
I don't have an iPhone 4 with me right now and I am trying to find documentation that shows the ranges of yaw, pitch and roll and the corresponding positions of the device. I know that the accelerometer varies from -1 to +1, but my tests yesterday on my iPhone showed that roll varies from -M_PI to +M_PI, while the yaw and pitch ranges are half of that. Is this correct? Where do I find documentation about these ranges? I don't see it in Apple's vague docs. Thanks. This is not a full answer, but in the interest of starting the ball rolling: I'm assuming you are talking about the device
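For what it's worth, Apple's CMAttitude documentation does give the ranges: roll and yaw run from -π to π, while pitch runs from -π/2 to π/2, so the half-range observation holds for pitch. A minimal Swift sketch for confirming this empirically (the modern update API shown here postdates the question; the manager must be kept alive for the handler to fire):

import CoreMotion

// Minimal sketch: log the attitude to observe the ranges empirically.
// The manager must outlive the updates, so it cannot be a local variable.
let manager = CMMotionManager()

func startLoggingAttitude() {
    guard manager.isDeviceMotionAvailable else { return }
    manager.deviceMotionUpdateInterval = 0.1
    manager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let a = motion?.attitude else { return }
        // Documented ranges (radians): roll -π to π, pitch -π/2 to π/2, yaw -π to π.
        print(String(format: "roll %.3f  pitch %.3f  yaw %.3f", a.roll, a.pitch, a.yaw))
    }
}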

Drifting yaw angle after moving fast

两盒软妹~` submitted on 2019-11-29 00:27:19
In my current project I ran into trouble regarding the quaternion provided by Core Motion's CMAttitude. I put the iPhone 5 (iOS 6.0.1) at a well-defined start position. Then I start to move the device around quickly, as in a fast-paced game. When I return to the start position after 10-30 seconds, the reported yaw angle differs from the start position by 10-20 degrees (most of the time ≈11°). I used the old (and sadly no longer available) Core Motion Teapot sample to validate the effect. The Euler angles for logging are read directly from CMAttitude: NSLog(@"pitch: %f, roll: %f, yaw: %f",
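One commonly suggested mitigation for this kind of gyro-integration drift is to request a magnetometer-corrected reference frame, so that yaw is periodically re-anchored rather than purely integrated. A hedged Swift sketch, assuming the device has a magnetometer (the reference-frame constant is real Core Motion API; the surrounding code is illustrative):

import CoreMotion

let manager = CMMotionManager()

// Sketch: a magnetometer-corrected reference frame re-anchors yaw instead of
// relying purely on gyro integration, which reduces (but won't eliminate) drift.
func startDriftResistantUpdates() {
    guard CMMotionManager.availableAttitudeReferenceFrames()
            .contains(.xArbitraryCorrectedZVertical) else { return }
    manager.deviceMotionUpdateInterval = 1.0 / 60.0
    manager.startDeviceMotionUpdates(using: .xArbitraryCorrectedZVertical,
                                     to: .main) { motion, _ in
        guard let yaw = motion?.attitude.yaw else { return }
        print("yaw (degrees):", yaw * 180.0 / .pi)
    }
}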

iPhone - understanding iPhone rotation

落爺英雄遲暮 submitted on 2019-11-28 17:13:33
I am banging my head against the wall trying to understand this. See the next picture. Suppose I have an iPhone resting on a table. At this time the rotation readings through Core Motion are 0,0,0 for yaw, roll and pitch (picture A). Then I roll it 90 degrees. Now it is sitting on the table on its left side, home button on the right. Now it reads 0,0,0 (picture B). Now I yaw it 180 degrees. It is now sitting with its right side on the table, home button on the left. It now reads 180,0,0 (picture C). The problem comes if I roll it now. Suppose I roll it -45 degrees. I should be reading 180,-45,0 but
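Part of what makes readings like these confusing is that roll, pitch and yaw are extracted from the attitude in a fixed axis order, so after a large yaw the same physical roll can surface in different components. If the goal is tracking orientation rather than inspecting individual angles, the quaternion names the same attitude without that ambiguity; a short illustrative Swift sketch:

import CoreMotion

// Sketch: Euler angles come out of a fixed decomposition order, so after a
// 180° yaw a physical roll can surface in other components. The quaternion
// describes the same orientation without that ambiguity.
func logOrientation(_ motion: CMDeviceMotion) {
    let q = motion.attitude.quaternion   // order-free representation
    let a = motion.attitude              // order-dependent Euler angles
    print("quaternion: (\(q.x), \(q.y), \(q.z), \(q.w))")
    print("euler: yaw \(a.yaw), roll \(a.roll), pitch \(a.pitch)")
}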

Finding normal vector to iOS device

拥有回忆 submitted on 2019-11-28 16:52:38
I would like to use CMAttitude to know the vector normal to the glass of the iPad/iPhone's screen (relative to the ground). As such, I would get vectors like the following: Notice that this is different from orientation, in that I don't care how the device is rotated about the z-axis. So if I were holding the iPad above my head facing down, it would read (0,-1,0), and even as I spun it around above my head (like a helicopter), it would continue to read (0,-1,0): I feel like this might be pretty easy, but as I am new to quaternions and don't fully understand the reference frame options for
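One way to approach this, sketched below in Swift under a stated assumption: the screen normal is the device's +z axis expressed in the reference frame, which falls out of the attitude's rotation matrix directly. Whether the third column or third row applies depends on whether the matrix maps device coordinates into the reference frame or the reverse, so check the sign of the output against a known pose:

import CoreMotion

// Sketch: the screen normal is the device's +z axis expressed in the reference
// frame. Assuming attitude.rotationMatrix maps device coordinates into the
// reference frame, that axis is the matrix's third column; if readings come
// out inverted, the convention is transposed and the third row applies instead.
func screenNormal(from motion: CMDeviceMotion) -> (x: Double, y: Double, z: Double) {
    let m = motion.attitude.rotationMatrix
    return (m.m13, m.m23, m.m33)
}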

DeviceMotion relative to world - multiplyByInverseOfAttitude

戏子无情 submitted on 2019-11-28 07:42:22
What is the correct way to use CMAttitude's multiplyByInverseOfAttitude:? Assuming an iOS 5 device lying flat on a table, after starting CMMotionManager with: CMMotionManager *motionManager = [[CMMotionManager alloc] init]; [motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical]; Later, CMDeviceMotion objects are obtained: CMDeviceMotion *deviceMotion = [motionManager deviceMotion]; I expect that [deviceMotion attitude] reflects the rotation of the device from True North. By observation, [deviceMotion userAcceleration] reports acceleration in the
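The behavior this excerpt is circling, that userAcceleration stays in device coordinates regardless of the requested reference frame, is typically handled by rotating the vector through the attitude yourself. A hedged Swift sketch, assuming attitude.rotationMatrix maps device coordinates into the reference frame (if results look mirrored, the transposed convention applies):

import CoreMotion

// Sketch: userAcceleration is reported in device coordinates no matter which
// reference frame was requested. Rotating it by the attitude yields the
// acceleration in the reference (world) frame, under the stated convention.
func worldAcceleration(from motion: CMDeviceMotion) -> (x: Double, y: Double, z: Double) {
    let a = motion.userAcceleration
    let m = motion.attitude.rotationMatrix
    return (m.m11 * a.x + m.m12 * a.y + m.m13 * a.z,
            m.m21 * a.x + m.m22 * a.y + m.m23 * a.z,
            m.m31 * a.x + m.m32 * a.y + m.m33 * a.z)
}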

Motion Manager is not working in Swift

冷暖自知 submitted on 2019-11-28 07:19:14
Question: I am trying to use the motion manager in Swift but the log inside my update block never prints. var motionManager: CMMotionManager = CMMotionManager() motionManager.accelerometerUpdateInterval = 0.01 println(motionManager.deviceMotionAvailable) // prints true println(motionManager.deviceMotionActive) // prints false motionManager.startDeviceMotionUpdatesToQueue(NSOperationQueue.currentQueue(), withHandler: { deviceManager, error in println("Test") // never prints }) println(motionManager.deviceMotionActive
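Two usual culprits fit these symptoms: a CMMotionManager held in a local variable is deallocated before the handler can fire, and NSOperationQueue.currentQueue() can be nil off the main thread. A sketch of the standard fix in current Swift (the class name here is illustrative):

import CoreMotion

final class MotionController {
    // Keep a strong reference: a CMMotionManager created as a local variable is
    // deallocated when its scope exits, so the update handler never fires.
    private let motionManager = CMMotionManager()

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 0.01
        // Pass an explicit queue; OperationQueue.current can be nil off the main thread.
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            print("attitude:", motion.attitude)
        }
    }
}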

iOS detect movement of user

徘徊边缘 submitted on 2019-11-28 07:02:27
I want to create a simple app that draws a simple line on screen when I move my phone along the Y-axis from a start point to an end point, for example from point a(0,0) to point b(0,10). Please help with a demo. John Fontaine: You need to initialize the motion manager and then check the motion.userAcceleration.y value for an appropriate acceleration value (measured in meters per second squared). In the example below I check for 0.05, which I've found is a fairly decent forward move of the phone. I also wait until the user slows down significantly (-Y value) before drawing. Adjusting the device
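The example the answer refers to is not reproduced in this excerpt, but the thresholding idea it describes looks roughly like the following Swift sketch; the 0.05 cutoff is the answer's empirical value, and onStrokeEnd is a hypothetical callback standing in for the drawing code:

import CoreMotion

let manager = CMMotionManager()

// Sketch of the answer's thresholding idea: a userAcceleration.y above a small
// cutoff counts as a forward push; a clear negative value afterwards is the
// deceleration that ends the stroke and triggers drawing.
func startStrokeDetection(onStrokeEnd: @escaping () -> Void) {
    var moving = false
    manager.deviceMotionUpdateInterval = 0.01
    manager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let y = motion?.userAcceleration.y else { return }
        if y > 0.05 {                      // forward move detected
            moving = true
        } else if moving && y < -0.05 {    // user slowed down: draw the line now
            moving = false
            onStrokeEnd()
        }
    }
}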

Swift watchOS 2 - CMSensorDataList

点点圈 submitted on 2019-11-28 03:59:03
Question: Short version: I don't know how to extract the CMRecordedAccelerometerData from the CMSensorDataList after getting one from the CMSensorRecorder. Apple isn't providing any documentation yet. Maybe someone has a hint for me? ;) func startMovementDetection() { var accDataList = self.cmSensorRecorder!.accelerometerDataFrom(self.startDate, to: NSDate()) as CMSensorDataList CMRecordedAccelerometerData() // that's the class I want to extract from CMSensorDataList } Okay, problem solved with this one here:
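The linked solution is cut off above, but the widely circulated fix for exactly this problem is to bridge CMSensorDataList, which supports NSFastEnumeration, into Swift's Sequence so that for-in can walk its entries. A sketch along those lines:

import CoreMotion

// Sketch: CMSensorDataList supports NSFastEnumeration, so a small Sequence
// conformance lets Swift's for-in walk the recorded samples.
extension CMSensorDataList: Sequence {
    public func makeIterator() -> NSFastEnumerationIterator {
        return NSFastEnumerationIterator(self)
    }
}

func dumpRecordedData(_ list: CMSensorDataList) {
    for element in list {
        if let sample = element as? CMRecordedAccelerometerData {
            print(sample.startDate, sample.acceleration.x,
                  sample.acceleration.y, sample.acceleration.z)
        }
    }
}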

iphone - core motion (relative rotation)

筅森魡賤 submitted on 2019-11-28 01:17:05
Question: Is there a way to obtain a relative rotation from Core Motion? What I need is: how much it rotated around one axis and in which direction (+ sign = anti-clockwise, - = clockwise, according to the right-hand rule). I have found the property rotationRate, but I am not sure how I would extract the angle from it, as it gives me radians per second. I have tried all kinds of things over the last few days but nothing gives me stable values. I have tried to do a timed sample of Core Motion data, using a
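Since rotationRate is an angular velocity in radians per second, a relative angle comes from integrating it against the sample timestamps; gyro bias still accumulates, which may explain the unstable values. A hedged Swift sketch (the 0.01 rad/s noise gate is an illustrative guess, not a tuned constant):

import CoreMotion

let manager = CMMotionManager()
var accumulatedZ = 0.0   // radians rotated about z since start; sign follows the right-hand rule

// Sketch: integrate angular velocity over the time between samples to get a
// relative angle. Drift from gyro bias is unavoidable with pure integration.
func startRelativeRotation() {
    var lastTimestamp: TimeInterval?
    manager.deviceMotionUpdateInterval = 0.01
    manager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        defer { lastTimestamp = motion.timestamp }
        guard let last = lastTimestamp else { return }
        let rate = motion.rotationRate.z   // rad/s, + = anti-clockwise
        if abs(rate) > 0.01 {              // illustrative noise gate for a resting device
            accumulatedZ += rate * (motion.timestamp - last)
        }
    }
}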

Using quaternion instead of roll, pitch and yaw to track device motion

人走茶凉 submitted on 2019-11-27 20:18:22
Please bear with my long question; I am trying to make it as clear as possible. What I am trying to do is get the attitude (roll, pitch and yaw) when a picture is taken with the camera, and then save the attitude values to NSUserDefaults. After saving, the orientation is changed, and I then try to bring the phone back to the attitude the picture was taken at by constantly comparing the attitude values (saved and current). For the interface, there are three dots on screen (one for each attitude parameter) which guide the user to the orientation the picture was taken at. On reaching the
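As the title hints, comparing whole quaternions tends to be more robust than matching roll, pitch and yaw one by one: the angle between two unit quaternions q1 and q2 is 2·acos(|q1·q2|), a single number that is zero exactly when the poses coincide. A hedged Swift sketch (function names and the tolerance are illustrative):

import CoreMotion
import Foundation

// Sketch: the rotation needed to get from one unit quaternion to another has
// angle 2 * acos(|q1 . q2|). Comparing that single angle against a tolerance
// replaces three separate roll/pitch/yaw comparisons.
func angleBetween(_ q1: CMQuaternion, _ q2: CMQuaternion) -> Double {
    let dot = abs(q1.x * q2.x + q1.y * q2.y + q1.z * q2.z + q1.w * q2.w)
    return 2.0 * acos(min(1.0, dot))   // clamp guards against rounding past 1.0
}

func isBackInPose(saved: CMQuaternion, current: CMQuaternion,
                  tolerance: Double = 0.05) -> Bool {
    return angleBetween(saved, current) < tolerance
}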