core-motion

Is there a way to calculate small distances with CoreMotion?

爱⌒轻易说出口 submitted on 2020-07-23 11:13:16
Question: Is it possible to calculate small distances with CoreMotion? For example, a user moves his iOS device up or down and left or right while holding the device in front of him (landscape).

Answer 1: EDIT: Link as promised... https://www.youtube.com/watch?v=C7JQ7Rpwn2k (the position material starts about 23 minutes in). His summary: the best thing to do is to try not to use position in your app.

There is a video that I will find to show you. But short answer... no. The margin for error is too great and the…
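For context on why the answer is "no": without an external position reference, distance can only be recovered by double-integrating acceleration, so a constant sensor bias b grows as b·t in velocity and 0.5·b·t² in position. A minimal sketch of that naive approach (my own illustration of the failure mode, not code from the answer):

    import CoreMotion

    // Naive dead reckoning along the device x-axis: the approach the
    // answer warns against; integration error grows without bound.
    let motionManager = CMMotionManager()
    var velocity = 0.0               // m/s
    var position = 0.0               // m
    var lastTimestamp: TimeInterval?

    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        defer { lastTimestamp = motion.timestamp }
        guard let last = lastTimestamp else { return }
        let dt = motion.timestamp - last
        // userAcceleration is gravity-compensated and reported in g.
        let ax = motion.userAcceleration.x * 9.81
        velocity += ax * dt          // a bias b accumulates as b * t here
        position += velocity * dt    // ...and as 0.5 * b * t^2 here
    }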

How is multiplyByInverseOfAttitude (CMAttitude class) implemented?

早过忘川 submitted on 2020-01-05 07:45:28
Question: I need to implement the same behaviour as multiplyByInverseOfAttitude from the iOS CMAttitude class. Please note that I cannot use it directly, but I do have the right CMAttitude objects. Can anyone point me in the right direction?

Answer 1: As you are not allowed to create an instance of CMAttitude on your own, the only way is to create a new quaternion and use it for any further calculations. I recommend using your own customised quaternion class instead of the simple CMQuaternion struct. A…
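A common reconstruction (a sketch of mine using simd rather than a hand-rolled quaternion class) treats both attitudes as unit quaternions and computes the relative rotation, which is what multiplyByInverseOfAttitude is generally understood to yield. For unit quaternions the inverse is just the conjugate:

    import CoreMotion
    import simd

    // Convert CMQuaternion into simd's quaternion type.
    func quat(_ q: CMQuaternion) -> simd_quatd {
        simd_quatd(ix: q.x, iy: q.y, iz: q.z, r: q.w)
    }

    // Relative attitude: the rotation delta such that
    // reference * delta == current. Verify the multiplication order
    // against known CMAttitude output; it is an assumption here.
    func multipliedByInverse(of reference: simd_quatd,
                             current: simd_quatd) -> simd_quatd {
        reference.inverse * current
    }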

CoreMotion gyroscope on Apple Watch

痞子三分冷 submitted on 2020-01-04 03:53:20
Question: I'm trying to get access to the gyroscope of the Apple Watch. From what I read, it is available in watchOS 3. Unfortunately I cannot get it to work: it keeps coming back with "Gyro not available", so motionManager.isGyroAvailable is always false. Here is my code. Any help would be appreciated.

    import WatchKit
    import Foundation
    import CoreMotion

    class InterfaceController: WKInterfaceController {
        let motionManager = CMMotionManager()

        override func awake(withContext context: Any?) {
            super.awake…
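For what it's worth, on watchOS the raw gyro interface commonly reports unavailable even on hardware that has a gyroscope; the device-motion interface, which fuses the same sensors, is the usual route to rotation data. A sketch under that assumption:

    import CoreMotion

    let motionManager = CMMotionManager()

    if motionManager.isDeviceMotionAvailable {
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            // rotationRate is the bias-corrected gyro signal, in rad/s.
            print(motion.rotationRate.x, motion.rotationRate.y, motion.rotationRate.z)
        }
    }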

Madgwick's sensor fusion algorithm on iOS

不打扰是莪最后的温柔 submitted on 2020-01-02 18:03:14
Question: I'm trying to run Madgwick's sensor fusion algorithm on iOS. Since the code is open source, I already included it in my project and call the methods with the provided sensor values. But it seems that the algorithm expects the sensor measurements in a different coordinate system. The Apple CoreMotion sensor system is given on the right side, Madgwick's on the left. Here is the picture of the different coordinate systems. Both systems follow the right-hand rule. For me it seems like there is a…
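The usual fix for a frame mismatch like this is to permute (and possibly negate) each sensor vector before feeding it to the filter. The exact mapping has to be read off the two coordinate diagrams; the sketch below uses a placeholder permutation (swap x and y, negate z) purely to show the shape of the fix:

    import CoreMotion

    // Hypothetical remapping from CoreMotion's device frame into the
    // filter's frame. The (y, x, -z) permutation is a placeholder;
    // derive the real one from the two coordinate-system pictures.
    func remapped(_ r: CMRotationRate) -> (x: Double, y: Double, z: Double) {
        (r.y, r.x, -r.z)
    }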

iOS 4 Core Motion attitude in landscape orientation

佐手、 submitted on 2020-01-02 06:49:06
Question: I've been trying to rotate my view based on the CMAttitude returned from CMMotionManager, specifically the pitch (x) and roll (y). I'm using a reference attitude to set my horizon. This works great in portrait mode, but the minute I try to do it in a landscape view it goes wrong. As the phone is now rotated 90° CCW, I was hoping that Core Motion would know landscape was in place and keep the pitch and roll useful. Instead, the axes still point their original way. To try and compensate I…
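Core Motion's attitude is reported in a fixed device frame and does not track interface orientation, so in landscape the screen-relative pitch and roll trade places. A sketch of the usual compensation (the signs are assumptions that depend on which landscape orientation is active):

    import CoreMotion

    // Swap pitch and roll for a landscape interface. Verify the signs
    // for landscape-left vs. landscape-right against a reference attitude.
    func screenPitchRoll(_ attitude: CMAttitude,
                         isLandscape: Bool) -> (pitch: Double, roll: Double) {
        isLandscape ? (attitude.roll, -attitude.pitch)
                    : (attitude.pitch, attitude.roll)
    }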

CMPedometer step counting not available

女生的网名这么多〃 submitted on 2020-01-02 04:44:15
Question: My code is:

    if ([CMPedometer isStepCountingAvailable]) {
        self.pedometer = [[CMPedometer alloc] init];
    } else {
        NSLog(@"Step counting is not available on this device!");
        [SVProgressHUD showErrorWithStatus:@"Step counting is not available on this device!"];
    }

When I run it on iOS 8 and later devices, it says "Step counting is not available on this device!" How can I make step counting available?

Answer 1: Your code is correct and it yields the expected result. The iPhone 5 does not have the…
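The availability check is a hardware gate, not a setting: CMPedometer's step counting runs on the dedicated motion coprocessor (first shipped as the M7 in the iPhone 5s), so on older hardware there is nothing to enable and the only option is a fallback. The same check in Swift:

    import CoreMotion

    let pedometer = CMPedometer()

    if CMPedometer.isStepCountingAvailable() {
        pedometer.startUpdates(from: Date()) { data, _ in
            if let steps = data?.numberOfSteps {
                print("Steps: \(steps)")
            }
        }
    } else {
        // No motion coprocessor: hide the feature, or estimate steps
        // yourself from raw accelerometer data.
    }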

CMSampleBuffer from OpenGL for video output with AVAssetWriter

▼魔方 西西 submitted on 2020-01-01 11:46:28
Question: I need to get a CMSampleBuffer for the OpenGL frame. I'm using this:

    int s = 1;
    UIScreen *screen = [UIScreen mainScreen];
    if ([screen respondsToSelector:@selector(scale)]) {
        s = (int)[screen scale];
    }
    const int w = viewController.view.frame.size.width / 2;
    const int h = viewController.view.frame.size.height / 2;
    const NSInteger my_data_length = 4 * w * h * s * s;

    // allocate array and read pixels into it.
    GLubyte *buffer = malloc(my_data_length);
    glReadPixels(0, 0, w * s, h * s, GL_RGBA, GL_UNSIGNED_BYTE,…
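Rather than constructing a CMSampleBuffer by hand, the usual route for writing GL frames is to wrap the pixel bytes in a CVPixelBuffer and append it through an AVAssetWriterInputPixelBufferAdaptor. A sketch, in which `bytes`, `width`, `height`, `adaptor`, and `time` are assumed to be supplied by the caller:

    import AVFoundation
    import CoreVideo

    // Wrap raw pixel bytes from glReadPixels in a CVPixelBuffer and
    // hand it to the adaptor instead of building a CMSampleBuffer.
    func append(bytes: UnsafeMutableRawPointer,
                width: Int, height: Int,
                adaptor: AVAssetWriterInputPixelBufferAdaptor,
                time: CMTime) {
        var pixelBuffer: CVPixelBuffer?
        let status = CVPixelBufferCreateWithBytes(
            kCFAllocatorDefault, width, height,
            kCVPixelFormatType_32BGRA,   // match your glReadPixels format
            bytes, width * 4,            // bytes per row
            nil, nil, nil, &pixelBuffer)
        if status == kCVReturnSuccess, let pb = pixelBuffer,
           adaptor.assetWriterInput.isReadyForMoreMediaData {
            _ = adaptor.append(pb, withPresentationTime: time)
        }
    }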

iOS 7.1: Get Core Motion data (accelerometer, gyroscope) while app is in background

橙三吉。 submitted on 2020-01-01 05:22:26
Question: I am wondering how I can keep receiving motion sensor values while the app is in background mode. I realize that there are already several posts out there. For example, I have tried "How Nike+ GPS on iPhone receives accelerometer updates in the background?", which does not work for me. I have also enabled background modes (location updates at the moment) in my App-Info.plist. Are there any working examples out there? Also, if possible, I would not want to implement some of the hacks, e.g. play…
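One detail that often trips this up: declaring the location background mode in the plist is not enough on its own; the app also has to be running an active location session for the system to keep it alive, and only then do Core Motion callbacks keep arriving. A sketch under that assumption (the authorization calls shown are newer than the iOS 7.1 in the title):

    import CoreLocation
    import CoreMotion

    let locationManager = CLLocationManager()
    let motionManager = CMMotionManager()

    func startBackgroundMotion() {
        // The active location session is what keeps the process alive;
        // the UIBackgroundModes "location" entry alone does not.
        locationManager.requestAlwaysAuthorization()
        locationManager.allowsBackgroundLocationUpdates = true   // iOS 9+
        locationManager.startUpdatingLocation()

        motionManager.accelerometerUpdateInterval = 1.0 / 50.0
        motionManager.startAccelerometerUpdates(to: .main) { data, _ in
            if let a = data?.acceleration {
                print(a.x, a.y, a.z)
            }
        }
    }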

Explanation of the iOS rotation matrix

三世轮回 submitted on 2020-01-01 03:14:14
Question: I am trying to work with 3D rotations, but I don't quite get how the framework does the calculations. For example, I get this data:

    yaw   -1.010544
    pitch  0.508249
    roll   1.128918

Then I print the corresponding rotation matrix:

    0.599901  -0.128495  -0.789689
    0.740043   0.464230   0.486649
    0.304065  -0.876344   0.373584

After reading the API and the wiki, I am pretty sure there must be a general way to create a rotation matrix out of Euler angles. I tried all of these here, with no results. Do I miss…
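Since the question boils down to the composition order, one practical approach is to build the three elementary rotations from the printed angles and compare candidate products against the printed matrix numerically. A sketch (the candidate orders are suggestions to try, not a statement of what Apple uses; CoreMotion defines pitch about x, roll about y, and yaw about z, right-hand rule):

    import Foundation
    import simd

    func rx(_ a: Double) -> simd_double3x3 {       // rotation about x
        simd_double3x3(rows: [
            SIMD3(1,       0,       0),
            SIMD3(0,  cos(a), -sin(a)),
            SIMD3(0,  sin(a),  cos(a)),
        ])
    }
    func ry(_ a: Double) -> simd_double3x3 {       // rotation about y
        simd_double3x3(rows: [
            SIMD3( cos(a), 0, sin(a)),
            SIMD3( 0,      1, 0),
            SIMD3(-sin(a), 0, cos(a)),
        ])
    }
    func rz(_ a: Double) -> simd_double3x3 {       // rotation about z
        simd_double3x3(rows: [
            SIMD3(cos(a), -sin(a), 0),
            SIMD3(sin(a),  cos(a), 0),
            SIMD3(0,       0,      1),
        ])
    }

    let (yaw, pitch, roll) = (-1.010544, 0.508249, 1.128918)

    // Print each candidate and compare it to the matrix in the question.
    let candidates: [(String, simd_double3x3)] = [
        ("Rz*Rx*Ry", rz(yaw) * rx(pitch) * ry(roll)),
        ("Rz*Ry*Rx", rz(yaw) * ry(roll) * rx(pitch)),
        ("Ry*Rx*Rz", ry(roll) * rx(pitch) * rz(yaw)),
    ]
    for (name, m) in candidates {
        print(name, m)
    }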

What does Core Motion error 102 mean?

感情迁移 submitted on 2019-12-30 20:31:48
Question: I use Core Motion's sensor fusion to get north-oriented motion updates:

    [motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical
                                                        toQueue:motionQueue
                                                    withHandler:motionHandler];

In a very rare case that can be reproduced only on selected customer devices (iPhone 4S running iOS 6.0.2), I receive this error in the motionHandler:

    Error Domain=CMErrorDomain Code=102 "The operation couldn’t be completed. (CMErrorDomain error 102.)"

Also, it seems I don't…
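For reference, in CMError.h the value 102 corresponds to CMErrorTrueNorthNotAvailable, which fits the true-north reference frame being requested: true north requires location/heading data, so Location Services being disabled or denied is the usual trigger. A sketch of detecting it in the handler:

    import CoreMotion

    let motionHandler: CMDeviceMotionHandler = { _, error in
        if let error = error as NSError?,
           error.domain == CMErrorDomain,
           error.code == Int(CMErrorTrueNorthNotAvailable.rawValue) {
            // True north is unavailable (e.g. Location Services off);
            // fall back to a magnetic-north or arbitrary reference frame.
        }
    }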