core-motion

watchOS 2 - CMSensorRecorder

Submitted by 吃可爱长大的小学妹 on 2019-12-06 00:26:46
I want to use the historical accelerometer data from the Apple Watch, but my accDataList is always nil. I instantiated CMSensorRecorder in the init function of the class. Has anyone run into this problem before?

    func startMovementDetection() {
        self.cmSensorRecorder?.recordAccelerometerFor(self.recorderDuration)
        self.startDate = NSDate()
    }

    func extractHistoricalAccelerometerData() {
        var accDataList = self.cmSensorRecorder!.accelerometerDataFrom(self.startDate, to: NSDate())
        NSLog("AccDataList : \(accDataList)")
        if accDataList != nil {
            accDataList = accDataList as CMSensorDataList
            for accData in …
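A minimal sketch of what a working version might look like, using the Swift 3+ API names (recordAccelerometer(forDuration:), accelerometerData(from:to:)); the Sequence shim for CMSensorDataList and the 10-minute duration are assumptions, not part of the original question:

    import CoreMotion

    // CMSensorDataList only supports NSFastEnumeration, so a small Sequence
    // shim makes it usable in a for-in loop.
    extension CMSensorDataList: Sequence {
        public func makeIterator() -> NSFastEnumerationIterator {
            return NSFastEnumerationIterator(self)
        }
    }

    let recorder = CMSensorRecorder()
    var startDate = Date()

    func startMovementDetection() {
        guard CMSensorRecorder.isAccelerometerRecordingAvailable() else { return }
        startDate = Date()
        recorder.recordAccelerometer(forDuration: 60 * 10) // assumed: 10 minutes
    }

    func extractHistoricalAccelerometerData() {
        // Recorded samples are handed over in batches with significant latency,
        // so querying right after recording starts commonly returns nil.
        guard let list = recorder.accelerometerData(from: startDate, to: Date()) else {
            print("No recorded data available yet")
            return
        }
        for sample in list {
            if let data = sample as? CMRecordedAccelerometerData {
                print(data.startDate, data.acceleration)
            }
        }
    }

A frequent cause of a nil list is simply querying too soon: the recorder does not deliver data in real time.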

Swift 3.0: how to use startDeviceMotionUpdates(to:withHandler:)?

Submitted by 廉价感情. on 2019-12-05 12:50:32
Swift 3.0 was released alongside Xcode 8.0, and apparently a lot has changed. I'm very unfamiliar with the new syntax in Swift. Can someone help me out? I'm trying to figure out what goes after the withHandler: label in motionManager.startDeviceMotionUpdates(to: OperationQueue.current()!, withHandler: ). I am trying to get my SceneKit program to use the accelerometer to determine the orientation of an SCNNode platform. I am also fairly new to Swift (about 5 days into programming in Swift), so if there's something fundamental I am messing up, let me know. Reference: You must pass the block …
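As a hedged sketch (the SCNNode name platformNode is hypothetical): the handler is a closure taking an optional CMDeviceMotion and an optional Error, and trailing-closure syntax is the idiomatic Swift 3 form:

    import CoreMotion
    import SceneKit

    let motionManager = CMMotionManager()

    func startMotionUpdates(driving platformNode: SCNNode) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: OperationQueue.main) { motion, error in
            guard let attitude = motion?.attitude else { return }
            // Drive the node's orientation from the fused attitude quaternion.
            let q = attitude.quaternion
            platformNode.orientation = SCNQuaternion(Float(q.x), Float(q.y),
                                                     Float(q.z), Float(q.w))
        }
    }

Note that OperationQueue.current can be nil; OperationQueue.main is the safer choice for a handler that touches the scene graph.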

CMPedometer StepCounting not Available

Submitted by ♀尐吖头ヾ on 2019-12-05 10:52:10
My code is:

    if ([CMPedometer isStepCountingAvailable]) {
        self.pedometer = [[CMPedometer alloc] init];
    } else {
        NSLog(@"Step counting is not available on this device!");
        [SVProgressHUD showErrorWithStatus:@"Step counting is not available on this device!"];
    }

When I run it on iOS 8 and later devices, it says: "Step counting is not available on this device!" How can I make step counting available?

Answer: Your code is correct, and it yields the expected result. The iPhone 5 does not have the hardware (the Apple M7 chip) to track steps, so step counting is not available. You need at least an iPhone 5s …
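For reference, a hedged Swift equivalent of the same check (the startUpdates call is an assumption about what the asker would do next; in real code the pedometer would be kept alive as a property):

    import CoreMotion

    let pedometer = CMPedometer()

    if CMPedometer.isStepCountingAvailable() {
        pedometer.startUpdates(from: Date()) { data, error in
            if let steps = data?.numberOfSteps {
                print("Steps: \(steps)")
            }
        }
    } else {
        print("Step counting is not available on this device!")
    }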

iOS CoreMotion - Calculating Height Difference

Submitted by 安稳与你 on 2019-12-05 04:23:32
Question: Is it possible to calculate the difference in height when an iPhone moves, using the data from CoreMotion? For example, the initial position of the iPhone is saved, then the iPhone is moved, which results in a new position. I am interested in detecting whether the height of the iPhone has changed from its initial position. I don't need absolute units, just a relative value indicating by what percentage the height has changed from its original height.

    let manager = CMMotionManager()
    // Save initial attitude …
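The snippet ends before any calculation, so here is a hedged alternative sketch (not from the question): doubly integrating userAcceleration drifts within seconds, so on devices with a barometer, CMAltimeter's relative altitude is a more practical way to get a height change:

    import CoreMotion

    let altimeter = CMAltimeter()

    func startHeightTracking() {
        guard CMAltimeter.isRelativeAltitudeAvailable() else { return }
        altimeter.startRelativeAltitudeUpdates(to: OperationQueue.main) { data, error in
            // relativeAltitude is in meters, measured from when updates began.
            guard let delta = data?.relativeAltitude.doubleValue else { return }
            print("Height change since start: \(delta) m")
        }
    }

Dividing the delta by an assumed starting height would give the percentage change the asker wants.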

Do the sensor fusion algorithms of Core Motion take advantage of the Kalman filter?

Submitted by 笑着哭i on 2019-12-05 03:14:19
Do the sensor fusion algorithms of Core Motion take advantage of the Kalman filter?

Answer (Ali): Update on June 22, 2016. According to the documentation provided by Apple, "The processed device-motion data provided by Core Motion's sensor fusion algorithms gives the device's attitude, rotation rate, calibrated magnetic fields, the direction of gravity, and the acceleration the user is imparting to the device." That is, some sort of sensor fusion algorithm has been provided by now. I cannot tell from this piece of information whether that sensor fusion algorithm is the Kalman filter, or something equally …
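Whatever filter Apple uses internally, the fused outputs the documentation lists are all exposed on CMDeviceMotion; a short sketch of reading them:

    import CoreMotion

    let manager = CMMotionManager()

    func logFusedOutputs() {
        guard manager.isDeviceMotionAvailable else { return }
        manager.startDeviceMotionUpdates(to: OperationQueue.main) { motion, error in
            guard let m = motion else { return }
            print(m.attitude)          // device attitude (roll/pitch/yaw, quaternion)
            print(m.rotationRate)      // bias-corrected rotation rate
            print(m.magneticField)     // calibrated magnetic field
            print(m.gravity)           // direction of gravity
            print(m.userAcceleration)  // acceleration imparted by the user
        }
    }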

Get current iOS device orientation even if the device's orientation is locked

Submitted by 旧巷老猫 on 2019-12-05 02:29:30
Question: I want to get the current iOS device orientation even if the device's orientation is locked (just like the iOS Camera app). I want to detect at least Portrait, Landscape Left, and Landscape Right. UIDeviceOrientation and UIInterfaceOrientation do not seem to work when orientation is locked. In this case, I think we need to use CoreMotion. How do I implement this in Swift 4?

Answer 1: Declare a motion manager with Core Motion:

    var orientationLast = UIInterfaceOrientation(rawValue: 0)!
    var motionManager: …
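A hedged Swift 4 sketch of where that answer is headed: derive the orientation from the gravity direction reported by the accelerometer, which keeps working when rotation is locked. The sign-to-orientation mapping is an assumption worth verifying on a device (device and interface landscape orientations are mirrored):

    import CoreMotion
    import UIKit

    let motionManager = CMMotionManager()
    var orientationLast = UIInterfaceOrientation.portrait

    func startOrientationUpdates() {
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 0.2
        motionManager.startAccelerometerUpdates(to: OperationQueue.main) { data, _ in
            guard let g = data?.acceleration else { return }
            let newOrientation: UIInterfaceOrientation
            if abs(g.y) >= abs(g.x) {
                // Gravity mostly along the long axis: portrait family.
                newOrientation = g.y < 0 ? .portrait : .portraitUpsideDown
            } else {
                // Gravity mostly along the short axis: landscape family.
                newOrientation = g.x < 0 ? .landscapeRight : .landscapeLeft
            }
            if newOrientation != orientationLast {
                orientationLast = newOrientation
                print("Orientation changed: \(newOrientation.rawValue)")
            }
        }
    }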

Utilizing the M7 chip in the iPhone 5s [closed]

Submitted by 限于喜欢 on 2019-12-04 12:55:18
Closed: this question is off-topic and is not currently accepting answers.

I was wondering if anyone can point me in the right direction with regard to utilizing the M7 chip in the iPhone 5s and above. Is it just automagically used when you use the general Core Motion APIs, or is there a specific set of APIs to use? Finally, I noted while reading some articles online that it keeps some historical movement data; does anyone know how to access this, or can anyone point me …
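The short version of the usual answer: you don't read the coprocessor directly; CMPedometer and CMMotionActivityManager query the history it has logged (roughly the last seven days, per Apple's documentation). A hedged sketch of both queries, with an assumed 24-hour window:

    import CoreMotion

    let pedometer = CMPedometer()
    let activityManager = CMMotionActivityManager()

    func fetchMotionHistory() {
        let from = Date(timeIntervalSinceNow: -24 * 60 * 60) // assumed: past 24 h
        if CMPedometer.isStepCountingAvailable() {
            pedometer.queryPedometerData(from: from, to: Date()) { data, _ in
                if let steps = data?.numberOfSteps { print("Steps: \(steps)") }
            }
        }
        if CMMotionActivityManager.isActivityAvailable() {
            activityManager.queryActivityStarting(from: from, to: Date(),
                                                  to: OperationQueue.main) { activities, _ in
                print("Logged activities: \(activities?.count ?? 0)")
            }
        }
    }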

CMDeviceMotion userAcceleration is upside down?

Submitted by [亡魂溺海] on 2019-12-04 10:47:24
I'm seeing some unexpected readings from the userAcceleration field in CMDeviceMotion. When I look at the raw accelerometer data from CMAccelerometerData, I see that if the iPhone is flat on a table the reading is 1 g straight down (1 g on the -Z axis), and if I drop the iPhone (on a soft surface, of course) then the accelerometer reading goes to zero, as expected. That's all fine. When I instead use the CMDeviceMotion class, the userAcceleration reading is zero, as expected, when the iPhone is flat on the table. Again, this is fine. But when I drop the iPhone and read the CMDeviceMotion userAcceleration, …
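A hedged sketch for checking the relationship side by side: the raw accelerometer reading should approximately equal userAcceleration + gravity (all in units of g), so in free fall the raw reading goes toward zero while userAcceleration approaches the negative of gravity, which is likely the "upside down" value being observed:

    import CoreMotion

    let manager = CMMotionManager()

    func compareAccelerationReadings() {
        manager.startAccelerometerUpdates(to: OperationQueue.main) { data, _ in
            if let a = data?.acceleration { print("raw:", a.x, a.y, a.z) }
        }
        manager.startDeviceMotionUpdates(to: OperationQueue.main) { motion, _ in
            guard let m = motion else { return }
            // The sum should track the raw accelerometer reading.
            print("user + gravity:",
                  m.userAcceleration.x + m.gravity.x,
                  m.userAcceleration.y + m.gravity.y,
                  m.userAcceleration.z + m.gravity.z)
        }
    }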

Compare device 3D orientation with the sun position

Submitted by 假如想象 on 2019-12-04 10:24:43
I am working on an app that requires the user to aim his iPhone at the sun in order to trigger a special event. I can retrieve the device's 3D orientation quaternion based on the gyroscope and the CoreMotion framework; from this I can get the yaw, roll, and pitch angles. I can also compute the sun's azimuth and zenith angle based on the current date and time (GMT) and the latitude and longitude. What I am trying to figure out next is how to compare these two sets of values (phone orientation and sun position) to accurately detect the alignment of the device with the sun. Any ideas on how to achieve …
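One way to make the comparison that avoids Euler angles entirely (a hedged sketch, not from the question): rotate the device's pointing axis into the attitude's reference frame and measure the angle to a sun direction vector built from azimuth and elevation. This assumes attitude comes from startDeviceMotionUpdates(using: .xTrueNorthZVertical, ...), i.e. x = north, y = west, z = up; depending on the quaternion convention the conjugate may be needed, so verify the sign on a device:

    import CoreMotion
    import Foundation
    import simd

    func angleToSun(attitude: CMAttitude,
                    sunAzimuthDeg: Double,
                    sunElevationDeg: Double) -> Double {
        // Azimuth is clockwise from true north; elevation is up from the horizon.
        // Device pointing axis: -Z points out of the back of the phone.
        let q = attitude.quaternion
        let rotation = simd_quatd(ix: q.x, iy: q.y, iz: q.z, r: q.w)
        let deviceDir = simd_normalize(rotation.act(simd_double3(0, 0, -1)))

        // Sun direction in the x = north, y = west, z = up frame.
        let az = sunAzimuthDeg * .pi / 180
        let el = sunElevationDeg * .pi / 180
        let sunDir = simd_double3(cos(el) * cos(az),   // north component
                                  -cos(el) * sin(az),  // west = -east
                                  sin(el))             // up component

        // Angle between the two unit vectors, in radians.
        let dot = min(max(simd_dot(deviceDir, sunDir), -1), 1)
        return acos(dot)
    }

Triggering the event when the returned angle drops below a few degrees gives a tolerance band rather than demanding exact alignment.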

iOS: Synchronizing frames from camera and motion data

Submitted by 為{幸葍}努か on 2019-12-04 09:03:39
I'm trying to capture frames from the camera along with the associated motion data. For synchronization I'm using timestamps. Video and motion are written to a file and then processed; in that process I can calculate the motion-frame offset for every video. It turns out that motion data and video data for the same timestamp are offset from each other by a varying amount, from 0.2 s up to 0.3 s. This offset is constant for any one video but varies from video to video. If it were the same offset every time, I would be able to subtract a calibrated value, but it's not. Is there a good way to synchronize the timestamps? Maybe I'm not …
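One hedged approach (assuming an AVCaptureSession whose timestamps are on the default host clock): CMDeviceMotion.timestamp and a frame's presentation timestamp are both expressed in seconds since boot, so they can be compared directly, and any remaining constant offset per video is typically capture-pipeline latency:

    import AVFoundation
    import CoreMedia
    import CoreMotion

    // Seconds-since-boot timestamp of a captured frame.
    func frameTimestamp(_ sampleBuffer: CMSampleBuffer) -> TimeInterval {
        return CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    }

    // Motion sample whose timestamp is closest to the frame's.
    func closestMotion(to frameTime: TimeInterval,
                       in samples: [CMDeviceMotion]) -> CMDeviceMotion? {
        return samples.min { abs($0.timestamp - frameTime) < abs($1.timestamp - frameTime) }
    }

If a per-video offset persists even after matching on a common clock, measuring it once per recording (e.g., with a sharp tap visible in both streams) and subtracting it is a pragmatic fallback.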