arkit

How do I rotate an arkit 4x4 matrix around Y using Apple's SIMD library?

放肆的年华 submitted on 2019-12-06 08:58:31
I am trying to implement some code based on an ARKit demo where someone used this helper function to place a waypoint: let rotationMatrix = MatrixHelper.rotateAboutY(degrees: bearing * -1). How can I implement the rotateAboutY function using the SIMD library rather than GLKit? To make it easier, I could start from the origin point. I'm not too handy with matrix math, so a more basic explanation would be helpful. The rotation around Y matrix is:

| cos(angle)  0   sin(angle)|
|     0       1       0     |
|-sin(angle)  0   cos(angle)|

Rotation counter-clockwise around Y:

| cos(angle)  0  -sin(angle)|
|     0       1       0     |
| sin(angle)  0   cos(angle)|
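A minimal sketch of such a helper, assuming the MatrixHelper name and degree-based signature from the question; only the simd calls are standard API. It builds the first matrix above column by column, since simd_float4x4 is column-major:

import Foundation
import simd

enum MatrixHelper {
    // Rotation about the world Y axis, angle given in degrees.
    static func rotateAboutY(degrees: Float) -> simd_float4x4 {
        let angle = degrees * .pi / 180
        // simd_float4x4 takes columns, so each simd_float4 below is one column
        // of the row-wise matrix shown above.
        return simd_float4x4(columns: (
            simd_float4( cos(angle), 0, -sin(angle), 0),
            simd_float4( 0,          1,  0,          0),
            simd_float4( sin(angle), 0,  cos(angle), 0),
            simd_float4( 0,          0,  0,          1)
        ))
    }
}

// Usage, as in the question:
// let rotationMatrix = MatrixHelper.rotateAboutY(degrees: bearing * -1)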

ARKit billboarding effect with SceneKit

a 夏天 submitted on 2019-12-06 07:28:11
I am looking to add a billboarding effect similar to this application: https://twitter.com/marpi_/status/897130955105644544 I would like SCNNodes that use SCNText geometry to always face the camera. I have attempted, without success, an SCNLookAtConstraint with sceneView.pointOfView as the target, but this rotates the node to face away from the camera, resulting in backwards text, and I am unable to change the node's position or Euler angles afterwards. Out of the box, an SKLabelNode will always face the camera in ARKit, which is exactly what I want, except using SCNText. You were almost there! Just modify
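A minimal sketch of one common way to get that behaviour (not necessarily the exact fix this answer goes on to describe): attach an SCNBillboardConstraint limited to the Y axis so the text node always turns toward the camera but stays upright. The text string, font size and scale below are placeholder values:

import SceneKit
import UIKit

let textGeometry = SCNText(string: "Hello AR", extrusionDepth: 0.5)
textGeometry.font = UIFont.systemFont(ofSize: 10)

let textNode = SCNNode(geometry: textGeometry)
textNode.scale = SCNVector3(0.01, 0.01, 0.01)   // SCNText units are large, so scale down for AR

// Keep the node facing the point of view, rotating only around Y so the text stays upright.
let billboard = SCNBillboardConstraint()
billboard.freeAxes = .Y
textNode.constraints = [billboard]

// Then add textNode to the scene, e.g. sceneView.scene.rootNode.addChildNode(textNode)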

ARKit set ARAnchor transform based on touch location

末鹿安然 submitted on 2019-12-06 06:46:55
Question: I'm playing around with the AR starter app on Xcode 9 where anchors are created in a scene on tap: override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) { guard let sceneView = self.view as? ARSKView else { return } // Create anchor using the camera’s current position if let currentFrame = sceneView.session.currentFrame { // Create a transform with a translation of 0.2 meters in front // of the camera var translation = matrix_identity_float4x4 translation.columns.3.z = -0
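For reference, a minimal sketch of that pattern completed (the 0.2 m figure comes from the comment in the question; the view controller wrapper is assumed): compose a small -Z translation with the camera transform and add an ARAnchor there.

import ARKit
import UIKit

class TapAnchorViewController: UIViewController {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let sceneView = self.view as? ARSKView,
              let currentFrame = sceneView.session.currentFrame else { return }

        // A transform 0.2 m in front of the camera (camera space looks down -Z).
        var translation = matrix_identity_float4x4
        translation.columns.3.z = -0.2

        // Move that offset into world space by composing with the camera transform.
        let transform = simd_mul(currentFrame.camera.transform, translation)
        sceneView.session.add(anchor: ARAnchor(transform: transform))
    }
}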

ARKit: Reproducing the Project Point function

北城余情 submitted on 2019-12-06 06:30:44
Question: I'm attempting to reproduce the ARCamera's projectPoint function, but for some reason the values are not matching up properly. I am taking the ARCamera's projection matrix and view matrix and applying basic CG perspective transform math, (PV) * p, but the NDC values do not match the pixel values given by the ARCamera's projectPoint function. Any ideas? Am I forgetting something? Some more detail: Basically, I'm trying to take an ARFrame at the click of a button, and then trying to
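A minimal sketch of the math being described, under my own assumptions about orientation and viewport (landscape-right, full viewport, arbitrary near/far planes); the helper name is made up. The two easy-to-miss steps are the perspective divide and the Y flip when going from NDC to pixels:

import ARKit
import UIKit
import simd

func projectToPixels(worldPoint: simd_float3,
                     camera: ARCamera,
                     viewportSize: CGSize) -> CGPoint {
    let orientation = UIInterfaceOrientation.landscapeRight
    let projection = camera.projectionMatrix(for: orientation,
                                             viewportSize: viewportSize,
                                             zNear: 0.001, zFar: 1000)
    let view = camera.viewMatrix(for: orientation)

    // Clip space: (P * V) * p, with p in homogeneous coordinates.
    let clip = projection * view * simd_float4(worldPoint.x, worldPoint.y, worldPoint.z, 1)

    // Perspective divide gives normalized device coordinates in [-1, 1].
    let ndc = simd_float3(clip.x, clip.y, clip.z) / clip.w

    // Map NDC to pixels; note the Y flip (NDC +Y is up, screen +Y is down).
    let x = (ndc.x + 1) * 0.5 * Float(viewportSize.width)
    let y = (1 - ndc.y) * 0.5 * Float(viewportSize.height)
    return CGPoint(x: CGFloat(x), y: CGFloat(y))
}

If the orientation passed to projectionMatrix(for:viewportSize:zNear:zFar:) or the Y flip differs from what projectPoint uses internally, the pixel values will disagree, which is one likely source of the mismatch described above.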

How to use environment map in ARKit?

和自甴很熟 submitted on 2019-12-06 06:20:12
Question: ARKit 2.0 added a new class named AREnvironmentProbeAnchor. Reading its documentation, it seems that ARKit can automatically collect an environment texture (a cubemap?). I believe that we can now create virtual objects that reflect the real environment. But I am still not clear how this works, particularly how the environment texture is generated. Does anyone have simple sample code demonstrating this cool feature? Answer 1: It's pretty simple to implement environmentTexturing in your AR project. Set
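A minimal sketch of what the answer is starting to describe, assuming sceneView is an ARSCNView: turn on automatic environment texturing so ARKit places AREnvironmentProbeAnchors for you, then give objects a physically based, reflective material so the generated cubemap actually shows up.

import ARKit
import SceneKit

let configuration = ARWorldTrackingConfiguration()
configuration.environmentTexturing = .automatic   // ARKit generates environment probes automatically (iOS 12+)
sceneView.session.run(configuration)

// A shiny, physically based material picks up the probe's environment texture as reflections.
let sphere = SCNSphere(radius: 0.05)
sphere.firstMaterial?.lightingModel = .physicallyBased
sphere.firstMaterial?.metalness.contents = 1.0
sphere.firstMaterial?.roughness.contents = 0.0
sceneView.scene.rootNode.addChildNode(SCNNode(geometry: sphere))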

How to put ARSCNView in Tabview controller without freezing the ARSession?

白昼怎懂夜的黑 submitted on 2019-12-06 03:57:40
Question: I am trying to implement ARKit from the iOS 11 beta in my app (a tabbed application). But as described in the "ARKit Session Paused and Not Resuming" thread, whenever I switch to another tab and come back, the ARSession freezes and does not resume. Is it possible to implement ARSCNView in a tabbed application so that the ARSession resumes when the user comes back? If so, how do I do it? Answer 1: Yes, you can: override func viewDidAppear(_ animated: Bool) { super.viewDidAppear(animated) let
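A minimal sketch along the lines the answer begins, with the controller and outlet names assumed: re-run the session whenever the tab's view appears and pause it when it disappears. Dropping the reset options is worth trying if you want ARKit to attempt to relocalize rather than start over.

import ARKit
import UIKit

class ARTabViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        // Re-run the session when this tab becomes visible again.
        sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause while another tab is shown.
        sceneView.session.pause()
    }
}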

ARKit Demo Crashing on iPhone 6/iPhone 6 Plus

醉酒当歌 submitted on 2019-12-05 23:41:14
Question: I'm working with the ARKit feature from the recent major iOS release, but I'm getting a crash with the error: failed assertion MTLRenderPassDescriptor: MTLStoreActionMultisampleResolve store action for the depth attachment is not supported by device. I already have the iOS 11 beta installed on my iPhone. Answer 1: As the other answers say, this is a hardware constraint: ARKit requires an A9 chip or newer. In any case, it is good practice to add ARKit to UIRequiredDeviceCapabilities in Info.plist, which will give you better feedback
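Separately from the Info.plist key mentioned above, a minimal sketch of a runtime guard (sceneView is assumed to be your ARSCNView): world tracking needs an A9 chip or newer, so check support before running the session instead of crashing on older devices.

import ARKit

if ARWorldTrackingConfiguration.isSupported {
    // Safe to start AR on this device.
    sceneView.session.run(ARWorldTrackingConfiguration())
} else {
    // e.g. hide the AR entry point or show an unsupported-device message.
    print("ARKit world tracking is not supported on this device")
}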

ARKit: How to tell if user's face is parallel to camera

故事扮演 submitted on 2019-12-05 22:27:46
In my Swift / ARKit / SceneKit project, I need to tell whether the user's face in the front-facing camera is parallel to the camera. I was able to check horizontal parallelism by comparing the distances of the left and right eyes from the camera (using faceAnchor.leftEyeTransform and the worldPosition property). But I am stuck on vertical parallelism. Any ideas how to achieve that? Assuming you are using ARFaceTrackingConfiguration in your app, you can actually retrieve the transforms of both the ARFaceAnchor and the camera to determine their orientations. You can get a simd_float4x4 matrix of the head's orientation
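A minimal sketch of that idea, with the axis convention as my assumption (ARFaceAnchor's +Z points out of the face, and both transforms are taken in world space): compare the Z axes of the face anchor and the camera and treat a small angle as "parallel". Verify the sign on a device.

import ARKit
import Foundation
import simd

func faceIsParallelToCamera(faceAnchor: ARFaceAnchor,
                            camera: ARCamera,
                            toleranceDegrees: Float = 10) -> Bool {
    // Third column of a rigid transform = the local Z axis expressed in world space.
    let faceZ = simd_normalize(simd_float3(faceAnchor.transform.columns.2.x,
                                           faceAnchor.transform.columns.2.y,
                                           faceAnchor.transform.columns.2.z))
    let cameraZ = simd_normalize(simd_float3(camera.transform.columns.2.x,
                                             camera.transform.columns.2.y,
                                             camera.transform.columns.2.z))

    // Angle between the two axes: roughly 0 degrees when the face squarely faces the camera.
    let angle = acos(max(-1, min(1, simd_dot(faceZ, cameraZ)))) * 180 / .pi
    return angle < toleranceDegrees
}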

ARKit - place a SCNPlane between 2 vector points on a plane in Swift 3 [duplicate]

人走茶凉 submitted on 2019-12-05 20:15:33
This question already has an answer here: SceneKit shape between 4 points (1 answer). Similar to some of the measuring apps you can see being demonstrated in ARKit, I have a plane with 2 marker nodes on it and a line drawn between the two. What I need, though, is an SCNPlane between the two. So, if your original plane was the floor and you put a marker on either side of a wall, you could represent the physical wall with an SCNPlane in your AR world. Currently I'm placing the line with the following code: let line = SCNGeometry.lineFrom(vector: firstPoint.position, toVector: secondPoint.position) let lineNode =
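A minimal sketch of one way to build the wall plane from the same two marker positions (firstPoint.position and secondPoint.position in the question); the wall height and the yaw convention are assumptions. Make the plane as wide as the distance between the points, centre it on their midpoint, and yaw it so its width axis runs along the line:

import SceneKit
import Foundation

func wallNode(from start: SCNVector3, to end: SCNVector3, height: CGFloat = 2.0) -> SCNNode {
    let dx = end.x - start.x
    let dz = end.z - start.z
    let distance = CGFloat(sqrt(dx * dx + dz * dz))

    let plane = SCNPlane(width: distance, height: height)
    plane.firstMaterial?.isDoubleSided = true

    let node = SCNNode(geometry: plane)
    // Midpoint between the markers, lifted by half the height so the wall stands on the floor.
    node.position = SCNVector3((start.x + end.x) / 2,
                               start.y + Float(height) / 2,
                               (start.z + end.z) / 2)
    // Yaw the plane so its width axis runs from start to end.
    node.eulerAngles.y = -atan2(dz, dx)
    return node
}

// Usage with the nodes from the question:
// let wall = wallNode(from: firstPoint.position, to: secondPoint.position)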

ARKit estimatedVerticalPlane hit test get plane rotation

别等时光非礼了梦想. submitted on 2019-12-05 18:39:09
I am using ARKit to detect walls at runtime; I use a hit test of type .estimatedVerticalPlane when a point on the screen is touched. I am trying to apply a Y rotation to the node corresponding to the detected plane's orientation. I want to compute the rotation in: private func computeYRotationForHitLocation(hitTestResult: ARHitTestResult) -> Float { guard hitTestResult.type == .estimatedVerticalPlane else { return 0.0 } // guard let planeAnchor = hitTestResult.anchor as? ARPlaneAnchor else { return 0.0 } // guard let anchoredNode = sceneView.node(for: planeAnchor) else { return 0.0 } let
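A minimal sketch of one way to finish that helper: for an .estimatedVerticalPlane result there is no ARPlaneAnchor (which is why the commented-out guards can't be used), but the hit test's worldTransform already carries the estimated orientation, so a Y rotation can be read off its columns. The axis and sign convention here are assumptions to verify on device.

import ARKit
import Foundation
import simd

private func computeYRotationForHitLocation(hitTestResult: ARHitTestResult) -> Float {
    guard hitTestResult.type == .estimatedVerticalPlane else { return 0.0 }

    // Local Z axis of the hit transform, expressed in world space; for a vertical
    // plane this is (approximately) the plane's normal.
    let worldTransform = hitTestResult.worldTransform
    let normal = simd_float3(worldTransform.columns.2.x,
                             worldTransform.columns.2.y,
                             worldTransform.columns.2.z)

    // Yaw around the world Y axis that points a node's +Z along that normal.
    return atan2(normal.x, normal.z)
}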