scenekit

FaceTracking in ARKit – How to display the “lookAtPoint” on the screen

你离开我真会死。 · Submitted on 2020-06-10 19:21:08

Question: ARKit's ARFaceTrackingConfiguration places an ARFaceAnchor, carrying information about the position and orientation of the face, into the scene. Among other properties, this anchor has the lookAtPoint property, which I'm interested in. I know that this vector is relative to the face. How can I draw a point on the screen for this position, i.e. how can I translate this point's coordinates? Answer 1: The .lookAtPoint instance property is for direction estimation only. Apple's documentation says: .lookAtPoint is a…
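A minimal sketch of one way to do the translation the question asks about, assuming an `ARSCNView` named `sceneView`: since `lookAtPoint` is expressed in the face anchor's coordinate system, it can be moved into world space with the anchor's transform and then projected into 2D view coordinates. The function name is hypothetical.

```swift
import ARKit
import SceneKit

// Sketch: convert the face-local lookAtPoint into world space, then
// project it into the view's 2D coordinate system.
func screenPoint(for faceAnchor: ARFaceAnchor, in sceneView: ARSCNView) -> CGPoint {
    // lookAtPoint is expressed relative to the face anchor.
    let localPoint = simd_float4(faceAnchor.lookAtPoint, 1)
    // Transform into world coordinates using the anchor's transform.
    let worldPoint = simd_mul(faceAnchor.transform, localPoint)
    // Project the world-space point into view (point-based) coordinates.
    let projected = sceneView.projectPoint(SCNVector3(worldPoint.x, worldPoint.y, worldPoint.z))
    return CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
}
```

Note that, as the answer cautions, `lookAtPoint` is only a gaze-direction estimate, so a point drawn this way should be treated as approximate.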

ARKit – Viewport Size vs Real Screen Resolution

℡╲_俬逩灬. · Submitted on 2020-05-28 07:45:09

Question: I am writing an ARKit app that uses the ARSCNView hitTest function. The app also sends captured images to a server for analysis. I noticed that when I do: let viewportSize = sceneView.snapshot().size let viewSize = sceneView.bounds.size then the first one is twice as large as the second one. The questions are: 1. Why is there a difference? 2. What "size" (i.e. which coordinates) is used in hitTest? Answer 1: Why is there a difference? Let's explore some important display characteristics of your iPhone 7: a…
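A short sketch of the distinction at play here, under the assumption that the factor of two comes from the Retina scale factor: view bounds are measured in points, while the rendered snapshot is measured in pixels (bounds × `contentScaleFactor`, e.g. 2× on an iPhone 7). The function name is hypothetical.

```swift
import UIKit
import ARKit

// Sketch: compare point-based and pixel-based sizes for an ARSCNView.
func logSizes(for sceneView: ARSCNView) {
    let viewSize = sceneView.bounds.size        // measured in points
    let scale = sceneView.contentScaleFactor    // e.g. 2.0 on iPhone 7
    let pixelSize = CGSize(width: viewSize.width * scale,
                           height: viewSize.height * scale)
    print("points:", viewSize, "pixels:", pixelSize)
    // hitTest(_:types:) takes a point in the view's point-based
    // coordinate system, i.e. the same space as sceneView.bounds.
}
```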

ARFaceTrackingConfiguration: How to distinguish pictures from real faces?

守給你的承諾、 · Submitted on 2020-05-10 21:00:15

Question: We have several apps in the Store that use ARFaceTrackingConfiguration to detect the user's face on iOS devices with Face ID cameras. As you might have seen, ARKit will also track a picture of a face you put in front of your iPad Pro/iPhone X, as if it were a real face. E.g., take a picture from one of our apps (to replicate, one can download and run Apple's example app for ARFaceTrackingConfiguration). Now I have noticed that internally, ARKit treats real faces differently than it does pictures of faces. …
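One heuristic that is sometimes suggested for this problem, sketched here as a hypothetical approach rather than an official API: a printed or displayed photo is static, so its blend-shape coefficients barely change between frames, while a real face blinks and moves. Class name and the 0.1 threshold are illustrative assumptions.

```swift
import ARKit

// Hypothetical liveness heuristic: watch for frame-to-frame variation
// in a blend-shape coefficient; a photo of a face stays nearly constant.
final class LivenessChecker {
    private var lastBlink: Float = 0
    private var sawMovement = false

    // Call once per frame, e.g. from renderer(_:didUpdate:for:).
    func update(with anchor: ARFaceAnchor) {
        let blink = anchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        if abs(blink - lastBlink) > 0.1 { sawMovement = true }
        lastBlink = blink
    }

    var looksLikeRealFace: Bool { sawMovement }
}
```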

SwiftUI - how to add a Scenekit Scene

泄露秘密 · Submitted on 2020-05-10 03:38:25

Question: How can I add a SceneKit scene to a SwiftUI view? I tried the following Hello World, using the standard ship scene example: import SwiftUI import SceneKit struct SwiftUIView : View { var body: some View { ship() Text("hello World") } } But it didn't work. Answer 1: In order for this to work, your SwiftUI view must conform to UIViewRepresentable. There's more info about that in Apple's tutorial, Interfacing with UIKit. import SwiftUI struct SwiftUIView : UIViewRepresentable { func makeUIView…
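A fleshed-out sketch of the answer's UIViewRepresentable approach. The `"ship.scn"` asset name assumes the standard Xcode SceneKit template; the wrapper and view names are illustrative.

```swift
import SwiftUI
import SceneKit

// Wrap an SCNView so it can be used inside a SwiftUI hierarchy.
struct ScenekitView: UIViewRepresentable {
    func makeUIView(context: Context) -> SCNView {
        let view = SCNView()
        view.scene = SCNScene(named: "ship.scn") // asset name is an assumption
        view.allowsCameraControl = true
        view.autoenablesDefaultLighting = true
        return view
    }

    func updateUIView(_ uiView: SCNView, context: Context) {}
}

struct ContentView: View {
    var body: some View {
        VStack {
            ScenekitView()
            Text("hello World")
        }
    }
}
```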

didBeginContact delegate method not firing for ARKit collision detection

妖精的绣舞 · Submitted on 2020-04-30 09:20:54

Question: I can't get the didBeginContact method to fire. I have been trying for a while and I can't spot the error; I could use a fresh set of eyes:

- (void)viewDidLoad {
    [super viewDidLoad];
    self.lastRender = nil;
    self.accelX = 0.0;
    self.accelY = 0.0;
    self.accelZ = 0.0;
    self.isLooping = TRUE;
    self.tripWire = TRUE;
    self.lastPaddleNode = [[SCNNode alloc] init];
    self.paddleNode = [[SCNNode alloc] init];
    SCNPlane *paddlePlane = [SCNPlane planeWithWidth:0.067056 height:0.138176];
    self.paddleNode.geometry = …
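A hedged checklist for this class of problem, sketched in Swift: for SceneKit contact callbacks to fire, the physics world needs a `contactDelegate` and the two bodies need category/contact bit masks that reference each other. The category values and node names here are illustrative assumptions, not taken from the question's code.

```swift
import SceneKit

// Illustrative bit masks for the two colliding bodies.
let paddleCategory = 1 << 0
let ballCategory   = 1 << 1

func configurePhysics(scene: SCNScene,
                      delegate: SCNPhysicsContactDelegate,
                      paddleNode: SCNNode,
                      ballNode: SCNNode) {
    // Without a contact delegate, physicsWorld(_:didBegin:) is never called.
    scene.physicsWorld.contactDelegate = delegate

    paddleNode.physicsBody = SCNPhysicsBody(type: .kinematic,
                                            shape: SCNPhysicsShape(node: paddleNode))
    paddleNode.physicsBody?.categoryBitMask = paddleCategory
    paddleNode.physicsBody?.contactTestBitMask = ballCategory

    ballNode.physicsBody = SCNPhysicsBody(type: .dynamic,
                                          shape: SCNPhysicsShape(node: ballNode))
    ballNode.physicsBody?.categoryBitMask = ballCategory
    ballNode.physicsBody?.contactTestBitMask = paddleCategory
}
```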

Playing a GIF using the GIFU library in SceneKit causes the app UI to freeze, any solution? UIView animation being called from a background thread

纵然是瞬间 · Submitted on 2020-04-18 12:32:24

Question: I'm doing image tracking using ARKit; once an image is detected, I play a GIF with the GIFU library (https://github.com/kaishin/Gifu). This works with the code below. In the VC I added a GIFImageView like this: var imageView = GIFImageView(frame: CGRect(x: 0, y: 0, width: 600, height: 600)) And the ARSceneView delegate's didAdd node method is below: func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) { DispatchQueue.main.async { self.instructionLabel.isHidden = true } …
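A sketch of the fix the title hints at: `renderer(_:didAdd:for:)` is called on a background thread, so all UIView work, including creating and animating the `GIFImageView`, should be hopped onto the main queue. The GIF asset name and plane dimensions are assumptions; this is meant to live in the `ARSCNViewDelegate`.

```swift
import ARKit
import SceneKit
import Gifu // https://github.com/kaishin/Gifu

// Inside the ARSCNViewDelegate: do UIView work on the main thread only.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard anchor is ARImageAnchor else { return }
    DispatchQueue.main.async {
        // GIFImageView is a UIView, so it must be created/animated here.
        let gifView = GIFImageView(frame: CGRect(x: 0, y: 0, width: 600, height: 600))
        gifView.animate(withGIFNamed: "demo") // asset name is an assumption
        let plane = SCNPlane(width: 0.2, height: 0.2) // sizes illustrative
        plane.firstMaterial?.diffuse.contents = gifView
        node.addChildNode(SCNNode(geometry: plane))
    }
}
```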
