ARKit

Save ARFaceGeometry to OBJ file

Submitted by 夙愿已清 on 2020-01-02 11:03:10
Question: In an iOS ARKit app, I've been trying to save the ARFaceGeometry data to an OBJ file. I followed the explanation here: How to make a 3D model from AVDepthData?. However, the OBJ isn't created correctly. Here's what I have:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }
    currentFaceAnchor = faceAnchor
    // If this is the first time with this anchor, get the controller to create content.
    // …
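A commonly suggested approach (a sketch, not the asker's code or a confirmed answer from this thread) is to wrap the face geometry in an SCNGeometry, bridge it to ModelIO, and let MDLAsset write the OBJ file. The function name and output URL here are illustrative assumptions:

```swift
import ARKit
import SceneKit
import SceneKit.ModelIO
import ModelIO

// Sketch: export an ARFaceGeometry to an OBJ file via ModelIO.
// Assumes `faceAnchor` comes from the ARSession delegate callback.
func exportFaceGeometry(_ faceAnchor: ARFaceAnchor, to url: URL) throws {
    // Build an SCNGeometry from the anchor's current face mesh.
    let device = MTLCreateSystemDefaultDevice()!
    let scnGeometry = ARSCNFaceGeometry(device: device)!
    scnGeometry.update(from: faceAnchor.geometry)

    // Bridge SceneKit -> ModelIO and export. MDLAsset infers the
    // output format from the ".obj" file extension.
    let mesh = MDLMesh(scnGeometry: scnGeometry)
    let asset = MDLAsset()
    asset.add(mesh)
    try asset.export(to: url)   // url must end in ".obj"
}
```

Exporting through MDLAsset avoids writing the OBJ vertex/face lists by hand, which is where hand-rolled exporters often go wrong with index bases.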

How to render a UIView with transparent background on an SCNPlane in ARKit?

Submitted by 风格不统一 on 2020-01-02 04:16:09
Question: My UIView has a UIColor.clear background. I am instantiating the view controller from a storyboard. When I set the SCNPlane geometry's diffuse contents to the view controller's view, the transparent background appears solid white on the plane. Here is how I set it:

let material = SCNMaterial()
material.diffuse.contents = viewController.view
planeGeometry.materials = [material]

I can see the view, just the background is not transparent. I saw suggestions in other Stack Overflow posts where they …
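One workaround often suggested for this (an assumption, not a confirmed fix from this thread) is to snapshot the view into a UIImage that preserves the alpha channel and use that image as the material's contents, since a live UIView can get composited onto an opaque backing texture:

```swift
import UIKit
import SceneKit

// Sketch: render a UIView (with a clear background) into a transparent
// UIImage and use that image as the plane's diffuse contents.
func transparentMaterial(for view: UIView) -> SCNMaterial {
    // UIGraphicsImageRenderer's default format is non-opaque, so the
    // view's clear background survives as real alpha in the image.
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    let image = renderer.image { _ in
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
    }
    let material = SCNMaterial()
    material.diffuse.contents = image
    material.isDoubleSided = true
    return material
}
```

The trade-off is that the texture is a static snapshot; if the view changes, it has to be re-rendered and reassigned.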

ARKit project point with previous device position

Submitted by 荒凉一梦 on 2020-01-01 07:22:22
Question: I'm combining ARKit with a CNN to constantly update ARKit nodes when they drift. So:

1. Get an estimate of the node position with ARKit and place a virtual object in the world
2. Use the CNN to get its estimated 2D location of the object
3. Update the node position accordingly (to refine its location in 3D space)

The problem is that step 2 takes about 0.3 s. Therefore I can't use sceneView.unprojectPoint, because the point will correspond to a 3D point from the device's world position from step 1. How do I calculate the 3D …
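The steps above can be sketched by snapshotting the ARCamera that was current when the CNN's input frame was captured, then unprojecting the CNN's 2D result with that stored camera instead of the live one (a sketch: the plane, orientation, and property names are illustrative assumptions):

```swift
import ARKit

// Sketch: keep the ARCamera from the moment the CNN's input frame was
// captured, and unproject with *that* camera after inference finishes.
var cameraAtCapture: ARCamera?

func willRunInference(on frame: ARFrame) {
    cameraAtCapture = frame.camera   // snapshot before the ~0.3 s inference
}

func worldPoint(for cnnPoint: CGPoint,
                orientation: UIInterfaceOrientation,
                viewportSize: CGSize) -> simd_float3? {
    guard let camera = cameraAtCapture else { return nil }
    // Unproject onto the world's y = 0 plane (identity transform);
    // substitute the plane your object actually lies on.
    return camera.unprojectPoint(cnnPoint,
                                 ontoPlane: matrix_identity_float4x4,
                                 orientation: orientation,
                                 viewportSize: viewportSize)
}
```

ARCamera's unprojectPoint(_:ontoPlane:orientation:viewportSize:) (iOS 12+) works on any ARCamera value, not just the current frame's, which is what makes the delayed unprojection possible.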

How to improve People Occlusion in ARKit 3.0

Submitted by 爷，独闯天下 on 2020-01-01 03:04:07
Question: We are working on a demo app using people occlusion in ARKit. Because we want to add videos to the final scene, we use SCNPlanes to render the video, with an SCNBillboardConstraint to ensure they face the right way. These videos are also partially transparent, using a custom shader on the SCNMaterial we apply (thus playing two videos at once). Now we have some issues where the people occlusion is very iffy (see image). The video we are using to test shows a woman with dark pants and a skirt …
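Occlusion quality depends partly on which frame semantics are enabled. A minimal sketch of the configuration side (not a fix for the shader interaction described above) is to request depth-based person segmentation where the device supports it:

```swift
import ARKit

// Sketch: enable depth-based people occlusion when available.
// `.personSegmentationWithDepth` occludes virtual content using an
// estimated depth map rather than a flat stencil (A12+ devices only).
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    let semantics: ARConfiguration.FrameSemantics = .personSegmentationWithDepth
    if ARWorldTrackingConfiguration.supportsFrameSemantics(semantics) {
        config.frameSemantics.insert(semantics)
    }
    return config
}
```

Note that custom transparent shaders can still composite against the segmentation matte in unexpected ways; the configuration above only controls how the matte itself is produced.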

Check whether the ARReferenceImage is no longer visible in the camera's view

Submitted by 非 Y 不嫁゛ on 2019-12-31 08:51:33
Question: I would like to check whether the ARReferenceImage is no longer visible in the camera's view. At the moment I can check whether the image's node is in the camera's view, but the node is still reported as visible when the ARReferenceImage is covered by another image or removed.

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    guard let node = self.currentImageNode else { return }
    if let pointOfView = sceneView.pointOfView {
        let isVisible = …
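One approach (a sketch, assuming iOS 12's continuous image tracking rather than the one-shot detection of world tracking) is to run an ARImageTrackingConfiguration and watch ARImageAnchor.isTracked, which goes false when the reference image is covered or leaves the frame:

```swift
import ARKit

// Sketch: with ARImageTrackingConfiguration (iOS 12+), the session keeps
// re-detecting images every frame, so ARImageAnchor.isTracked reflects
// whether the physical image is actually visible right now.
func renderer(_ renderer: SCNSceneRenderer,
              didUpdate node: SCNNode,
              for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    // Hide attached content when the image is covered or out of view.
    node.isHidden = !imageAnchor.isTracked
}
```

This avoids the frustum-containment check in the question, which only tests whether the node's position is inside the camera's view volume, not whether the physical image is still detectable.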

How can I get Camera Calibration Data on iOS? aka AVCameraCalibrationData

Submitted by a 夏天 on 2019-12-31 04:20:25
Question: As I understand it, AVCameraCalibrationData is only available over AVCaptureDepthDataOutput. Is that correct? AVCaptureDepthDataOutput, on the other hand, is only accessible with the iPhone X front cam or the iPhone Plus back cam, or am I mistaken? What I am trying to do is to get the FOV of an AVCaptureVideoDataOutput sample buffer. In particular, it should match the selected preset (full HD, Photo, etc.).

Answer 1: You can get AVCameraCalibrationData only from depth data output or photo output. However, if all …
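For the FOV question specifically, there is a lighter-weight path that works with a plain video data output: per-frame camera intrinsics delivered as a sample-buffer attachment. A sketch (delegate wiring omitted; the FOV formula assumes fx is expressed in pixels, as the intrinsic matrix convention specifies):

```swift
import AVFoundation
import simd

// Sketch: request per-frame camera intrinsics on a video data output.
// Enable delivery on the connection when the device supports it ...
func enableIntrinsics(on output: AVCaptureVideoDataOutput) {
    if let connection = output.connection(with: .video),
       connection.isCameraIntrinsicMatrixDeliverySupported {
        connection.isCameraIntrinsicMatrixDeliveryEnabled = true
    }
}

// ... then, inside captureOutput(_:didOutput:from:), read the 3x3
// matrix off the buffer and derive the horizontal FOV from fx.
func horizontalFOV(from sampleBuffer: CMSampleBuffer,
                   imageWidthPixels: Float) -> Float? {
    guard let data = CMGetAttachment(
        sampleBuffer,
        key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
        attachmentModeOut: nil) as? Data else { return nil }
    let intrinsics = data.withUnsafeBytes { $0.load(as: matrix_float3x3.self) }
    let fx = intrinsics.columns.0.x             // focal length in pixels
    return 2 * atan(imageWidthPixels / (2 * fx)) // radians
}
```

Because the intrinsics are attached per frame, the result tracks the active format/preset, which is what the asker wants.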

How to create a border for SCNNode to indicate its selection in iOS 11 ARKit-Scenekit?

Submitted by 假如想象 on 2019-12-30 04:24:08
Question: How do I draw a border to highlight an SCNNode and indicate to the user that the node is selected? In my project the user can place multiple virtual objects and select any object at any time. Upon selection I should show the user the highlighted 3D object. Is there a way to achieve this directly, or to draw a border over an SCNNode?

Answer 1: You need to add a tap gesture recognizer to the sceneView.

// add a tap gesture recognizer
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap( …
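The answer's idea can be sketched end to end: hit-test the tap to find the selected node, then mark it visually. This sketch uses an emission tint as the "selection" indicator (a wireframe bounding box would be another option); `sceneView` is assumed to be a property of the containing view controller:

```swift
import SceneKit
import UIKit

// Sketch: hit-test the tap and highlight whichever node was touched.
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    let point = gesture.location(in: sceneView)
    guard let hit = sceneView.hitTest(point, options: nil).first else { return }
    let node = hit.node
    // Illustrative highlight: tint the material's emission so the node
    // glows; reset other nodes' emission to deselect them as needed.
    SCNTransaction.begin()
    node.geometry?.firstMaterial?.emission.contents = UIColor.yellow
    SCNTransaction.commit()
}
```

Wrapping the change in an SCNTransaction keeps the material update atomic with respect to the render loop.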

Convert matrix_float4x4 to x y z space

Submitted by 半腔热情 on 2019-12-29 18:50:10
Question: I'm using ARKit and trying to get the position of the camera as a rotation and (x, y, z) coordinates in real-world space. All I can manage to get is a matrix_float4x4, which I don't really understand, and Euler angles only show the rotation. Here's what I currently have:

let transform = sceneView.session.currentFrame?.camera.transform
let eulerAngles = sceneView.session.currentFrame?.camera.eulerAngles

Here's the output I'm getting:

eulerAngles: float3(-0.694798, -0.0866041, -1.68845) …
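In a column-major 4x4 transform like ARKit's, the first three columns hold the rotation axes and the fourth column holds the translation, so the camera's world position can be read directly from column 3 (a sketch of the standard technique, not this thread's accepted answer):

```swift
import ARKit
import simd

// Sketch: extract the (x, y, z) world position from a column-major
// 4x4 camera transform. Column 3 is the translation component.
func cameraPosition(from transform: matrix_float4x4) -> simd_float3 {
    simd_float3(transform.columns.3.x,
                transform.columns.3.y,
                transform.columns.3.z)
}

// Usage (illustrative):
// if let t = sceneView.session.currentFrame?.camera.transform {
//     let position = cameraPosition(from: t)
// }
```

The eulerAngles value the asker already has supplies the rotation half; together the two give the full pose.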

ARKit – Drag a node along a specific axis (not on a plane)

Submitted by China☆狼群 on 2019-12-29 04:57:11
Question: I am trying to drag a node exactly where my finger is on the screen along the Y axis. But I can't find any way to do it; here is my current code. Do you have any idea?

var movedObject: SCNNode?

@objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
    if recognizer.state == .began {
        let tapPoint: CGPoint = recognizer.location(in: sceneView)
        let result = sceneView.hitTest(tapPoint, options: nil)
        if result.count == 0 { return }
        let hitResult: SCNHitTestResult? = result.first
        movedObject = …
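A common pattern for constraining a drag to one axis (a sketch, assuming the `sceneView` and grabbed-node properties from the question) is to project the node into screen space to learn its depth, unproject the finger position at that same depth, and then apply only the Y component of the result:

```swift
import SceneKit
import UIKit

// Sketch: during the pan's .changed phase, move the node only along Y.
// Project the node to find its screen-space depth, unproject the finger
// at that depth, and keep just the unprojected Y coordinate.
func dragAlongY(_ recognizer: UIPanGestureRecognizer,
                node: SCNNode,
                in sceneView: SCNView) {
    let fingerPoint = recognizer.location(in: sceneView)
    let projected = sceneView.projectPoint(node.position)   // .z = depth
    let unprojected = sceneView.unprojectPoint(
        SCNVector3(Float(fingerPoint.x), Float(fingerPoint.y), projected.z))
    node.position.y = unprojected.y   // X and Z stay fixed
}
```

Discarding the unprojected X and Z is what pins the motion to the Y axis while the node still follows the finger's vertical movement.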