arkit

What is the real benefit of using Raycast in ARKit and RealityKit?

Submitted by 浪尽此生 on 2020-01-23 14:55:19

Question: What is ray-casting in ARKit and RealityKit for, and when do I need to use the makeRaycastQuery instance method?

func makeRaycastQuery(from point: CGPoint, allowing target: ARRaycastQuery.Target, alignment: ARRaycastQuery.TargetAlignment) -> ARRaycastQuery?

Any help appreciated.

Answer 1: Ray-casting, in the same way as hit-testing, helps find a 3D position on a real-world surface by projecting an imaginary ray from a screen point. I found the following definition of ray-casting in Apple…
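The answer's excerpt can be illustrated with a minimal sketch of the method it names. This assumes an ARView-based RealityKit app; the tap handler and the `.estimatedPlane`/`.horizontal` choices are illustrative, not from the original answer.

```swift
import UIKit
import RealityKit
import ARKit

class RaycastViewController: UIViewController {
    @IBOutlet var arView: ARView!

    // Hypothetical tap handler: turns a screen tap into a world-space position.
    @objc func handleTap(_ sender: UITapGestureRecognizer) {
        let point = sender.location(in: arView)
        // Build a one-off raycast query against estimated horizontal planes,
        // then execute it against the running session.
        guard let query = arView.makeRaycastQuery(from: point,
                                                  allowing: .estimatedPlane,
                                                  alignment: .horizontal),
              let result = arView.session.raycast(query).first
        else { return }
        // The translation column of worldTransform is the hit point on the surface.
        let position = simd_make_float3(result.worldTransform.columns.3)
        arView.scene.addAnchor(AnchorEntity(world: position))
    }
}
```

For continuous tracking of a surface point across frames, `ARTrackedRaycast` (via `session.trackedRaycast(_:updateHandler:)`) is the related API; `makeRaycastQuery` itself is a one-shot query builder.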

Distance between face and camera using ARKit

Submitted by 泄露秘密 on 2020-01-23 12:08:53

Question: Is there any way to get the distance between the camera and a face using ARKit?

Answer 1: I think this is possible by using the ARFaceAnchor.leftEyeTransform and ARFaceAnchor.rightEyeTransform properties. Once you have these, you can get the approximate distance of the eyes to the camera by taking the worldPosition of the eyes and subtracting the position of the camera, e.g. SCNVector3Zero. Below is a very crude example with all the code commented, so it should be easy enough to understand: //---------…
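Since the answer's code is cut off, here is a small sketch of the underlying idea, using the face anchor's own transform rather than the eye transforms; the delegate wiring is assumed, and this is an approximation of the answer's approach, not its exact code.

```swift
import ARKit
import simd

// Distance in metres between a tracked face and the device camera.
// Both transforms are in world space; their last column is the translation.
func distance(from faceAnchor: ARFaceAnchor, to camera: ARCamera) -> Float {
    let facePosition = simd_make_float3(faceAnchor.transform.columns.3)
    let cameraPosition = simd_make_float3(camera.transform.columns.3)
    return simd_distance(facePosition, cameraPosition)
}

// Typical call site (sketch): inside ARSessionDelegate's
// session(_:didUpdate:) when the frame's anchors contain an ARFaceAnchor:
//
// if let face = frame.anchors.compactMap({ $0 as? ARFaceAnchor }).first {
//     let metres = distance(from: face, to: frame.camera)
// }
```

Using `leftEyeTransform`/`rightEyeTransform` as the answer suggests works the same way; those transforms are relative to the face anchor, so they would first be multiplied by `faceAnchor.transform` to reach world space.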

ARKit 3.0 – Replicating robot character in Motion Capture RealityKit

Submitted by £可爱£侵袭症+ on 2020-01-23 10:53:06

Question: I'm trying to make a 3D model, like the robot Apple provided in the Motion Capture example (shown at WWDC 2019), that can mimic my movements via ARKit 3.0 motion capture, by replacing the robot character Apple supplies. Desired solution: Is there any special software Apple used to create the robot.usdz file? If yes, please provide details. How can we convert formats like .glb / .gltf / .obj / .dae to .usdz using Apple's Python-based tool without affecting the scene graph? How can we edit the…
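For context on what "replacing the robot character" involves, this is a sketch of how Apple's sample loads a rigged .usdz for body tracking; "myCharacter" is a placeholder asset name, and (as an assumption worth verifying) the replacement model's skeleton is expected to match the joint hierarchy ARKit's body tracking produces, or the animation will not map onto it.

```swift
import RealityKit
import ARKit
import Combine

final class CharacterLoader {
    private var cancellable: AnyCancellable?

    // Asynchronously load a rigged character and attach it to a body anchor.
    func loadCharacter(into arView: ARView, bodyAnchor: AnchorEntity) {
        cancellable = Entity.loadBodyTrackedAsync(named: "myCharacter")
            .sink(receiveCompletion: { completion in
                if case .failure(let error) = completion {
                    print("Failed to load character: \(error)")
                }
            }, receiveValue: { (character: BodyTrackedEntity) in
                // BodyTrackedEntity animates automatically as ARKit
                // updates the tracked body's joint transforms.
                character.scale = [1, 1, 1]
                bodyAnchor.addChild(character)
            })
    }
}
```

The session would be run with `ARBodyTrackingConfiguration()`, and `bodyAnchor` would typically be an `AnchorEntity(.body)`.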

RealityKit – How to edit or add a Lighting?

Submitted by 拟墨画扇 on 2020-01-23 03:51:23

Question: I'm trying to add lighting to my RealityKit AR scene, and I can't find a lighting option in Reality Composer. If there's a way to add a Directional Light, or to edit one, please tell me. I've tried the Apple documentation but can't understand how to add them.

Answer 1: At the moment you can't do it in Reality Composer; just use RealityKit. You need to create a custom class that inherits from the Entity class and conforms to the HasPointLight protocol. Run this code in a macOS project to find out how a…
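A minimal sketch of the pattern the answer describes, an Entity subclass conforming to HasPointLight; the intensity and radius values here are illustrative, not from the original answer.

```swift
import RealityKit
import UIKit

// Custom entity carrying a point light, per the HasPointLight protocol.
final class Lighting: Entity, HasPointLight {
    required init() {
        super.init()
        self.light = PointLightComponent(color: .white,
                                         intensity: 100_000,    // lumens (illustrative)
                                         attenuationRadius: 20) // metres (illustrative)
    }
}

// Usage sketch: parent the light to an anchor above the scene content.
// let lightAnchor = AnchorEntity(world: [0, 1, 0])
// lightAnchor.addChild(Lighting())
// arView.scene.addAnchor(lightAnchor)
```

The same pattern works for directional lighting by conforming to `HasDirectionalLight` and assigning a `DirectionalLightComponent` instead.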

How to place a 3D model of type OBJ with ARKit?

Submitted by 风流意气都作罢 on 2020-01-22 15:45:10

Question: I need some help placing a 3D model with the new Apple ARKit. Is it possible to place an object of type OBJ? I'm trying to place a 3D model of a skull.

// Load the OBJ file
let bundle = Bundle.main
guard let url = bundle.url(forResource: "Cranial", withExtension: "obj") else { fatalError("Failed to find model file") }
let asset = MDLAsset(url: url)
guard let object = asset.object(at: 0) as? MDLMesh else { fatalError("Failed to get mesh from asset") }
let scene = SCNScene()
let nodeCranial…
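The question's code stops short of building the node. A completed sketch of the same approach, using SceneKit's Model I/O bridge; the scale value is an assumption (OBJ exports are frequently authored in units far larger than ARKit's metres).

```swift
import SceneKit
import SceneKit.ModelIO // bridges MDLMesh to SCNNode
import ModelIO

// Load "Cranial.obj" (the asset named in the question) and wrap it in a node.
func loadCranialNode() -> SCNNode {
    guard let url = Bundle.main.url(forResource: "Cranial", withExtension: "obj") else {
        fatalError("Failed to find model file")
    }
    let asset = MDLAsset(url: url)
    guard let mesh = asset.object(at: 0) as? MDLMesh else {
        fatalError("Failed to get mesh from asset")
    }
    // SceneKit can construct a node directly from a Model I/O object.
    let node = SCNNode(mdlObject: mesh)
    node.scale = SCNVector3(0.01, 0.01, 0.01) // assumed: shrink to AR-friendly size
    return node
}
```

The returned node can then be added under `sceneView.scene.rootNode`, positioned at a hit-test or raycast result so it sits on a detected surface.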

Add SCNText to SCNScene with ARKit

Submitted by 元气小坏坏 on 2020-01-22 09:57:25

Question: I just started studying ARKit examples and SceneKit. I read a bit about SceneKit and found out that in order to add text, I need to use SCNText. I tried to write it like this, but it doesn't show:

guard let pointOfView = sceneView.pointOfView else { return }
let text = SCNText(string: "Hello", extrusionDepth: 4)
let textNode = SCNNode(geometry: text)
textNode.geometry = text
textNode.position = SCNVector3Make(pointOfView.position.x, pointOfView.position.y, pointOfView.position.z)
sceneView.scene…
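Two likely reasons the text doesn't show: SCNText geometry is authored in font-point-sized units (huge in AR space), and the node is placed exactly at the camera, so it sits behind the near plane. A corrected sketch, with the 0.5 m offset and 0.01 scale as illustrative values:

```swift
import SceneKit
import ARKit

func addHelloText(to sceneView: ARSCNView) {
    guard let pointOfView = sceneView.pointOfView else { return }

    let text = SCNText(string: "Hello", extrusionDepth: 4)
    text.firstMaterial?.diffuse.contents = UIColor.white

    let textNode = SCNNode(geometry: text)
    // Shrink from font-point units down to AR-scene scale.
    textNode.scale = SCNVector3(0.01, 0.01, 0.01)
    // Place half a metre in front of the camera along its viewing direction,
    // rather than exactly at the camera's own position.
    textNode.simdPosition = pointOfView.simdWorldPosition
                          + pointOfView.simdWorldFront * 0.5

    sceneView.scene.rootNode.addChildNode(textNode)
}
```

Note that in the original snippet `textNode.geometry = text` is redundant, since the geometry was already passed to `SCNNode(geometry:)`.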

Swift - How to find direction of target node i.e. Left or Right from Current Camera Position

Submitted by 为君一笑 on 2020-01-16 19:35:08

Question: I am trying to achieve a scenario where I need to find a way to get from one SCNNode to another SCNNode. I have added the node to the sceneView on a screen tap, using the following code:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    let result = sceneView.hitTest(touch.location(in: sceneView), types: [.featurePoint])
    guard let hitResult = result.last else { return }
    let hitTransform = SCNMatrix4…
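For the left-or-right part of the question, one common approach (a sketch, not taken from an answer in this excerpt) is to express the target's world position in the camera node's own coordinate space, where the sign of x directly encodes the side:

```swift
import SceneKit
import ARKit

// Returns "left" or "right" depending on where the target node sits
// relative to the camera's viewing direction; nil if no camera yet.
func side(of target: SCNNode, in sceneView: ARSCNView) -> String? {
    guard let pointOfView = sceneView.pointOfView else { return nil }
    // Passing `from: nil` treats the position as world-space coordinates.
    // In the camera node's space, -x is left of centre and +x is right.
    let local = pointOfView.convertPosition(target.worldPosition, from: nil)
    return local.x < 0 ? "left" : "right"
}
```

The same converted position also gives "ahead vs behind" (sign of z) and vertical offset (y), which is useful for building on-screen direction indicators toward an off-screen node.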
