augmented-reality

What is the real benefit of using Raycast in ARKit and RealityKit?

末鹿安然 submitted on 2020-01-23 14:56:27

Question: What is ray-casting in ARKit and RealityKit for? And when do I need to use the makeRaycastQuery instance method:

    func makeRaycastQuery(from point: CGPoint,
                          allowing target: ARRaycastQuery.Target,
                          alignment: ARRaycastQuery.TargetAlignment) -> ARRaycastQuery?

Any help appreciated.

Answer 1: Ray-casting, in the same way as hit-testing, helps find a 3D position on a real-world surface by projecting an imaginary ray from a screen point. I've found the following definition of ray-casting in Apple
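A minimal sketch of how that query method might be used in practice. This assumes an existing ARView named `arView` running a world-tracking session, and the helper function name `placeAnchor` is hypothetical:

```swift
import ARKit
import RealityKit

// Hypothetical helper: anchor content where the ray from a screen tap
// hits an estimated horizontal plane.
func placeAnchor(at screenPoint: CGPoint, in arView: ARView) {
    // Build a ray-cast query from the 2D screen point.
    guard let query = arView.makeRaycastQuery(from: screenPoint,
                                              allowing: .estimatedPlane,
                                              alignment: .horizontal) else { return }
    // Perform the ray-cast; results are ordered nearest-first.
    guard let result = arView.session.raycast(query).first else { return }
    // Anchor an entity at the hit's world transform.
    let anchor = AnchorEntity(world: result.worldTransform)
    arView.scene.addAnchor(anchor)
}
```

Unlike the older hit-testing API, the query object can also be reused with `trackedRaycast` so the result updates as ARKit refines its understanding of the surface.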

ARKit 3.0 – Replicating robot character in Motion Capture RealityKit

£可爱£侵袭症+ submitted on 2020-01-23 10:53:06

Question: I'm trying to make a 3D model, like the robot Apple provides in its Motion Capture example (shown at WWDC 2019), that can mimic me via ARKit 3.0 motion capture by replacing the robot character Apple supplies. Desired solution: Is there any special software Apple used to create the robot.usdz file? If yes, please provide details. How can we convert formats like .glb / .gltf / .obj / .dae to .usdz using Apple's Python-based tool without affecting the scene graph? How can we edit the

RealityKit – How to edit or add a Lighting?

拟墨画扇 submitted on 2020-01-23 03:51:23

Question: I'm trying to add lighting to my RealityKit AR scene, but I can't find a lighting option in Reality Composer. If there's a way to add a directional light, or to edit one, please tell me. I've tried the Apple documentation but can't understand how to add them.

Answer 1: At the moment you can't do it in Reality Composer; use RealityKit instead. You need to create a custom class that inherits from the Entity class and conforms to the HasPointLight protocol. Run this code in a macOS project to find out how a
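A sketch of what such a class could look like. The class name `Lighting` and the intensity and radius values are illustrative choices, not part of the original answer:

```swift
import RealityKit

// A custom point-light entity, since Reality Composer (at the time of the
// question) exposed no lighting controls of its own.
class Lighting: Entity, HasPointLight {
    required init() {
        super.init()
        // HasPointLight conformance gives this entity a `light` property.
        self.light = PointLightComponent(color: .white,
                                         intensity: 100_000,    // in lumens
                                         attenuationRadius: 10) // in meters
    }
}

// Usage: attach the light to an anchor above the scene content.
// Assumes `arView` is an existing ARView.
// let lightAnchor = AnchorEntity(world: [0, 1, 0])
// lightAnchor.addChild(Lighting())
// arView.scene.addAnchor(lightAnchor)
```

The same pattern works for the HasDirectionalLight and HasSpotLight protocols, swapping in DirectionalLightComponent or SpotLightComponent respectively.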

How to place a 3D model of type OBJ with ARKit?

风流意气都作罢 submitted on 2020-01-22 15:45:10

Question: I need some help with placing a 3D model using the new Apple ARKit. Is it possible to place an object of type OBJ? I'm trying to place a 3D model of a skull.

    // Load the OBJ file
    let bundle = Bundle.main
    guard let url = bundle.url(forResource: "Cranial", withExtension: "obj") else {
        fatalError("Failed to find model file")
    }
    let asset = MDLAsset(url: url)
    guard let object = asset.object(at: 0) as? MDLMesh else {
        fatalError("Failed to get mesh from asset")
    }
    let scene = SCNScene()
    let nodeCranial
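The excerpt above is cut off, but the general Model I/O approach it starts can be sketched as follows. The function name `loadCranialNode` and the scale factor are assumptions; "Cranial.obj" is the asset name from the question:

```swift
import SceneKit
import SceneKit.ModelIO  // bridges MDLObject into SceneKit
import ModelIO

// Load an OBJ with Model I/O and wrap it in an SCNNode, which can then be
// added to the scene of an ARSCNView.
func loadCranialNode() -> SCNNode? {
    guard let url = Bundle.main.url(forResource: "Cranial", withExtension: "obj") else {
        return nil // model file missing from the app bundle
    }
    let asset = MDLAsset(url: url)
    guard let mesh = asset.object(at: 0) as? MDLMesh else {
        return nil // first object in the asset is not a mesh
    }
    // SceneKit can build a node directly from a Model I/O object.
    let node = SCNNode(mdlObject: mesh)
    node.scale = SCNVector3(0.01, 0.01, 0.01) // OBJ units are often too large for AR
    return node
}
```

Scaling matters here: ARKit works in meters, so a model authored in centimeters or arbitrary units usually needs to be scaled down before placement.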

Augmented Reality App Store Process

假装没事ソ submitted on 2020-01-16 18:25:10

Question: We have a client who has asked us to build an augmented-reality iPhone app to be used at a show. The experience will be location-based, i.e. you will need to be at the show, where the markers will be placed, to use the app. Does anyone know the process Apple takes to test this kind of app? How will they be able to give approval if they can't test it without being on location and having access to the markers? Is it possible to submit the markers to them as files to print out and test? Any info is MUCH

How do I rotate an object around only one axis in RealityKit?

痴心易碎 submitted on 2020-01-16 06:22:58

Question: I'm trying to rotate a cube around its z-axis but can't find out how. Is there a way in RealityKit to do this?

Answer 1: In RealityKit there are at least two ways to rotate an object around a single axis.

First approach:

    let boxAnchor = try! Experience.loadBox()
    boxAnchor.steelBox?.orientation = simd_quatf(angle: .pi/4,   // 45 degrees in radians
                                                 axis: [0, 0, 1]) // around the Z axis

Second approach:

    boxAnchor.steelBox?.transform = Transform(pitch: 0, yaw: 0, roll: .pi/4) // around the Z axis

pitch is rotation around the X axis, yaw around the Y axis, and roll around the Z axis.
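The two approaches in the answer depend on a Reality Composer `Experience` file; a self-contained sketch of the same idea on a plain procedurally generated box might look like this:

```swift
import RealityKit

// Create a 10 cm box entity with no .rcproject dependency.
let box = ModelEntity(mesh: .generateBox(size: 0.1))

// 1. Quaternion form: 45 degrees around the Z axis.
box.orientation = simd_quatf(angle: .pi / 4, axis: [0, 0, 1])

// 2. Euler-angle form via Transform: roll is the Z-axis rotation.
box.transform = Transform(pitch: 0, yaw: 0, roll: .pi / 4)
```

Note that assigning `transform` replaces the whole transform (translation and scale included), whereas assigning `orientation` changes only the rotation, which is usually what you want for rotating an already-placed object.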

Android Studio “Invalid hash string” warning in Compile Sdk Version

吃可爱长大的小学妹 submitted on 2020-01-16 03:39:22

Question: I'm very new to Android app development and am trying to apply some changes to an existing project. When I try to change the Compile Sdk Version under Project Structure in Android Studio, it displays a red "Invalid hash string" warning next to the chosen SDK, "Vuzix Corporation:Vuzix M300 SDK:23", and doesn't apply the changes. I then realised the same also happens for "Google Inc.:Google APIs:23". It's an app built for the Vuzix M300 augmented-reality glasses, and I followed the