augmented-reality

ARKit – Viewport Size vs Real Screen Resolution

Submitted by 淺唱寂寞╮ on 2020-05-28 07:44:40
Question: I am writing an ARKit app that uses the ARSCNView hitTest function. The app also sends captured images to a server for analysis. I noticed that when I do let viewportSize = sceneView.snapshot().size and let viewSize = sceneView.bounds.size, the first one is twice as large as the second one. The questions are: 1. Why is there a difference? 2. Which "size" (i.e., which coordinates) is used in hitTest? Answer 1: Why is there a difference? Let's explore some important display characteristics of your iPhone 7: a…
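
A minimal sketch of how one might verify this, assuming sceneView is a live ARSCNView (compareSizes is an illustrative name): on a 2x device such as the iPhone 7, the snapshot comes back at the display's pixel resolution while bounds is measured in points, and hitTest(_:types:) expects a point in the view's point-based coordinate space.

```swift
import ARKit
import UIKit

// Hedged sketch: `sceneView` is assumed to be a configured, running ARSCNView.
func compareSizes(for sceneView: ARSCNView) {
    let viewSize = sceneView.bounds.size          // in points
    let snapshotSize = sceneView.snapshot().size  // twice viewSize on a 2x display,
                                                  // matching the question's observation
    let scale = UIScreen.main.scale               // 2.0 on iPhone 7
    print("view: \(viewSize), snapshot: \(snapshotSize), scale: \(scale)")

    // hitTest takes coordinates in the view's (point-based) bounds space:
    let center = CGPoint(x: viewSize.width / 2, y: viewSize.height / 2)
    let hits = sceneView.hitTest(center, types: .featurePoint)
    print("feature-point hits at view center: \(hits.count)")
}
```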

Where is the .camera AnchorEntity located?

Submitted by 佐手、 on 2020-05-28 04:27:45
Question: When adding a child to my AnchorEntity(.camera), it appears as if the child is spawning behind my camera (meaning I can only see the child when I turn around). I have also tried to add a mesh to my anchor directly, but unfortunately ARKit / RealityKit does not render the mesh when you are inside of it (which, because it's centered around the camera, is theoretically always the case). However, it could also be that it's always located behind the screen [where the user is] and I'm never…
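
A minimal sketch of the usual fix, assuming arView is a configured ARView (the function name is illustrative): the .camera anchor sits exactly at the camera's origin, and the camera looks down its own negative Z axis, so a child entity has to be offset along -Z to land in front of the lens rather than at (or behind) it.

```swift
import RealityKit

// Hedged sketch: push the child half a metre along the camera's -Z axis.
func attachBoxInFrontOfCamera(in arView: ARView) {
    let cameraAnchor = AnchorEntity(.camera)
    let box = ModelEntity(mesh: .generateBox(size: 0.1),
                          materials: [SimpleMaterial(color: .red, isMetallic: false)])
    box.position = [0, 0, -0.5]   // in front of the lens; [0, 0, 0] sits inside the camera
    cameraAnchor.addChild(box)
    arView.scene.addAnchor(cameraAnchor)
}
```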

How do I make an entity a physics entity in RealityKit?

Submitted by ﹥>﹥吖頭↗ on 2020-05-26 09:24:11
Question: I am not able to figure out how to make the "ball" entity a physics entity/body and apply a force to it. // I'm using UIKit for the user interface and RealityKit + the models made in Reality Composer for the augmented reality and code: import RealityKit import ARKit class ViewController: UIViewController { var ball: (Entity & HasPhysics)? { try? Entity.load(named: "golfball") as? Entity & HasPhysics } @IBOutlet var arView: ARView! // referencing the play-now button on the home screen…
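
A hedged sketch of one way to get a working physics body, under the assumption that Entity.load(named:) returns a plain Entity (so the as? Entity & HasPhysics cast fails): dig the ModelEntity out of the loaded hierarchy and attach collision and physics components to it explicitly. The inner entity name "golfball" is an assumption about the file's contents.

```swift
import RealityKit
import UIKit

// Hedged sketch: find the ModelEntity inside the loaded hierarchy and
// give it collision shapes plus a dynamic physics body.
func makeBallPhysical(in arView: ARView) {
    guard let root = try? Entity.load(named: "golfball"),
          let ball = root.findEntity(named: "golfball") as? ModelEntity else { return }

    ball.generateCollisionShapes(recursive: true)
    ball.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                            material: .default,
                                            mode: .dynamic)

    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(root)
    arView.scene.addAnchor(anchor)

    // Apply a force once the body is part of the running simulation:
    ball.applyLinearImpulse([0, 0.2, -0.5], relativeTo: nil)
}
```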

RealityKit – How to set a ModelEntity's transparency?

Submitted by 拜拜、爱过 on 2020-05-26 08:21:27
Question: In SceneKit there are lots of options, such as: use the alpha channel of UIColor via SCNMaterial.(diffuse|emission|ambient|...).contents; use SCNMaterial.transparency (a CGFloat from 0.0 to 1.0); use SCNMaterial.transparent (another SCNMaterialProperty); use SCNNode.opacity (a CGFloat from 0.0, fully transparent, to 1.0, fully opaque). I wonder if there is a way to set transparency/opacity/alpha for a ModelEntity in RealityKit? Answer 1: At the moment I see at least one solution in RealityKit allowing you…
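
A hedged sketch of one approach in newer RealityKit versions (iOS 15+), where PhysicallyBasedMaterial exposes a blending mode that carries an explicit opacity; earlier releases relied on the alpha of a material's tint color instead. setOpacity is an illustrative name, and rebuilding the material discards any existing texture.

```swift
import RealityKit
import UIKit

// Hedged sketch (RealityKit 2, iOS 15+): rebuild the entity's material
// with a transparent blending mode carrying the desired opacity.
func setOpacity(of entity: ModelEntity, to opacity: Float) {
    var material = PhysicallyBasedMaterial()
    material.baseColor = .init(tint: .white)
    material.blending = .transparent(opacity: .init(floatLiteral: opacity))
    entity.model?.materials = [material]
}
```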

What is the easiest way to convert .obj or .stl to .usdz?

Submitted by 北城以北 on 2020-05-26 04:59:41
Question: Apple AR Quick Look apparently only supports .usdz files. So, what is the easiest way to convert an .obj or .stl to .usdz? I googled this first, but the most popular result was to use a supposedly free tool called Vectary; when I actually tried to use it, it wasn't free. Thanks. Answer 1: With iOS 13 and RealityKit, Apple has released a USDZ converter that does not require Xcode; the converter is a Python-based tool. I have been using it over the last few days to convert various glTF and OBJ files…
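
Besides Apple's Python-based converter mentioned above, a hedged alternative sketch: SceneKit itself can export .usdz, and ModelIO reads both .obj and .stl, so a small Swift utility can do the conversion on-device. Function and path names are illustrative.

```swift
import SceneKit
import SceneKit.ModelIO
import ModelIO

// Hedged sketch: read an .obj (or .stl) with ModelIO, then let SceneKit
// write the scene out as .usdz (the file extension selects the format).
func convertToUSDZ(from inputURL: URL, to usdzURL: URL) -> Bool {
    let asset = MDLAsset(url: inputURL)     // ModelIO imports .obj/.stl
    let scene = SCNScene(mdlAsset: asset)
    return scene.write(to: usdzURL,
                       options: nil,
                       delegate: nil,
                       progressHandler: nil)
}
```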

SCNNode not showing in ARFrame's capturedImage

Submitted by 家住魔仙堡 on 2020-05-15 09:18:55
Question: I added an SCNNode to an ARSCNView: func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? { guard let faceAnchor = anchor as? ARFaceAnchor else { return nil } guard let device = sceneView.device else { return nil } guard let faceGeometry = ARSCNFaceGeometry(device: device, fillMesh: true) else { return nil } let faceNode = FaceNode(faceGeometry) // Node is a custom SCNNode class let glassesNode = Node(image: UIImage(named: "glasses")!, position: .glasses, anchor:…
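
For context, a hedged sketch of why this happens: ARFrame.capturedImage is the raw camera pixel buffer, captured before SceneKit composites any nodes on top of it, so virtual content can never appear in it. One way to grab the composited result instead is the view's snapshot:

```swift
import ARKit
import UIKit

// Hedged sketch: `capturedImage` is camera-only; `snapshot()` returns
// the rendered view, i.e. the camera feed plus all SCNNode content.
func compositedImage(of sceneView: ARSCNView) -> UIImage {
    return sceneView.snapshot()
}
```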

ARFaceTrackingConfiguration: How to distinguish pictures from real faces?

Submitted by 守給你的承諾、 on 2020-05-10 21:00:15
Question: We have several apps in the App Store that use ARFaceTrackingConfiguration to detect the user's face on iOS devices with Face ID cameras. As you might have seen, ARKit will also track pictures of faces that you put in front of your iPad Pro/iPhone X, as if they were real faces. E.g., take a picture from one of our apps (to replicate this, one can download and run Apple's example app for ARFaceTrackingConfiguration). Now I have noticed that internally ARKit treats real faces differently than it does pictures of faces.
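
A hedged heuristic sketch (not an Apple API; all names and thresholds are illustrative assumptions): a printed or on-screen photo is flat, so the TrueDepth values across the detected face should span far less depth than a real head. One could inspect ARFrame.capturedDepthData along these lines:

```swift
import ARKit

// Hedged heuristic: measure the depth range in the TrueDepth map.
// Assumes a Float32 depth format; a span of only a few millimetres
// across the face region suggests a flat picture rather than a head.
func faceDepthRange(in frame: ARFrame) -> Float? {
    guard let depthMap = frame.capturedDepthData?.depthDataMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

    var minD = Float.greatestFiniteMagnitude
    var maxD = -Float.greatestFiniteMagnitude
    for y in 0 ..< height {
        let row = (base + y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0 ..< width where row[x].isFinite {
            minD = Swift.min(minD, row[x])
            maxD = Swift.max(maxD, row[x])
        }
    }
    return maxD > minD ? maxD - minD : nil
}
```

In practice one would restrict the scan to the face's bounding region and calibrate the threshold per device; this only illustrates the idea.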
