augmented-reality

Sectioning or cutting my 3D Object Horizontally (3D cross section) with ARKit or else in Swift

Submitted by …衆ロ難τιáo~ on 2020-08-10 17:43:01
Question: It's my first time working with augmented reality and ARKit, and I want to know if I am missing something; I've been looking for a solution for a long time. What I would like to do is section my 3D house horizontally, so I can view the house floor by floor, or clip it at a given Y height, as shown in these two images from the CAD Exchanger software. I've seen some applications that offer this tool, but I couldn't find a way to do it. (PS: my 3D house model was made in AutoCAD.) Is there…
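
One possible approach (a sketch, not from the question itself): in SceneKit you can fake a horizontal cross-section by discarding every fragment above a chosen world-space Y in a fragment shader modifier, leaving the geometry untouched. The names `houseNode` and `applyCrossSection` below are hypothetical, and this assumes `scn_frame.inverseViewTransform` is available in the modifier (it is part of SceneKit's `SCNSceneBuffer` uniforms):

```swift
import SceneKit

// Fragment shader modifier: convert the view-space surface position back to
// world space and discard everything above `cutHeight`.
let crossSectionModifier = """
#pragma arguments
float cutHeight;
#pragma body
float4 worldPos = scn_frame.inverseViewTransform * float4(_surface.position, 1.0);
if (worldPos.y > cutHeight) {
    discard_fragment();
}
"""

func applyCrossSection(to houseNode: SCNNode, cutHeight: Float) {
    houseNode.enumerateHierarchy { node, _ in
        guard let material = node.geometry?.firstMaterial else { return }
        material.shaderModifiers = [.fragment: crossSectionModifier]
        // `cutHeight` is bound to the #pragma arguments uniform via KVC.
        material.setValue(NSNumber(value: cutHeight), forKey: "cutHeight")
        // Render back faces too, so the hollowed interior stays visible at the cut.
        material.isDoubleSided = true
    }
}
```

Driving `cutHeight` from a slider then animates the section plane up and down the house. Note this only hides pixels; it does not generate capped cut surfaces the way CAD Exchanger does.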

RealityKit – Load another Scene from the same Reality Composer project

Submitted by 眉间皱痕 on 2020-08-03 08:14:26
Question: I created an augmented reality project using Xcode's template. Xcode creates a file called Experience.rcproject. This project contains a scene called Box and a cube called Steel Cube. I added 3 more scenes to Experience.rcproject, called alpha, bravo and delta. When I run the project, Xcode runs these two lines:
// Load the "Box" scene from the "Experience" Reality File
let boxAnchor = try! Experience.loadBoxX(namedFile: "Ground")
// Add the box anchor to the scene
arView.scene.anchors.append…
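
For reference, Xcode's code generation for an .rcproject typically produces one loader method per scene, named after that scene. A minimal sketch, assuming scenes named alpha, bravo and delta exist in Experience.rcproject (the exact generated method names depend on your scene names):

```swift
import RealityKit

// Swap the currently displayed Reality Composer scene for another one
// from the same Experience.rcproject.
func showAlphaScene(in arView: ARView) {
    // Remove whatever scene is currently anchored.
    arView.scene.anchors.removeAll()
    do {
        // Generated from the scene named "alpha" in Experience.rcproject.
        let alphaAnchor = try Experience.loadAlpha()
        arView.scene.anchors.append(alphaAnchor)
    } catch {
        print("Failed to load scene:", error)
    }
}
```

Using `try` with a `do`/`catch` instead of the template's `try!` avoids a crash if the scene fails to load.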

Swift: Get the TruthDepth camera parameters for face tracking in ARKit

Submitted by 生来就可爱ヽ(ⅴ<●) on 2020-07-27 03:31:34
Question: My goal: I am trying to get the TruthDepth camera parameters (such as the intrinsics, extrinsics, lens distortion, etc.) while doing face tracking. I read that there are examples and that this is possible with OpenCV, and I am wondering how one would achieve similar goals in Swift. What I have read and tried: I read the Apple documentation about ARCamera.intrinsics and AVCameraCalibrationData's extrinsicMatrix and intrinsicMatrix. However, all I found was just the…
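
As a sketch of where these values surface during face tracking: ARKit exposes the per-frame intrinsics directly on `ARFrame.camera`, and the TrueDepth frames carry an `AVCameraCalibrationData` (with extrinsics and the distortion lookup table) on the captured depth data. This assumes the code runs inside an `ARSessionDelegate` with an `ARFaceTrackingConfiguration` session:

```swift
import ARKit

// ARSessionDelegate callback: every frame carries the camera intrinsics.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let intrinsics = frame.camera.intrinsics           // simd_float3x3, column-major
    let fx = intrinsics[0][0], fy = intrinsics[1][1]   // focal lengths in pixels
    let cx = intrinsics[2][0], cy = intrinsics[2][1]   // principal point in pixels
    print("fx: \(fx), fy: \(fy), cx: \(cx), cy: \(cy)")

    // Lens distortion and the extrinsic matrix live on AVCameraCalibrationData,
    // which ARKit attaches to the TrueDepth depth frames rather than to ARCamera.
    if let calibration = frame.capturedDepthData?.cameraCalibrationData {
        print(calibration.extrinsicMatrix)
        print(calibration.lensDistortionLookupTable as Any)
    }
}
```

Note that `capturedDepthData` is only populated on frames where depth was captured, so the calibration block will not fire on every callback.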

SparkAR - can't change texture of material in code?

Submitted by 一世执手 on 2020-07-23 06:41:43
Question: I've made sure to find all objects, materials and textures in the Promise.all of my script, given that they take time to load. I then assign my textures to my materials, and no errors are thrown; however, the materials do not change. I can't find anything wrong with my code:
Promise.all([ // These take time to acquire…
Scene.root.findFirst('ipad-perfect'),
Scene.root.findFirst('iphone-perfect'),
Scene.root.findFirst('computer-perfect'),
Materials.findFirst('bg'),
Materials.findFirst(…

Google Sceneform – Is it deprecated? Any replacement?

Submitted by 不羁岁月 on 2020-07-17 10:28:26
Question: I use Sceneform in my ARCore project. It seems that this project is now marked as Archived by Google; more info can be found here or on this page. I don't understand whether Google has really abandoned this SDK, or whether it is, or will be, integrated directly into the ARCore SDK. Thanks for any information concerning the future of this SDK and potential replacements. Answer 1: About the 3 latest versions: at the moment there are three recent versions of Sceneform: Sceneform 1.17 (at the moment working with artifacts)…

FPS drop when adding child to a scene ARKit/SceneKit

Submitted by 偶尔善良 on 2020-07-17 07:40:06
Question: I've been working on an ARKit project for 4 months now. I noticed that when I add a child to my scene's rootNode there is an FPS drop: the device freezes for less than a second. I did a lot of research and trials and noticed that all of Apple's code examples have this FPS drop too when placing an object. It does not matter whether the node is added directly (scene.rootNode.addChildNode(child)) or whether it's added in the renderer loop at different phases (didUpdateAtTime, didApplyAnimations, etc.). I found that once…
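
One common mitigation for this symptom (a sketch, not the asker's own solution): the stall usually comes from geometry, texture and shader resources being compiled and uploaded to the GPU the first time the node is rendered. SceneKit's `prepare(_:completionHandler:)` performs that work on a background queue before the node enters the scene. This assumes `sceneView` is an `ARSCNView` and the node has already been built:

```swift
import ARKit
import SceneKit

// Pre-load the node's geometry, textures and shaders off the render loop,
// then add it only once its GPU resources are ready.
func place(_ node: SCNNode, in sceneView: ARSCNView) {
    sceneView.prepare([node]) { success in
        guard success else { return }
        sceneView.scene.rootNode.addChildNode(node)
    }
}
```

The first placement of a given asset may still hitch slightly if shaders have never been compiled on the device, but preparing ahead of time typically removes the visible freeze on subsequent additions.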