arkit

What is the real focal length of the camera used in RealityKit?

岁酱吖の submitted on 2020-08-19 07:20:24
Question: I am doing an Augmented Reality project, starting from Xcode's default AR project, and I need to know the focal length of the camera used by ARKit. This page defines focal length well: "Focal length, usually represented in millimeters (mm), is the basic description of a photographic lens. It is not a measurement of the actual length of a lens, but a calculation of an optical distance from the point where light rays converge to form a sharp image of an object to the digital sensor or 35mm film at…"
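
For reference, ARKit exposes the focal length in pixels (not millimeters) through the camera's intrinsic matrix; converting to mm would require the physical sensor size, which ARKit does not expose. A minimal sketch (the helper function name is mine):

```swift
import ARKit

// Reads the focal length from the current ARFrame's camera intrinsics.
// The intrinsic matrix stores fx and fy on its diagonal, in pixels.
func printFocalLength(for session: ARSession) {
    guard let camera = session.currentFrame?.camera else { return }
    let intrinsics = camera.intrinsics      // simd_float3x3, column-major
    let fx = intrinsics.columns.0.x         // focal length along x, in pixels
    let fy = intrinsics.columns.1.y         // focal length along y, in pixels
    print("Focal length: fx = \(fx) px, fy = \(fy) px")
}
```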

Sectioning or cutting my 3D object horizontally (3D cross section) with ARKit or otherwise in Swift

人盡茶涼 submitted on 2020-08-10 17:44:07
Question: It's my first time working with Augmented Reality and ARKit, and I want to know if I am missing something; I have been looking for a solution for a long time. What I would like to do is section my 3D house horizontally, so I can see the house floor by floor, or cut it at a given Y height, like what is shown in these two images from the CAD Exchanger software. I have seen some applications using this tool, but I couldn't find a way to do it. (PS: I have a 3D house modeled in AutoCAD.) Is there…
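
One common approach (a sketch of a general SceneKit technique, not something taken from the question) is a shader modifier that discards every fragment above a world-space Y threshold, which visually cuts the model without changing its geometry. Here `houseNode` is a hypothetical node holding the imported model:

```swift
// A surface shader modifier (Metal) that hides everything above `clipHeight`.
// _surface.position is in view space, so we convert it back to world space.
let clipModifier = """
#pragma arguments
float clipHeight;
#pragma body
float4 worldPos = scn_frame.inverseViewTransform * float4(_surface.position, 1.0);
if (worldPos.y > clipHeight) {
    discard_fragment();
}
"""

houseNode.geometry?.shaderModifiers = [.surface: clipModifier]
// Raise or lower the cut plane at runtime (here: hide everything above 1.5 m).
houseNode.geometry?.setValue(1.5, forKey: "clipHeight")
```

Because the geometry itself is untouched, the cut is purely visual; a true solid section with capped faces would need CSG-style mesh processing.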

RealityKit – Load another Scene from the same Reality Composer project

眉间皱痕 submitted on 2020-08-03 08:14:26
Question: I created an Augmented Reality project using Xcode's template. Xcode creates a file called Experience.rcproject. This project contains a scene called Box and a cube called Steel Cube. I added 3 more scenes to Experience.rcproject, called alpha, bravo and delta. When I run the project, Xcode runs these two lines:

```swift
// Load the "Box" scene from the "Experience" Reality File
let boxAnchor = try! Experience.loadBoxX(namedFile: "Ground")

// Add the box anchor to the scene
arView.scene.anchors.append…
```
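
For context, Reality Composer's generated Experience code normally exposes one loader per scene, named after the scene, so additional scenes are loaded the same way the template loads Box. A sketch under that assumption:

```swift
// Each scene in Experience.rcproject gets its own generated loader,
// named after the scene (e.g. a scene called "alpha" -> loadAlpha()).
let alphaAnchor = try! Experience.loadAlpha()
arView.scene.anchors.append(alphaAnchor)

// To switch scenes, remove the previous anchor first:
arView.scene.anchors.removeAll()
let bravoAnchor = try! Experience.loadBravo()
arView.scene.anchors.append(bravoAnchor)
```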

Swift: Get the TrueDepth camera parameters for face tracking in ARKit

生来就可爱ヽ(ⅴ<●) submitted on 2020-07-27 03:31:34
Question: My goal: I am trying to get the TrueDepth camera parameters (such as the intrinsics, extrinsics, lens distortion, etc.) while doing face tracking. I have read that there are examples of doing this with OpenCV, and I am wondering how one would achieve similar goals in Swift. What I have read and tried: I read the Apple documentation about ARCamera.intrinsics and AVCameraCalibrationData's extrinsicMatrix and intrinsicMatrix. However, all I found was just the…
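
A sketch of where those values surface during face tracking, inside your ARSessionDelegate (the print statements are purely illustrative):

```swift
import ARKit

// During face tracking, each ARFrame carries the color-camera intrinsics,
// and, when depth is delivered, full calibration data for the TrueDepth camera.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Color-camera intrinsics: 3x3, focal length and principal point in pixels.
    let intrinsics = frame.camera.intrinsics
    print("Camera intrinsics:", intrinsics)

    // Depth-camera calibration, including extrinsics and lens distortion.
    if let calibration = frame.capturedDepthData?.cameraCalibrationData {
        print("Depth intrinsics:", calibration.intrinsicMatrix)
        print("Extrinsics:", calibration.extrinsicMatrix)
        print("Distortion center:", calibration.lensDistortionCenter)
    }
}
```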

FPS drop when adding a child to a scene (ARKit/SceneKit)

偶尔善良 submitted on 2020-07-17 07:40:06
Question: I have been working on an ARKit project for 4 months now. I noticed that when I add a child to my scene's rootNode, there is an FPS drop; the device freezes for less than a second. I did a lot of research and trials, and noticed that all of Apple's code examples have this FPS drop too when placing an object. It does not matter whether the node is added directly (scene.rootNode.addChildNode(child)) or added in the renderer loop at different phases (didUpdateAtTime, didApplyAnimations, etc.). I found that once…
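
A common mitigation for this exact stall (a sketch, not necessarily the fix the asker settled on) is to pre-load the node's GPU resources with SCNSceneRenderer.prepare before adding it; `sceneView` (an ARSCNView) and `node` are assumed from the question's setup:

```swift
// prepare(_:completionHandler:) compiles shaders and uploads textures/geometry
// on a background thread, so the later addChildNode call no longer stalls a frame.
sceneView.prepare([node]) { success in
    guard success else { return }
    DispatchQueue.main.async {
        sceneView.scene.rootNode.addChildNode(node)
    }
}
```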

Swift: Access ARKit camera parameters and save them [closed]

喜夏-厌秋 submitted on 2020-07-16 10:44:14
Question (closed: needs to be more focused; not accepting answers): I am currently working on ARKit face tracking, projecting a 3D mesh onto the face using the TrueDepth camera, based on Face Tracking with ARKit and Tracking and Visualizing Faces. I also read the Apple documentation about the intrinsic matrix. And I…
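
One way to capture and persist the per-frame intrinsics (the CameraParameters wrapper and file handling here are mine, purely illustrative):

```swift
import ARKit

// Hypothetical Codable wrapper for the values we want to persist.
struct CameraParameters: Codable {
    let intrinsics: [Float]       // 3x3 intrinsic matrix, flattened column-major
    let timestamp: TimeInterval
}

// Encodes the current frame's intrinsics as JSON and writes them to disk.
func save(frame: ARFrame, to url: URL) throws {
    let m = frame.camera.intrinsics
    let flat = [m.columns.0, m.columns.1, m.columns.2].flatMap { [$0.x, $0.y, $0.z] }
    let params = CameraParameters(intrinsics: flat, timestamp: frame.timestamp)
    let data = try JSONEncoder().encode(params)
    try data.write(to: url)
}
```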

RealityKit – How to add a Video Material to a ModelEntity?

蓝咒 submitted on 2020-07-15 09:05:50
Question: I use the following code to add a picture texture in RealityKit, and it works fine:

```swift
var material = SimpleMaterial()
material.baseColor = try! .texture(.load(named: "image.jpg"))
```

I tried to use this code to load a video file as a texture, but it crashes:

```swift
guard let url = Bundle.main.url(forResource: "data", withExtension: "mp4") else { return }
material.baseColor = try! .texture(.load(contentsOf: url))
```

How can I add a video file?

Answer 1: You can use video textures only in RealityKit 2.0 (Xcode 12 with iOS…
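
A sketch of the RealityKit 2.0 approach the answer refers to, using VideoMaterial (available from iOS 14); the mesh size and function name are mine:

```swift
import RealityKit
import AVFoundation

// Builds a plane entity that plays "data.mp4" as its surface material.
func makeVideoEntity() -> ModelEntity? {
    guard let url = Bundle.main.url(forResource: "data", withExtension: "mp4") else { return nil }
    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)   // RealityKit 2.0+ (iOS 14)
    let entity = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                             materials: [material])
    player.play()                                    // the material shows the video as it plays
    return entity
}
```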