realitykit

What is the real Focal Length of the camera used in RealityKit?

Submitted by 岁酱吖の on 2020-08-19 07:20:24
Question: I am doing an Augmented Reality project starting from Xcode's default AR project, and I need to know the focal length of the camera used by ARKit. This page defines focal length well: "Focal length, usually represented in millimeters (mm), is the basic description of a photographic lens. It is not a measurement of the actual length of a lens, but a calculation of an optical distance from the point where light rays converge to form a sharp image of an object to the digital sensor or 35mm film at…"
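
For reference, ARKit reports the pinhole-camera focal length in pixels through ARCamera.intrinsics; it does not expose the physical sensor size, so a value in millimeters cannot be read directly. A minimal sketch (the helper name is mine) of reading the reported values:

```swift
import ARKit

// A sketch: ARCamera.intrinsics is a 3x3 matrix whose first two diagonal entries
// are the focal length in pixels. ARKit does not expose the sensor's physical
// width, so converting to millimeters would need a sensor size you supply yourself.
func printFocalLength(from frame: ARFrame) {
    let intrinsics = frame.camera.intrinsics
    let fxPixels = intrinsics[0][0]              // focal length along x, in pixels
    let fyPixels = intrinsics[1][1]              // focal length along y, in pixels
    let resolution = frame.camera.imageResolution
    print("fx: \(fxPixels) px, fy: \(fyPixels) px, image: \(resolution)")
}

// Usage (e.g. from ARSessionDelegate or a timer):
// if let frame = arView.session.currentFrame { printFocalLength(from: frame) }
```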

RealityKit – Load another Scene from the same Reality Composer project

Submitted by 眉间皱痕 on 2020-08-03 08:14:26
Question: I created an Augmented Reality project using Xcode's template. Xcode creates a file called Experience.rcproject. This project contains a scene called Box and a cube called Steel Cube. I added three more scenes to Experience.rcproject, called alpha, bravo and delta. When I run the project, Xcode runs these two lines: // Load the "Box" scene from the "Experience" Reality File let boxAnchor = try! Experience.loadBoxX(namedFile: "Ground") // Add the box anchor to the scene arView.scene.anchors.append…
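
For context, the code Xcode generates from an .rcproject exposes one loader per scene, named after the scene. A sketch of switching to another scene, assuming the scene called "alpha" produces a generated loader named Experience.loadAlpha():

```swift
import RealityKit

// A sketch of loading one of the additional scenes. Xcode generates a loader per
// scene in Experience.rcproject; for a scene called "alpha" the generated method
// is assumed to be Experience.loadAlpha().
func showAlphaScene(in arView: ARView) {
    do {
        let alphaAnchor = try Experience.loadAlpha()
        arView.scene.anchors.removeAll()          // optionally clear the current scene first
        arView.scene.anchors.append(alphaAnchor)  // add the new scene's anchor
    } catch {
        print("Failed to load scene:", error)
    }
}
```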

RealityKit – How to add a Video Material to a ModelEntity?

Submitted by 蓝咒 on 2020-07-15 09:05:50
Question: I use this code to add an image texture in RealityKit and it works fine: var material = SimpleMaterial() material.baseColor = try! .texture(.load(named: "image.jpg")) I tried to use this code to add a video file as a texture, but it crashes: guard let url = Bundle.main.url(forResource: "data", withExtension: "mp4") else { return } material.baseColor = try! .texture(.load(contentsOf: url)) How can I add a video file? Answer 1: You can use video textures only in RealityKit 2.0 (Xcode 12 with iOS…
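
A sketch of the RealityKit 2.0 approach the answer refers to, using VideoMaterial backed by an AVPlayer (the file name comes from the question; the helper and plane mesh are mine):

```swift
import RealityKit
import AVFoundation

// A sketch, assuming RealityKit 2.0 (Xcode 12, iOS 14+): video is applied through
// VideoMaterial rather than through SimpleMaterial's baseColor texture.
func makeVideoPlane() -> ModelEntity? {
    guard let url = Bundle.main.url(forResource: "data", withExtension: "mp4") else { return nil }
    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)    // RealityKit 2.0 API
    let plane = ModelEntity(mesh: .generatePlane(width: 1.0, depth: 0.6),
                            materials: [material])
    player.play()                                     // start playback
    return plane
}
```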

How do I access the model component of Reality Composer in RealityKit?

Submitted by 孤人 on 2020-07-09 07:58:08
Question: I'm trying to change the model component of a text entity created in Reality Composer in my code, but this as! cast of the GUI-created entity to a reference with a model component fails: self.entityReference = scene.realityComposerEntity as! HasModel textEntity.model!.mesh = MeshResource.generateText("New Text") The text entity in RealityKit should have a model property, since it has a visual appearance in the ARView, but I don't know how to access it. Does anyone have any idea how?
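
One hedged workaround, assuming the visible geometry lives on a descendant ModelEntity of the Reality Composer wrapper entity rather than on the wrapper itself, is to search the hierarchy for a ModelEntity instead of force-casting:

```swift
import RealityKit

// A sketch: the entity exposed by the generated Reality Composer code is often a
// plain wrapper Entity, while the visible geometry lives on a descendant
// ModelEntity (an assumption about the scene hierarchy), so walk the children
// rather than force-casting the wrapper with as!.
func firstModelEntity(in entity: Entity) -> ModelEntity? {
    if let model = entity as? ModelEntity { return model }
    for child in entity.children {
        if let found = firstModelEntity(in: child) { return found }
    }
    return nil
}

// Usage (textEntity is the entity loaded from the Reality Composer scene):
// if let textModel = firstModelEntity(in: textEntity) {
//     textModel.model?.mesh = .generateText("New Text")
// }
```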

Extract Reality Composer scene for ARQuickLook

Submitted by ↘锁芯ラ on 2020-07-08 20:43:26
Question: I have a Reality Composer scene and I want to extract it as a .usdz file, or any file that can be used with AR Quick Look. Is that possible? Answer 1: From Apple's Creating 3D Content with Reality Composer document: You can also save your composition to a .reality file for use as a lightweight AR Quick Look experience in your app or on the web. This allows users to place and preview content in the real world to get a quick sense of what it’s like. To create a Reality file, choose File > Export > Export…
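
Once the .reality file is exported and added to the app bundle, it can be shown with AR Quick Look through QLPreviewController. A minimal sketch, assuming the exported file is bundled as "Scene.reality":

```swift
import UIKit
import QuickLook

// A sketch of previewing an exported .reality file with AR Quick Look.
// The bundled file name "Scene.reality" is an assumption.
class RealityPreviewController: UIViewController, QLPreviewControllerDataSource {
    func presentQuickLook() {
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // .reality (and .usdz) files are valid AR Quick Look preview items.
        let url = Bundle.main.url(forResource: "Scene", withExtension: "reality")!
        return url as NSURL   // NSURL conforms to QLPreviewItem
    }
}
```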

HitTest prints AR Entity name even when I am not tapping on it

Submitted by 一笑奈何 on 2020-07-06 20:21:45
Question: My Experience.rcproject has animations that can be triggered by a tap action. Two cylinders are named “Button 1” and “Button 2” and have Collide turned on. I am using the async method to load the Experience.Map scene and the addAnchor method to add mapAnchor to the ARView in a ViewController. I tried to run a HitTest on the scene to see whether the app reacts properly. Nonetheless, the HitTest result prints the entity name of a button even when I am not tapping on it but on an area near it. class augmentedReality:…
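
A sketch of a tap handler that only fires when the tap point actually resolves to an entity (the arView property and ViewController class are assumed from the question's setup). If nearby taps still trigger, the usual culprit is an auto-generated collision shape that is a bounding box larger than the visible cylinder:

```swift
import UIKit
import RealityKit

// A sketch of a tap handler that only reacts when the tap lands on an entity's
// collision shape. `arView` and `ViewController` are assumed from the question.
// If taps near a button still trigger, regenerating tighter collision shapes from
// the meshes (e.g. mapAnchor.generateCollisionShapes(recursive: true)) may help,
// because the default shapes are often bounding boxes larger than the geometry.
extension ViewController {
    @objc func handleTap(_ sender: UITapGestureRecognizer) {
        let point = sender.location(in: arView)
        if let tapped = arView.entity(at: point) {
            print("Tapped entity:", tapped.name)
        } else {
            print("Tap did not hit any entity")
        }
    }
}
```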

How to detect the 2D images using ARKit and RealityKit

Submitted by 爷,独闯天下 on 2020-07-03 10:08:55
Question: I want to detect 2D images using ARKit and RealityKit. I don't want to use SceneKit because many of my implementations are based on RealityKit, and I couldn't find any examples of detecting images with RealityKit. I referred to Apple's sample code at https://developer.apple.com/documentation/arkit/detecting_images_in_an_ar_experience, but it uses SceneKit and ARSCNViewDelegate: let arConfiguration = ARWorldTrackingConfiguration() arConfiguration.planeDetection = [.vertical, .horizontal] arConfiguration…
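
A sketch of the same image detection without SceneKit: keep ARKit's configuration and ARSessionDelegate, and attach RealityKit content through an AnchorEntity when an ARImageAnchor arrives. The reference-image group name "AR Resources" and the outlet name arView are assumptions:

```swift
import UIKit
import ARKit
import RealityKit

// A sketch of detecting 2D images with ARKit + RealityKit: run a world-tracking
// configuration with detection images, listen for new ARImageAnchor instances in
// ARSessionDelegate, and anchor RealityKit content to them.
class ImageDetectionViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.session.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.vertical, .horizontal]
        if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                                  bundle: nil) {
            configuration.detectionImages = referenceImages
        }
        arView.session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let imageAnchor as ARImageAnchor in anchors {
            // Place a small box on every newly detected image.
            let anchorEntity = AnchorEntity(anchor: imageAnchor)
            let box = ModelEntity(mesh: .generateBox(size: 0.05),
                                  materials: [SimpleMaterial(color: .red, isMetallic: false)])
            anchorEntity.addChild(box)
            arView.scene.addAnchor(anchorEntity)
        }
    }
}
```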

SceneKit AR game fps getting low and the device getting hot with use

Submitted by 三世轮回 on 2020-06-27 16:16:42
Question: I'm developing a 3D game with ARKit and SceneKit. The game runs smoothly at 60 fps, but when I keep using it for a while the device gets hot and the frame rate drops to 30 fps. The energy impact is very high, and I noticed something in Instruments. I'll show the statistics and what I see in Instruments. These are the statistics when the game is running smoothly, but I don't understand why I get 1.16K as the node count. I don't actually use that many nodes; it's just a simple level. This is…
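
Two common mitigations worth trying, sketched below under the assumption of an ARSCNView named sceneView: cap the frame rate to reduce thermal load, and flatten static sub-trees so SceneKit batches them into fewer draw calls (which also lowers the node count shown in the statistics overlay):

```swift
import ARKit
import SceneKit

// A sketch of two common mitigations for heat and fps drops in a SceneKit AR game.
// The sceneView and staticRoot parameters are assumptions about the app's setup.
func tunePerformance(of sceneView: ARSCNView, staticRoot: SCNNode) {
    sceneView.preferredFramesPerSecond = 30   // lower fps -> less heat and energy use
    sceneView.showsStatistics = true          // on-screen fps / draw-call counter

    // Replace a static branch of the scene graph with a single flattened node.
    let flattened = staticRoot.flattenedClone()
    staticRoot.parent?.addChildNode(flattened)
    staticRoot.removeFromParentNode()
}
```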
