augmented-reality

Swift: Access to ARKit camera parameters and save them [closed]

喜夏-厌秋 submitted on 2020-07-16 10:44:14
Question: I am currently working on ARKit face tracking, projecting a 3D mesh onto the face with the TrueDepth camera, based on Face Tracking with ARKit and Tracking and Visualizing Faces. I have also read Apple's documentation on the intrinsic matrix. And I …
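The camera parameters the question asks about are exposed on each ARFrame. A minimal sketch of reading the intrinsic matrix and image resolution from the session delegate (the class name is an illustration, not from the question):

```swift
import ARKit

final class CameraParameterLogger: NSObject, ARSessionDelegate {
    // Called once per frame; `frame.camera` carries the calibration data.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let intrinsics: simd_float3x3 = frame.camera.intrinsics
        let resolution = frame.camera.imageResolution

        // Column-major layout: fx, fy are focal lengths in pixels,
        // (cx, cy) is the principal point.
        let fx = intrinsics.columns.0.x
        let fy = intrinsics.columns.1.y
        let cx = intrinsics.columns.2.x
        let cy = intrinsics.columns.2.y
        print("fx=\(fx) fy=\(fy) cx=\(cx) cy=\(cy) size=\(resolution)")
    }
}
```

To save the values, serialize them (e.g. append each frame's numbers to a file) inside this callback; the intrinsics can change per frame when the zoom or format changes.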

RealityKit – How to add a Video Material to a ModelEntity?

蓝咒 submitted on 2020-07-15 09:05:50
Question: I use this code to add a picture texture in RealityKit, and it works fine:

    var material = SimpleMaterial()
    material.baseColor = try! .texture(.load(named: "image.jpg"))

When I try the same approach to use a video file as a texture, it crashes:

    guard let url = Bundle.main.url(forResource: "data", withExtension: "mp4") else { return }
    material.baseColor = try! .texture(.load(contentsOf: url))

How can I add a video file?

Answer 1: You can use video textures only in RealityKit 2.0 (Xcode 12 with iOS 14). …
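In RealityKit 2.0 the supported route is VideoMaterial backed by an AVPlayer, rather than loading the movie as a TextureResource. A sketch, reusing the question's "data.mp4" resource name:

```swift
import RealityKit
import AVFoundation

// Build a plane whose surface plays a bundled video.
// Returns nil if the video file is missing from the bundle.
func makeVideoModel() -> ModelEntity? {
    guard let url = Bundle.main.url(forResource: "data", withExtension: "mp4") else {
        return nil
    }
    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)   // RealityKit 2.0+

    let model = ModelEntity(mesh: .generatePlane(width: 1.0, height: 0.5))
    model.model?.materials = [material]

    player.play()   // playback is driven by the AVPlayer, not the entity
    return model
}
```

Attach the returned entity to an anchor in the ARView as usual; pausing or seeking the AVPlayer controls the texture.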

Projecting the ARKit face tracking 3D mesh to 2D image coordinates

时光怂恿深爱的人放手 submitted on 2020-07-15 08:42:26
Question: I am collecting face mesh 3D vertices using ARKit. I have read Mapping image onto 3D face mesh and Tracking and Visualizing Faces. I have the following struct:

    struct CaptureData {
        var vertices: [SIMD3<Float>]
        var verticesformatted: String {
            let verticesDescribed = vertices.map({ "\($0.x):\($0.y):\($0.z)" }).joined(separator: "~")
            return "<\(verticesDescribed)>"
        }
    }

I have a Start button to capture vertices:

    @IBAction private func startPressed() {
        captureData = [] // Clear data
        …
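The projection the title asks for has two steps: the face-mesh vertices live in the face anchor's local space, so each one must first be moved into world space via the anchor's transform, then pushed through the camera with ARCamera's projectPoint. A sketch, where `faceAnchor`, `camera`, and `viewportSize` are assumed to come from the current ARFrame/ARView:

```swift
import ARKit

// Convert one face-mesh vertex (face-anchor local space) to 2D
// viewport coordinates.
func projectVertex(_ vertex: SIMD3<Float>,
                   faceAnchor: ARFaceAnchor,
                   camera: ARCamera,
                   viewportSize: CGSize) -> CGPoint {
    // Homogeneous coordinate, then local -> world.
    let local = SIMD4<Float>(vertex.x, vertex.y, vertex.z, 1)
    let world = faceAnchor.transform * local

    // World -> 2D viewport (screen) point for the given orientation.
    return camera.projectPoint(SIMD3<Float>(world.x, world.y, world.z),
                               orientation: .portrait,
                               viewportSize: viewportSize)
}
```

Pass the real interface orientation and the ARView's bounds as `viewportSize`; a mismatch there is a common source of offset 2D points.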
SCNNode is not appearing in correct position

别来无恙 submitted on 2020-07-10 05:59:08
Question: I downloaded the wall2.obj file from Google Blocks and then used Blender to convert it to .dae, but the wall node is not appearing where it should be. Where the wall node is placed vs. where it should be placed:

Answer 1: It's a pivot-point issue with the model. In Blender, move the pivot point to the desired position: first to the center of your model, then to its bottom. If your model consists of several independent parts, group all the parts and use a single composite pivot point.
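If re-exporting from Blender is inconvenient, the pivot can also be recentered in code on the SceneKit side. A sketch that shifts a node's pivot to the center of its bounding box (shift the y term to `min.y` instead if you want the pivot at the model's bottom):

```swift
import SceneKit

// Move the node's pivot to the center of its bounding box so the
// node's position refers to its visual center, not the exporter's origin.
func centerPivot(of node: SCNNode) {
    let (min, max) = node.boundingBox
    let center = SCNVector3((min.x + max.x) / 2,
                            (min.y + max.y) / 2,
                            (min.z + max.z) / 2)
    node.pivot = SCNMatrix4MakeTranslation(center.x, center.y, center.z)
}
```

This only changes where the node's local origin sits; the geometry itself is untouched, so it works for any imported .dae/.obj node.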
How do I access the model component of Reality Composer in RealityKit?

孤人 submitted on 2020-07-09 07:58:08
Question: I'm trying to change the model component of a text entity created in Reality Composer, but this as! cast of the GUI-created entity to an entity with a model component fails:

    self.entityReference = scene.realityComposerEntity as! HasModel
    textEntity.model!.mesh = MeshResource.generateText("New Text")

The text entity in RealityKit should have a model property, since it has a visual appearance in the ARView, but I don't know how to access it. Does anyone have any idea how?
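One likely cause of the failed cast is that Reality Composer nests the visible ModelEntity a level below the named entity, so the named entity itself carries no ModelComponent. A sketch that searches the hierarchy for the first descendant that is a ModelEntity before mutating its mesh (the entity name "TextEntity" is an assumption, not from the question):

```swift
import RealityKit

// Depth-first search for the first ModelEntity under `root`,
// including `root` itself.
func firstModelEntity(in root: Entity) -> ModelEntity? {
    if let model = root as? ModelEntity { return model }
    for child in root.children {
        if let found = firstModelEntity(in: child) { return found }
    }
    return nil
}

// Usage, assuming `scene` is the loaded Reality Composer scene:
// if let named = scene.findEntity(named: "TextEntity"),
//    let model = firstModelEntity(in: named) {
//     model.model?.mesh = MeshResource.generateText("New Text")
// }
```

Using a conditional cast (`as?`) instead of `as!` also avoids the crash when the hierarchy is not what you expect.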

Extract Reality Composer scene for ARQuickLook

↘锁芯ラ submitted on 2020-07-08 20:43:26
Question: I have a Reality Composer scene and I want to extract it as a usdz file, or any file that can be used with ARQuickLook. Is that possible?

Answer 1: From Apple's Creating 3D Content with Reality Composer document: "You can also save your composition to a .reality file for use as a lightweight AR Quick Look experience in your app or on the web. This allows users to place and preview content in the real world to get a quick sense of what it's like." To create a Reality file, choose File > Export > Export …
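Once the .reality file is exported, it can be shown with AR Quick Look through a standard QLPreviewController data source. A sketch, where the resource name "Experience" is an assumption:

```swift
import QuickLook

// Supplies one preview item: the exported .reality file from the bundle.
final class RealityPreviewDataSource: NSObject, QLPreviewControllerDataSource {
    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // NSURL conforms to QLPreviewItem, so the file URL can be
        // returned directly.
        let url = Bundle.main.url(forResource: "Experience",
                                  withExtension: "reality")!
        return url as QLPreviewItem
    }
}
```

Present a QLPreviewController with this data source and iOS shows the standard AR Quick Look placement UI; the same code works for .usdz files.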

HitTest prints AR Entity name even when I am not tapping on it

一笑奈何 submitted on 2020-07-06 20:21:45
Question: My Experience.rcproject has animations that can be triggered by a tap action. Two cylinders are named "Button 1" and "Button 2" and have Collide turned on. I use the async method to load the Experience.Map scene and the addAnchor method to add mapAnchor to an ARView in a ViewController. I ran a hit test on the scene to see whether the app reacts properly; however, the hit-test result prints a button's entity name even when I tap near the button rather than on it.

    class augmentedReality: …
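Near-misses usually hit because the collision shape is larger than the visible mesh (Reality Composer's generated shapes are simple boxes/capsules around the geometry). Two things help: regenerate tighter shapes from the actual meshes, and filter taps by entity name. A sketch of a tap handler using ARView.entity(at:); the class and the stored `arView` are illustrative scaffolding:

```swift
import RealityKit
import UIKit

final class TapCoordinator {
    let arView: ARView
    init(arView: ARView) { self.arView = arView }

    @objc func handleTap(_ sender: UITapGestureRecognizer) {
        let location = sender.location(in: arView)
        // entity(at:) returns the frontmost entity whose collision
        // shape contains the tapped point, or nil for a miss.
        guard let entity = arView.entity(at: location) else { return }
        switch entity.name {
        case "Button 1", "Button 2":
            print("Tapped \(entity.name)")
        default:
            break   // ignore hits on other entities near the buttons
        }
    }
}

// Tighter collision volumes, derived from the actual meshes:
// buttonEntity.generateCollisionShapes(recursive: true)
```

If taps between the two cylinders still report a button, inspect the collision shapes in the Reality Composer physics settings; shrinking them to match the visible geometry removes the false positives.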
问题 My Experience.rcproject has animations that can be triggered by tap action. Two cylinders are named “Button 1” and “Button 2” and have Collide turned on. I am using Async method to load Experience.Map scene and addAnchor method to add mapAnchor to ARView in a ViewController. I tried to run HitTest on the scene to see if the app reacts properly. Nonetheless, the HitTest result prints the entity name of a button even when I am not tapping on it but area near it. class augmentedReality: