ARKit

RealityKit: Where are 'behaviours' created in Reality Composer stored in the .rcproject object?

拈花ヽ惹草 submitted on 2020-02-05 04:05:49
Question: The situation: I am making an AR app in Xcode (11.3.1). I have added objects (e.g. a cube) into the scene using Reality Composer and added behaviours (i.e. tap and flip, look at camera) to those objects, also using Reality Composer. I saved that and switched to ViewController.swift. In ViewController, I load the Experience.rcproject and access the default Box scene by writing var box = try! Experience.loadBox(). All works as expected. I am then printing the various objects in the hierarchy to…
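A minimal sketch of the situation the question describes, assuming the default "Box" scene generated by Reality Composer. Note that behaviours are not stored as inspectable properties on the entities; they are compiled into the scene and surfaced only through the generated Experience class, so walking the hierarchy will not reveal them:

```swift
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Load the scene authored in Reality Composer.
        let box = try! Experience.loadBox()
        arView.scene.anchors.append(box)

        // Print the entity hierarchy; Reality Composer behaviours
        // will not appear among the children printed here.
        printHierarchy(of: box, indent: "")
    }

    func printHierarchy(of entity: Entity, indent: String) {
        print("\(indent)\(entity.name) – \(type(of: entity))")
        for child in entity.children {
            printHierarchy(of: child, indent: indent + "  ")
        }
    }
}
```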

SceneKit - Rotate object around X and Z axis

拈花ヽ惹草 submitted on 2020-02-04 15:55:37
Question: I'm using ARKit with SceneKit. When the user presses a button, I create an anchor, and to the corresponding SCNNode I add a 3D object (loaded from a .scn file in the project). The 3D object is placed facing the camera, with the same orientation the camera has. I would like to make it look like the object is lying on a plane surface, not inclined. So, if I got it right, I'd need to apply a rotation transformation so that its rotation around the X and Z axes becomes 0. My…
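One common way to do what the question asks is to keep only the yaw (rotation around Y) and zero out the other two Euler angles. A minimal sketch, not taken from the question itself:

```swift
import SceneKit

/// Keeps only the rotation around the Y axis, so the node lies flat on a
/// horizontal plane while still facing the same direction as the camera.
/// SceneKit's eulerAngles are (pitch X, yaw Y, roll Z), applied Z-Y-X.
func flattenOntoPlane(_ node: SCNNode) {
    let yaw = node.eulerAngles.y
    node.eulerAngles = SCNVector3(0, yaw, 0)
}
```

Because Euler angle decomposition is order-dependent, this works cleanly when the node's transform came straight from the camera orientation; for arbitrary rotations, extracting yaw from the transform matrix may be more robust.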

How to programmatically save an ARKit object in .dae format

邮差的信 submitted on 2020-01-31 18:02:43
Question: I am working on an app that creates a 3D mesh of a user's face. I have succeeded in generating the data for a user's face. I want to programmatically save this data in .dae format so that I can export the .dae file, edit it in 3D software like Blender, then import it back into my iPhone and display it in a scene view. Long story short: I want to programmatically save the data in .dae format. I have not been able to find anything on the internet about this. If there is another approach, please tell…
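Writing .dae (Collada) directly is only supported by SCNScene's export on macOS, not iOS, but Model I/O can export formats such as .obj that Blender imports. A hedged sketch of that alternative approach; `faceNode` is a hypothetical node assumed to hold the reconstructed face mesh:

```swift
import SceneKit
import SceneKit.ModelIO
import ModelIO

// Wrap the face node in a Model I/O asset and export it.
// MDLAsset.export(to:) infers the format from the file extension,
// so passing a URL ending in "face.obj" writes an OBJ file.
func exportFace(node faceNode: SCNNode, to url: URL) throws {
    let asset = MDLAsset()
    asset.add(MDLObject(scnNode: faceNode))
    try asset.export(to: url)
}
```

The resulting .obj can be edited in Blender and re-exported to whatever format the app needs for display.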

How to make a 3D model from AVDepthData?

扶醉桌前 submitted on 2020-01-31 04:01:06
Question: I'm interested in processing data from the TrueDepth camera. I need to obtain the data of a person's face, build a 3D model of the face, and save this model to an .obj file. Since the 3D model needs to include the person's eyes and teeth, ARKit / SceneKit is not suitable, because ARKit / SceneKit does not fill these areas with data. But with the help of the SceneKit.ModelIO library, I managed to export ARSCNView.scene (type SCNScene) in the .obj format. I tried to…
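A starting point for going from raw AVDepthData to geometry, not from the question itself: unproject each depth sample into a 3D point using the camera intrinsics carried with the depth data (a simple pinhole model; the intrinsics may need rescaling to the depth-map resolution, omitted here for brevity):

```swift
import AVFoundation
import simd

// Sketch: convert an AVDepthData map into a 3D point cloud.
func pointCloud(from depthData: AVDepthData) -> [SIMD3<Float>] {
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let map = converted.depthDataMap
    guard let calib = converted.cameraCalibrationData else { return [] }
    let k = calib.intrinsicMatrix                 // fx, fy, cx, cy
    let fx = k.columns.0.x, fy = k.columns.1.y
    let cx = k.columns.2.x, cy = k.columns.2.y

    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }
    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    let rowBytes = CVPixelBufferGetBytesPerRow(map)
    guard let base = CVPixelBufferGetBaseAddress(map) else { return [] }

    var points: [SIMD3<Float>] = []
    for y in 0..<height {
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            let z = row[x]
            guard z.isFinite, z > 0 else { continue }
            // Pinhole unprojection: pixel (x, y) at depth z -> camera space.
            points.append(SIMD3((Float(x) - cx) * z / fx,
                                (Float(y) - cy) * z / fy,
                                z))
        }
    }
    return points
}
```

The point cloud can then be meshed (e.g. Poisson reconstruction in an external tool) or written out vertex-by-vertex as a bare .obj.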

How can I reduce the opacity of the shadows in RealityKit?

风流意气都作罢 submitted on 2020-01-29 09:53:53
Question: I composed a scene in Reality Composer and added 3 objects to it. The problem is that the shadows are too intense (dark). I tried using the directional light in RealityKit from this answer rather than the default light from Reality Composer (since you have no option to adjust lighting there). Update: I implemented the spotlight lighting as explained by @AndyFedo in the answer. The shadow is still just as dark. Answer 1: In case you need soft and semi-transparent shadows in your scene, use SpotLight…
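A sketch along the lines the answer suggests: a custom spotlight entity. RealityKit at this point exposed no direct shadow-opacity control, so softening shadows came down to light choice and placement; all parameter values below are illustrative, not from the answer:

```swift
import RealityKit
import UIKit

// A custom spotlight entity. Spotlights tend to produce softer,
// less saturated shadows than the default Reality Composer lighting.
class Spotlight: Entity, HasSpotLight {
    required init() {
        super.init()
        self.light = SpotLightComponent(color: .white,
                                        intensity: 50_000,       // lumens
                                        innerAngleInDegrees: 45,
                                        outerAngleInDegrees: 60,
                                        attenuationRadius: 10)
    }
}
```

The entity is then positioned above the scene and added to an anchor like any other; raising the light and widening the cone spreads the shadow penumbra.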

Swift: Load a 3D asset from a URL in Xcode

↘锁芯ラ submitted on 2020-01-25 11:28:25
Question: I have a simple HTTP server running and I'm trying to fetch this SceneKit asset from my local server, but it gives me a nil / "Error Loading Scene" error. I don't understand how to load this model from my simple localhost, or how to configure my code so that I can fetch any SceneKit asset from a remote or local server. Thanks in advance. do { let shipScene = try SCNScene(url: URL(fileURLWithPath: "http://localhost:8080/chair.scn"), options: nil) // Set the scene to the view sceneView.scene = shipScene…
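The likely bug in the snippet above is URL(fileURLWithPath:), which treats the string as a local filesystem path, so the HTTP address never resolves. One workable pattern (a sketch, assuming a plain .scn file with no external references) is to download the file first and then open it from the resulting file URL:

```swift
import SceneKit

// Download a remote .scn to disk, then load it into the scene view.
func loadRemoteScene(from remote: URL,          // e.g. http://localhost:8080/chair.scn
                     into sceneView: SCNView) {
    URLSession.shared.downloadTask(with: remote) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else { return }
        // Move the download out of URLSession's temp location so it
        // survives long enough to be parsed.
        let dest = FileManager.default.temporaryDirectory
            .appendingPathComponent(remote.lastPathComponent)
        try? FileManager.default.removeItem(at: dest)
        try? FileManager.default.moveItem(at: tempURL, to: dest)
        let scene = try? SCNScene(url: dest, options: nil)
        DispatchQueue.main.async { sceneView.scene = scene }
    }.resume()
}
```

Note that for a localhost HTTP (non-HTTPS) server, App Transport Security exceptions may also be required in Info.plist.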

How to translate X-axis correctly from VNFaceObservation boundingBox (Vision + ARKit)

此生再无相见时 submitted on 2020-01-25 08:36:05
Question: I'm using both ARKit and Vision, following along with Apple's sample project, "Using Vision in Real Time with ARKit". So I am not setting up my camera, as ARKit handles that for me. Using Vision's VNDetectFaceRectanglesRequest, I'm able to get back a collection of VNFaceObservation objects. Following various guides online, I'm able to transform the VNFaceObservation's boundingBox to one that I can use on my ViewController's UIView. The Y axis is correct when placed on my UIView in ARKit, but the X…
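Vision's boundingBox is normalized with the origin at the bottom-left, while UIKit's origin is top-left; with the front (TrueDepth) camera the image is also mirrored, which is the usual cause of a correct Y but inverted X. A common conversion, sketched here under the assumption that the view shows the camera image unrotated:

```swift
import Vision
import UIKit

// Convert a Vision face observation's normalized bounding box
// into a UIKit rect in view coordinates.
func faceRect(for observation: VNFaceObservation,
              in viewSize: CGSize,
              mirrored: Bool) -> CGRect {
    var box = observation.boundingBox
    // Flip Y: normalized bottom-left origin -> UIKit top-left origin.
    box.origin.y = 1 - box.origin.y - box.height
    if mirrored {
        // Flip X for the mirrored front-camera feed.
        box.origin.x = 1 - box.origin.x - box.width
    }
    // Scale from normalized [0, 1] coordinates to view points.
    return CGRect(x: box.origin.x * viewSize.width,
                  y: box.origin.y * viewSize.height,
                  width: box.width * viewSize.width,
                  height: box.height * viewSize.height)
}
```

When the view's aspect ratio differs from the camera image's, the scaling step also needs to account for how the image is fitted or filled into the view.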

Handling 3D Interaction and UI Controls in Augmented Reality

瘦欲@ submitted on 2020-01-25 07:30:08
Question: I'm playing with this code example, and I'm having trouble finding where I can replace the existing 3D objects with my own programmatically created ones. I want to keep all the existing functionality in the code example, but with my own objects! Creation of the 3D object: self.geometry = SCNBox(width: width!/110, height: height!/110, length: 57 / 700, chamferRadius: 0.008) self.geometry.firstMaterial?.diffuse.contents = UIColor.red self.geometry.firstMaterial?.specular.contents = UIColor…
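For context, the truncated snippet above amounts to building an SCNBox node. A self-contained version of that pattern (the specular color is illustrative, since the original is cut off; the dimensions are the question's own):

```swift
import SceneKit
import UIKit

// Build a red box node that could stand in for the sample's virtual objects.
func makeBoxNode(width: CGFloat, height: CGFloat) -> SCNNode {
    let geometry = SCNBox(width: width / 110,
                          height: height / 110,
                          length: 57 / 700,
                          chamferRadius: 0.008)
    geometry.firstMaterial?.diffuse.contents = UIColor.red
    geometry.firstMaterial?.specular.contents = UIColor.white  // assumed value
    return SCNNode(geometry: geometry)
}
```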

What is the real benefit of using Raycast in ARKit and RealityKit?

末鹿安然 submitted on 2020-01-23 14:56:27
Question: What is ray-casting in ARKit and RealityKit for? And when do I need to use the makeRaycastQuery instance method: func makeRaycastQuery(from point: CGPoint, allowing target: ARRaycastQuery.Target, alignment: ARRaycastQuery.TargetAlignment) -> ARRaycastQuery? Any help appreciated. Answer 1: Ray-casting, the same way as hit-testing, helps find a 3D position on a real-world surface by projecting an imaginary ray from a screen point. I've found the following definition of ray-casting in Apple…
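A typical use of the method the question quotes, sketched for an ARView: turn a screen tap into a 3D position on a detected horizontal plane and anchor content there. The target and alignment values are one common choice, not the only one:

```swift
import ARKit
import RealityKit

// Place an anchor where a screen tap intersects a horizontal surface.
func placeAnchor(at point: CGPoint, in arView: ARView) {
    guard let query = arView.makeRaycastQuery(from: point,
                                              allowing: .estimatedPlane,
                                              alignment: .horizontal),
          let result = arView.session.raycast(query).first
    else { return }
    let anchor = AnchorEntity(world: result.worldTransform)
    arView.scene.anchors.append(anchor)
}
```

Unlike the older hit-testing API, a raycast query can also be turned into a tracked raycast that keeps updating the result as ARKit refines its understanding of the surface.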