realitykit

RealityKit: Where are 'behaviours' created in Reality Composer stored in the .rcproject object?

拈花ヽ惹草 submitted on 2020-02-05 04:05:49
Question: The situation: I am making an AR app in Xcode (11.3.1). I have added objects (e.g. a cube) into the scene using Reality Composer and added behaviours (i.e. tap-and-flip and look-at-camera) to those objects, also using Reality Composer. I saved that and switched to ViewController.swift. In ViewController, I load the Experience.rcproject and access the default Box scene by writing var box = try! Experience.loadBox(). All works as expected. I am then printing the various objects in the hierarchy to…
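
For reference, a minimal sketch of walking the loaded scene's entity hierarchy; the printHierarchy helper is illustrative, and Experience.loadBox() is the accessor Xcode generates from the .rcproject:

```swift
import RealityKit

// Recursively print every entity in the loaded scene, with its name and type.
func printHierarchy(of entity: Entity, indent: String = "") {
    print("\(indent)\(entity.name) [\(type(of: entity))]")
    for child in entity.children {
        printHierarchy(of: child, indent: indent + "    ")
    }
}

let box = try! Experience.loadBox()   // generated from Experience.rcproject
printHierarchy(of: box)
```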

How can I reduce the opacity of the shadows in RealityKit?

风流意气都作罢 submitted on 2020-01-29 09:53:53
Question: I composed a scene in Reality Composer and added 3 objects to it. The problem is that the shadows are too intense (dark). I tried using the Directional Light in RealityKit from this answer rather than a default light from Reality Composer (since you don't have an option to adjust lighting in it). Update: I implemented the SpotLight lighting as explained by @AndyFedo in the answer. The shadow is still very dark. Answer 1: In case you need soft and semi-transparent shadows in your scene, use SpotLight…
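
A minimal sketch of the SpotLight approach the answer points to, assuming RealityKit's HasSpotLight protocol and SpotLightComponent; the class name and the intensity/angle values are illustrative:

```swift
import RealityKit
import UIKit

// A custom entity that carries a spot light with shadow casting enabled.
class SpotLight: Entity, HasSpotLight {
    required init() {
        super.init()
        self.light = SpotLightComponent(color: .white,
                                        intensity: 50_000,          // lumens (illustrative)
                                        innerAngleInDegrees: 45,
                                        outerAngleInDegrees: 60,
                                        attenuationRadius: 10)      // metres
        self.shadow = SpotLightComponent.Shadow()                   // enable shadow casting
    }
}

// Usage (a spot light shines along its local -Z axis):
// let lightAnchor = AnchorEntity(world: [0, 2, 2])
// lightAnchor.addChild(SpotLight())
// arView.scene.addAnchor(lightAnchor)
```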

What is the real benefit of using Raycast in ARKit and RealityKit?

末鹿安然 submitted on 2020-01-23 14:56:27
Question: What is ray-casting in ARKit and RealityKit for? And when do I need to use the makeRaycastQuery instance method: func makeRaycastQuery(from point: CGPoint, allowing target: ARRaycastQuery.Target, alignment: ARRaycastQuery.TargetAlignment) -> ARRaycastQuery? Any help appreciated. Answer 1: Ray-casting, the same way as hit-testing, helps find a 3D position on a real-world surface by projecting an imaginary ray from a screen point. I've found the following definition of ray-casting in Apple…
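
For context, a minimal sketch of the typical raycasting flow; the function and entity names are illustrative, while makeRaycastQuery(from:allowing:alignment:), ARSession.raycast(_:) and AnchorEntity(world:) are the ARKit/RealityKit calls being discussed:

```swift
import ARKit
import RealityKit

// Cast a ray from a screen point and anchor a small sphere where it hits a horizontal plane.
func placeSphere(at screenPoint: CGPoint, in arView: ARView) {
    guard let query = arView.makeRaycastQuery(from: screenPoint,
                                              allowing: .estimatedPlane,
                                              alignment: .horizontal),
          let result = arView.session.raycast(query).first
    else { return }

    let anchor = AnchorEntity(world: result.worldTransform)
    anchor.addChild(ModelEntity(mesh: .generateSphere(radius: 0.05)))
    arView.scene.addAnchor(anchor)
}
```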

ARKit 3.0 – Replicating robot character in Motion Capture RealityKit

£可爱£侵袭症+ submitted on 2020-01-23 10:53:06
Question: I'm trying to make a 3D model, like the robot provided by Apple in the Motion Capture example (shown at WWDC 2019), that can mimic me via motion capture in ARKit 3.0, by replacing the robot character given by Apple. Desired solution: Is there any special software which Apple used to create the robot.usdz file? If yes, then please provide details of it. How can we convert formats like .glb / .gltf / .obj / .dae to .usdz using Apple's Python-based tool without affecting the scene graph? How can we edit the…
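
As a related sketch (covering the motion-capture side rather than the file conversion), this is roughly how a rigged .usdz character is driven by ARKit 3 body tracking; the file name "robot" is a placeholder for whatever model you author or convert:

```swift
import ARKit
import RealityKit
import Combine

var loadCancellable: AnyCancellable?

func startMotionCapture(in arView: ARView) {
    // Body tracking requires ARKit 3 and an A12-or-newer device.
    arView.session.run(ARBodyTrackingConfiguration())

    // An anchor that follows the tracked person.
    let bodyAnchor = AnchorEntity(.body)
    arView.scene.addAnchor(bodyAnchor)

    // Load a rigged .usdz (placeholder name) as a BodyTrackedEntity.
    loadCancellable = Entity.loadBodyTrackedAsync(named: "robot")
        .sink(receiveCompletion: { completion in
            if case .failure(let error) = completion {
                print("Couldn't load the character: \(error)")
            }
        }, receiveValue: { (character: BodyTrackedEntity) in
            bodyAnchor.addChild(character)   // the character now mimics the person
        })
}
```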

RealityKit – How to edit or add lighting?

拟墨画扇 submitted on 2020-01-23 03:51:23
Question: I'm trying to add lighting to my RealityKit AR scene, and I can't find a lighting option in Reality Composer. If there's a way to add a Directional Light or edit it, then please tell me. I've tried the Apple documentation but can't understand how to add them. Answer 1: At the moment you can't do it in Reality Composer, so just use RealityKit. You need to create a custom class that inherits from the Entity class and conforms to the HasPointLight protocol. Run this code in a macOS project to find out how a…
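
A minimal sketch of the approach the answer describes (an Entity subclass conforming to HasPointLight); the class name and light values are illustrative:

```swift
import RealityKit
import UIKit

// Custom entity that carries a point light component.
class Lighting: Entity, HasPointLight {
    required init() {
        super.init()
        self.light = PointLightComponent(color: .white,
                                         intensity: 100_000,     // lumens (illustrative)
                                         attenuationRadius: 20)  // metres (illustrative)
    }
}

// Usage: anchor the light above your content.
// let lightAnchor = AnchorEntity(world: [0, 1, 0])
// lightAnchor.addChild(Lighting())
// arView.scene.addAnchor(lightAnchor)
```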

How do I rotate an object around only one axis in RealityKit?

痴心易碎 submitted on 2020-01-16 06:22:58
Question: I'm trying to rotate a cube around its z-axis but I can't find out how. Is there a way in RealityKit to do this? Answer 1: In RealityKit there are at least two ways to rotate an object around a single axis. First approach: let boxAnchor = try! Experience.loadBox(); boxAnchor.steelBox?.orientation = simd_quatf(angle: .pi/4, /* 45 Deg in Rad */ axis: [0, 0, 1]) /* Around Z axis */. Second approach: boxAnchor.steelBox?.transform = Transform(pitch: 0, yaw: 0, roll: .pi/4) /* Around Z axis */. pitch is the X axis…
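
Put together as a self-contained snippet (boxAnchor and steelBox are the names generated from the default Reality Composer Box scene):

```swift
import RealityKit

let boxAnchor = try! Experience.loadBox()

// First approach: quaternion rotation of 45° around the Z axis.
boxAnchor.steelBox?.orientation = simd_quatf(angle: .pi / 4,   // 45° in radians
                                             axis: [0, 0, 1])  // Z axis

// Second approach: Euler angles via Transform (pitch = X, yaw = Y, roll = Z).
boxAnchor.steelBox?.transform = Transform(pitch: 0, yaw: 0, roll: .pi / 4)
```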

How are the ARKit People Occlusion samples being done?

好久不见. submitted on 2020-01-12 10:46:19
Question: This may be an obscure question, but I see lots of very cool samples online of how people are using the new people-occlusion technology in ARKit 3 to effectively "separate" the people from the background and apply some sort of filtering to the "people" (see here). Looking at Apple's provided source code and documentation, I see that I can retrieve the segmentationBuffer from an ARFrame, which I've done like so: func session(_ session: ARSession, didUpdate frame: ARFrame) { let…
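
For reference, a minimal sketch of reading that buffer in the session delegate; the delegate class name is illustrative, while segmentationBuffer and frameSemantics are the ARKit 3 APIs in question:

```swift
import ARKit
import CoreVideo

final class OcclusionSessionDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Only populated when frameSemantics includes person segmentation.
        guard let segmentation = frame.segmentationBuffer else { return }
        // segmentation is a CVPixelBuffer mask of the detected people;
        // hand it to Core Image / Metal to filter just the "people" pixels.
        print(CVPixelBufferGetWidth(segmentation), CVPixelBufferGetHeight(segmentation))
    }
}

// Enabling the semantics (e.g. in viewDidLoad):
// let config = ARWorldTrackingConfiguration()
// if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
//     config.frameSemantics.insert(.personSegmentationWithDepth)
// }
// arView.session.run(config)
```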