realitykit

How do I make an entity a physics entity in RealityKit?

Submitted on 2020-05-26 09:24:11
Question: I am not able to figure out how to make the "ball" entity a physics entity / body and apply a force to it.

    // I'm using UIKit for the user interface and RealityKit +
    // the models made in Reality Composer for the Augmented Reality and code
    import RealityKit
    import ARKit

    class ViewController: UIViewController {

        var ball: (Entity & HasPhysics)? {
            try? Entity.load(named: "golfball") as? Entity & HasPhysics
        }

        @IBOutlet var arView: ARView!

        // referencing the play now button on the home screen
        …
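One workable approach (an editor's sketch, not a quoted answer): Entity.load(named:) returns a plain Entity, which does not conform to HasPhysics, so the conditional cast above always fails. Entity.loadModel(named:) returns a ModelEntity, which does conform. The mass value and force vector below are illustrative assumptions.

```swift
import RealityKit

// A minimal sketch: load the model as a ModelEntity (which conforms to
// HasPhysics), give it a collision shape and a dynamic physics body, then push it.
func makeBall() throws -> ModelEntity {
    let ball = try Entity.loadModel(named: "golfball")
    ball.generateCollisionShapes(recursive: true)       // physics needs a collision shape
    ball.physicsBody = PhysicsBodyComponent(
        massProperties: .init(mass: 0.045),             // kg; roughly a golf ball (assumed)
        material: .default,
        mode: .dynamic
    )
    return ball
}

// Once the ball is anchored in the scene:
// ball.addForce([0, 0, -2], relativeTo: nil)           // newtons, world space (assumed values)
```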

RealityKit – How to set a ModelEntity's transparency?

Submitted on 2020-05-26 08:21:27
Question: In SceneKit, there are lots of options, such as:

- Use the alpha channel of a UIColor via SCNMaterial.(diffuse|emission|ambient|...).contents
- Use SCNMaterial.transparency (a CGFloat from 0.0 to 1.0)
- Use SCNMaterial.transparent (another SCNMaterialProperty)
- Use SCNNode.opacity (a CGFloat from 0.0, fully transparent, to 1.0, fully opaque)

Is there a way to set transparency/opacity/alpha for a ModelEntity in RealityKit?

Answer 1: At the moment I see at least one solution in RealityKit allowing you …
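In that spirit, one approach that works (a sketch, assuming the iOS 15+ SimpleMaterial.color API; earlier releases expose the same idea through the older baseColor/tintColor properties) is to give the material a tint color whose alpha component is below 1.0:

```swift
import RealityKit
import UIKit

// A minimal sketch: a box rendered at 50% opacity via the tint's alpha channel.
// The box mesh and the 0.5 alpha are illustrative assumptions.
let mesh = MeshResource.generateBox(size: 0.1)
var material = SimpleMaterial()
material.color = .init(tint: UIColor.white.withAlphaComponent(0.5))
let box = ModelEntity(mesh: mesh, materials: [material])
```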

RealityKit - Animate opacity of a ModelEntity?

Submitted on 2020-02-28 08:48:47
Question: By setting the color of a material on the model property of a ModelEntity, I can alter the opacity/alpha of an object. But how do you animate this? My goal is to animate objects at full opacity, then have them fade to a set opacity, such as 50%. With SCNAction.fadeOpacity on an SCNNode in SceneKit, this was particularly easy:

    let fade = SCNAction.fadeOpacity(by: 0.5, duration: 0.5)
    node.runAction(fade)

An Entity conforms to HasTransform, but that will only allow you to animate scale, …
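RealityKit (in the question's time frame) offers no SCNAction-style fade, so one workaround is to drive the material's alpha yourself on every frame via a SceneEvents.Update subscription. A minimal sketch, assuming the iOS 15+ SimpleMaterial API, a single-material model, and linear easing:

```swift
import Combine
import RealityKit
import UIKit

// A minimal sketch: fade a ModelEntity from full opacity to `target`
// by rewriting its material's tint alpha on each rendered frame.
func fade(_ model: ModelEntity, to target: CGFloat,
          over duration: TimeInterval, in arView: ARView) {
    let start = Date()
    var subscription: Cancellable?
    subscription = arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
        let t = min(Date().timeIntervalSince(start) / duration, 1.0)
        let alpha = 1.0 - (1.0 - target) * CGFloat(t)    // 1.0 -> target
        var material = SimpleMaterial()
        material.color = .init(tint: UIColor.white.withAlphaComponent(alpha))
        model.model?.materials = [material]
        if t >= 1.0 { subscription?.cancel() }           // finished: stop updating
    }
}

// fade(box, to: 0.5, over: 0.5, in: arView)             // full -> 50% in 0.5 s
```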

How to use Raycast methods in RealityKit?

Submitted on 2020-02-23 05:23:53
Question: There are three methods for detecting intersections in the RealityKit framework, but I don't know how to use them in my project:

1. func raycast(origin: SIMD3<Float>, direction: SIMD3<Float>, length: Float, query: CollisionCastQueryType, mask: CollisionGroup, relativeTo: Entity?) -> [CollisionCastHit]
2. func raycast(from: SIMD3<Float>, to: SIMD3<Float>, query: CollisionCastQueryType, mask: CollisionGroup, relativeTo: Entity?) -> [CollisionCastHit]
3. func convexCast(convexShape: ShapeResource, …
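All three live on the Scene (reachable as arView.scene) and return an array of CollisionCastHit sorted by distance. A minimal sketch of the first overload, with assumed origin, direction, and length; note that only entities carrying a CollisionComponent (e.g. via generateCollisionShapes) show up in the results:

```swift
import RealityKit

// A minimal sketch: cast a ray straight down from one meter up and
// print every collision-enabled entity it passes through.
func castDown(in arView: ARView) {
    let hits = arView.scene.raycast(origin: [0, 1, 0],      // assumed start point
                                    direction: [0, -1, 0],  // straight down
                                    length: 5,
                                    query: .all,            // every hit, nearest first
                                    mask: .all,
                                    relativeTo: nil)        // world space
    for hit in hits {
        print(hit.entity.name, hit.distance, hit.position)
    }
}
```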

Can I track more than 4 images at a time with ARKit?

Submitted on 2020-02-14 13:04:03
Question: Out of the box, it's pretty clear ARKit doesn't allow tracking more than 4 images at once. (You can "track" more markers than that, but only 4 will function at a time; see this question for more details.) However, I'm wondering if there is a possible workaround, such as adding and removing anchors on a timer, or getting the position information and then displaying the corresponding models without ARKit. My knowledge of Swift is fairly limited, so I haven't had much …
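For context, the limit is a cap on simultaneous tracking, not on how many images ARKit can know about. A minimal sketch of the relevant configuration, with the asset-catalog group name assumed:

```swift
import ARKit

// A minimal sketch: register many reference images for detection while
// ARKit actively tracks at most four of them at any moment.
let configuration = ARImageTrackingConfiguration()
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                          bundle: nil) {
    configuration.trackingImages = referenceImages        // may contain many images
    configuration.maximumNumberOfTrackedImages = 4        // the framework's ceiling
}
// arView.session.run(configuration)
```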

ARKit & Reality composer - how to Anchor scene using image coordinates

Submitted on 2020-02-06 06:42:05
Question: I have written code to initialise one of three Reality Composer scenes when a button is pressed, depending on the day of the month. That all works fine. The Reality Composer scenes use image detection to place the objects within the environment, but currently, as soon as the image is out of the camera view, the objects disappear. I would like to anchor the scene with the root node being where the image is first detected, so that users can look around the scene and the objects are maintained even …
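One common workaround (a sketch, not a quoted answer): in the session delegate, when the ARImageAnchor first arrives, copy its transform into a world-space AnchorEntity and parent the scene there, so the content no longer depends on the image staying in view. Here ViewController, arView, and sceneEntity (the loaded Reality Composer scene) are stand-ins assumed from the question:

```swift
import ARKit
import RealityKit

// A minimal sketch: pin the scene to a world anchor frozen at the spot
// where the image was first detected.
extension ViewController: ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let imageAnchor = anchor as? ARImageAnchor else { continue }
            let worldAnchor = AnchorEntity(world: imageAnchor.transform) // world space
            worldAnchor.addChild(sceneEntity)                            // assumed scene
            arView.scene.addAnchor(worldAnchor)
        }
    }
}
// Remember: arView.session.delegate = self
```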
