arkit

How to detect penetration without physical interaction in ARKit?

Posted by 不想你离开。 on 2021-02-20 00:59:06
Question: I have a fixed laser beam SCNNode and a detecting sphere SCNNode attached in front of the camera. How can I detect penetration without physical interaction? I have not found any clue. EDIT: as suggested below by maxxFrazer, I implemented physical interaction, and I'm able to register a collision IF my laser beam is .static and the detector moved by the camera is set to .kinematic. Answer 1: If you need to detect a penetration without physical interaction, use the trackedRaycast instance method, available in iOS 13+: func
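The answer cuts off at the code. A minimal sketch of how a trackedRaycast-based approach might look (the function name and the use of the screen center are assumptions, not from the answer):

```swift
import ARKit

// Hypothetical sketch: cast a tracked ray from the screen center and keep a
// node glued to whatever surface it hits. ARKit re-invokes the handler as it
// refines the result, so no physics bodies are needed. Requires iOS 13+.
func startTrackedRaycast(in sceneView: ARSCNView, following node: SCNNode) -> ARTrackedRaycast? {
    guard let query = sceneView.raycastQuery(from: sceneView.center,
                                             allowing: .estimatedPlane,
                                             alignment: .any) else { return nil }
    return sceneView.session.trackedRaycast(query) { results in
        guard let result = results.first else { return }
        // Move the node every time ARKit updates the tracked result.
        node.simdWorldTransform = result.worldTransform
    }
}
```

Keep a reference to the returned ARTrackedRaycast; calling its `stopTracking()` ends the updates.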

How to draw a line node that keeps the same size on camera, like the Measure app on iPhone?

Posted by 隐身守侯 on 2021-02-19 09:00:23
Question: I'm trying to make an AR app like the default Measure app on iPhone (based on the TBXark/Ruler project on GitHub). I draw a startNode, an endNode, a cylinder line, and an SCNText, but I can't manage the scaling: everything is only readable up close and becomes too small when measuring against a far plane. I have two questions: how to keep the node, cylinder, and text the same size whether drawn near or far, as the Measure app does; and how to draw the SCNText with a background, aligned along the same direction as the cylinder line, as the Measure app does. Here is my Line Node class: class
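One common trick for the first question (this is an assumption about what Measure does, not a confirmed technique): scale the node linearly with its distance from the camera, so its projected screen size stays roughly constant. The helper below is illustrative; `referenceDistance` is the distance at which the node keeps its natural scale:

```swift
import Foundation

// Scale factor that keeps a node's apparent screen size constant:
// a node twice as far away needs to be twice as large.
// `referenceDistance` is the distance at which the node has scale 1.
func constantScreenScale(distanceToCamera: Float, referenceDistance: Float = 1.0) -> Float {
    max(distanceToCamera / referenceDistance, .leastNonzeroMagnitude)
}

// Applied per frame in a SceneKit renderer callback, e.g.:
//   let d = simd_distance(node.simdWorldPosition, cameraNode.simdWorldPosition)
//   node.simdScale = simd_float3(repeating: constantScreenScale(distanceToCamera: d))
```

For the text orientation, an SCNBillboardConstraint on the text node (or building the label as a child of the line node) is a common starting point.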

How do I save and load ARWorldMap in SwiftUI app?

Posted by 岁酱吖の on 2021-02-19 05:38:26
Question: I followed a YouTube tutorial on how to place virtual models in SwiftUI. Now that I can place my models, I would like to save and load their positions. I have already made two buttons for save and load, but I don't know the correct code to save and load the entities and anchors. The following code is inside my updateUIView function: func updateUIView(_ uiView: ARView, context: Context) { if let name = self.modelName { print("Modell mit dem Namen \(name) wurde zum plaziert") let filename =
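A possible sketch of the save/load part (the file name and the assumption that models are re-attached by anchor name are mine, not from the question): persist the session's ARWorldMap, which carries the ARAnchors, then restore it as the next session's `initialWorldMap`.

```swift
import ARKit
import RealityKit

// Assumed storage location for the archived map.
let mapURL = FileManager.default.urls(for: .documentDirectory,
                                      in: .userDomainMask)[0]
    .appendingPathComponent("worldmap")

// Save: ask the session for its current world map and archive it to disk.
func saveWorldMap(from session: ARSession) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: mapURL, options: .atomic)
    }
}

// Load: unarchive the map and rerun the session with it as the starting map.
// Model entities must be re-created and re-attached (e.g. keyed by anchor name);
// ARWorldMap only restores the anchors, not the RealityKit entities.
func loadWorldMap(into arView: ARView) {
    guard let data = try? Data(contentsOf: mapURL),
          let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```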

iOS ARKit: Large object always appears to move when the device camera's position changes

Posted by 雨燕双飞 on 2021-02-18 16:44:23
Question: I am creating an iOS ARKit app where I want to place a large object in augmented reality. When I try to place the object at a particular position, it always appears to move as the camera's position changes, and I am not able to view the object from all angles by moving the camera. But if I reduce its scale value to 0.001 (shrinking the object), I am able to view it from all angles, and the position of the placed object also does not change to that
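One hedged guess at a fix (names below are illustrative): attach the model to an ARAnchor instead of only setting `node.position`, so ARKit keeps correcting the node's world pose as tracking refines. Very large models can also end up with the camera inside their geometry, which looks like the model "following" the camera.

```swift
import ARKit

// Sketch: anchor-based placement so ARKit owns the node's world transform.
final class ModelPlacer: NSObject, ARSCNViewDelegate {
    let modelNode = SCNNode() // load the real model geometry here

    func place(at transform: simd_float4x4, in sceneView: ARSCNView) {
        sceneView.session.add(anchor: ARAnchor(name: "model", transform: transform))
    }

    // ARKit calls this with a node it manages at the anchor's (corrected) pose;
    // parenting the model to it keeps the model pinned as tracking improves.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor.name == "model" else { return }
        node.addChildNode(modelNode)
    }
}
```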

How can I get the yaw, pitch, roll of an ARAnchor in absolute terms?

Posted by 心不动则不痛 on 2021-02-18 03:24:12
Question: I've been trying to figure this out for a few days now. Given an ARKit-based app where I track a user's face, how can I get the face's rotation in absolute terms from its anchor? I can get the transform of the ARAnchor, which is a simd_matrix4x4. There's a lot of info on how to get the position out of that matrix (it's the 3rd column), but nothing on the rotation! I want to be able to control a 3D object outside of the app by passing yaw, pitch, and roll. The latest thing I tried actually
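A sketch of the rotation extraction, assuming the anchor's upper-left 3×3 is a pure rotation (no scale) and using the YXZ angle convention; the flat-array representation below mirrors simd_float4x4's column-major layout so the math is easy to check outside ARKit:

```swift
import Foundation

// Extract YXZ Euler angles (yaw, pitch, roll) from a column-major 4x4
// transform stored as a flat [Double] of 16 values (simd column layout).
// R[row][col] = m[col * 4 + row].
func eulerYXZ(fromColumnMajor m: [Double]) -> (yaw: Double, pitch: Double, roll: Double) {
    let pitch = asin(-m[9])          // -R[1][2]
    let yaw   = atan2(m[8], m[10])   //  R[0][2] / R[2][2]
    let roll  = atan2(m[1], m[5])    //  R[1][0] / R[1][1]
    return (yaw, pitch, roll)
}

// The translation lives in the 4th column.
func position(fromColumnMajor m: [Double]) -> (x: Double, y: Double, z: Double) {
    (m[12], m[13], m[14])
}
```

With a real `simd_float4x4` the same indices correspond to `columns.2.y`, `columns.2.x`/`columns.2.z`, and `columns.0.y`/`columns.1.y` respectively.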

How to get the lens position on ARKit 1.5

Posted by 两盒软妹~` on 2021-02-11 14:41:23
Question: Before ARKit 1.5, we had no way to adjust the focus of the camera, and getting the lens position would always return the same value. With ARKit 1.5, however, we can now use autofocus by setting ARWorldTrackingConfiguration.isAutoFocusEnabled. My question is: is there any way to get the current lens position from ARKit so that I can apply an out-of-focus effect to my virtual objects? I had a look at some classes where this information might be stored, like ARFrame or ARSession, but they
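ARKit 1.5 itself does not expose this. One heavily hedged workaround is to KVO-observe `lensPosition` (a 0.0–1.0 value) on the capture device ARKit is most likely driving, the back wide-angle camera; there is no public guarantee that this is the device ARKit uses, so treat this as an experiment:

```swift
import AVFoundation

// Assumption: ARKit drives the default back wide-angle camera, whose
// lensPosition property is documented as key-value observable.
final class LensPositionObserver {
    private var observation: NSKeyValueObservation?

    func start(onChange: @escaping (Float) -> Void) {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        observation = device.observe(\.lensPosition, options: [.new]) { _, change in
            if let value = change.newValue { onChange(value) }
        }
    }
}
```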

ARKit: calculate Euler angles from a rotation matrix in YXZ order

Posted by 三世轮回 on 2021-02-11 13:00:25
Question: I need help with the math of Euler angles in general, and specifically in ARKit. I've looked at many references for calculating Euler angles, but I wonder why Apple, in its tutorial, calculates yaw from atan2(r11, r12), as follows: let yaw = atan2f(camera.transform.columns.0.x, camera.transform.columns.1.x) If I need to calculate pitch or roll, for example, how? I need to understand yaw and why it doesn't always hold that yaw = camera.eulerAngles.y. Please check this code and the questions in the comments; I'm
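A sketch of where the YXZ formulas come from: multiply out R = Ry(yaw)·Rx(pitch)·Rz(roll) and read the angles back from the entries that come out in closed form. The round-trip below demonstrates the extraction (plain arrays, so the algebra is checkable outside ARKit):

```swift
import Foundation

// For R = Ry(yaw)·Rx(pitch)·Rz(roll), the product has:
//   R[1][2] = -sin(pitch)
//   R[0][2] =  sin(yaw)·cos(pitch),  R[2][2] = cos(yaw)·cos(pitch)
//   R[1][0] =  cos(pitch)·sin(roll), R[1][1] = cos(pitch)·cos(roll)
typealias Mat3 = [[Double]]

func rotationYXZ(yaw: Double, pitch: Double, roll: Double) -> Mat3 {
    let (cy, sy) = (cos(yaw), sin(yaw))
    let (cx, sx) = (cos(pitch), sin(pitch))
    let (cz, sz) = (cos(roll), sin(roll))
    return [
        [cy*cz + sy*sx*sz, -cy*sz + sy*sx*cz, sy*cx],
        [cx*sz,             cx*cz,            -sx  ],
        [-sy*cz + cy*sx*sz, sy*sz + cy*sx*cz, cy*cx],
    ]
}

// Invert the relations above; valid away from pitch = ±90° (gimbal lock).
func anglesYXZ(from r: Mat3) -> (yaw: Double, pitch: Double, roll: Double) {
    (atan2(r[0][2], r[2][2]), asin(-r[1][2]), atan2(r[1][0], r[1][1]))
}
```

Apple's one-liner only recovers a single angle from two matrix entries; the pattern above shows how the remaining two fall out of the other entries.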

SpriteKit SKLabelNode attached to ARAnchor doesn't appear or appears fullscreen

Posted by 会有一股神秘感。 on 2021-02-11 12:47:20
Question: I'm adding an anchor to my sceneView at the world-origin position: let configuration = ARWorldTrackingConfiguration() sceneView.session.run(configuration) sceneView.session.add(anchor: ARAnchor(name: "world origin", transform: matrix_identity_float4x4)) sceneView.presentScene(SKScene()) I then choose to show an SKLabelNode at my anchor's position: func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? { // Create and configure a node for the anchor added to the view's session. let
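A hedged completion of that delegate method: with ARSKView, the view itself projects the anchor into the 2D scene, so the label should be returned without a manual position.

```swift
import ARKit
import SpriteKit

// Sketch: return a configured label; ARSKView places it at the anchor's
// projected screen position automatically.
func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    let label = SKLabelNode(text: anchor.name ?? "anchor")
    label.fontSize = 40
    label.horizontalAlignmentMode = .center
    label.verticalAlignmentMode = .center
    return label
}
```

Note that `presentScene(SKScene())` in the question creates a zero-sized scene, which is one plausible cause of the invisible/fullscreen symptoms; presenting a scene sized to the view, e.g. `SKScene(size: sceneView.bounds.size)`, is a common fix.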

Implement crosshair-like behaviour in RealityKit

Posted by 我们两清 on 2021-02-10 17:45:50
Question: What I want to achieve: attach a sphere to the camera position (so that it always stays at the center of the screen as the device moves) and detect when it is on top of other AR objects, to trigger other actions/behaviours on those AR objects. Approach: I have created the sphere and attached it to the center of the screen as shown below: @IBOutlet var arView: ARView! override func viewDidLoad() { super.viewDidLoad() let mesh = MeshResource.generateSphere(radius: 0.1) let sphere = ModelEntity(mesh:
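A hedged sketch of the full approach (the 1 m offset and entity names are assumptions): parent the sphere to a camera anchor so it stays centered, then subscribe to collision events to detect overlap with other entities; both sides need collision shapes.

```swift
import ARKit
import RealityKit
import Combine

// Sketch: camera-anchored "crosshair" sphere plus a collision subscription.
func addCrosshairSphere(to arView: ARView) -> Cancellable {
    let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1),
                             materials: [SimpleMaterial(color: .white, isMetallic: false)])
    sphere.position = [0, 0, -1]             // 1 m in front of the camera (assumed)
    sphere.generateCollisionShapes(recursive: true)

    let cameraAnchor = AnchorEntity(.camera) // follows the device camera
    cameraAnchor.addChild(sphere)
    arView.scene.addAnchor(cameraAnchor)

    // Fires when the sphere starts touching another collision-enabled entity.
    // The returned Cancellable must be retained, or the subscription is dropped.
    return arView.scene.subscribe(to: CollisionEvents.Began.self, on: sphere) { event in
        print("Crosshair hit \(event.entityB.name)")
    }
}
```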