arkit

How to keep ARKit SCNNode in place

北慕城南 submitted on 2019-11-30 10:30:46
问题 (Question): Hey, I'm trying to figure out how to keep a simple node in place as I walk around it in ARKit. Code:

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        if let planeAnchor = anchor as? ARPlaneAnchor {
            if planeDetected == false { // Bool only allows 1 plane to be added
                planeDetected = true
                self.addPlane(node: node, anchor: planeAnchor)
            }
        }
    }

This adds the SCNNode:

    func addPlane(node: SCNNode, anchor: ARPlaneAnchor) {
        // We add the anchor plane here
        let
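The key to keeping content fixed in the world is to parent it to the node ARKit supplies for the anchor, rather than to the scene root: ARKit keeps refining that node's transform as tracking improves. A minimal sketch of how the truncated `addPlane` might continue, assuming the `planeDetected` flag and delegate setup from the question (the visual styling here is illustrative):

```swift
import ARKit

func addPlane(node: SCNNode, anchor: ARPlaneAnchor) {
    // Size the visual plane to the detected extent.
    let plane = SCNPlane(width: CGFloat(anchor.extent.x),
                         height: CGFloat(anchor.extent.z))
    plane.firstMaterial?.diffuse.contents = UIColor.white.withAlphaComponent(0.5)

    let planeNode = SCNNode(geometry: plane)
    // SCNPlane is vertical by default; rotate it to lie flat on the surface.
    planeNode.eulerAngles.x = -.pi / 2
    planeNode.position = SCNVector3(anchor.center.x, 0, anchor.center.z)

    // Attach to the anchor's node, NOT scene.rootNode — this is what keeps
    // the content in place as you walk around it.
    node.addChildNode(planeNode)
}
```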

What does the different columns in transform in ARKit represent?

落爺英雄遲暮 submitted on 2019-11-30 07:52:51
问题 (Question): An ARAnchor has 6 columns, of which the last 3 represent the x, y, and z coordinates. I was wondering what the other (first) 3 columns represent?

回答1 (Answer 1): If you're new to 3D, these transformation matrices will seem like magic. Basically, every "point" in ARKit space is represented by a 4x4 transform matrix. This matrix describes the distance from the ARKit origin (the point at which ARKit woke up to the world), commonly known as the translation, and the orientation of the device, aka pitch,
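To make the column layout concrete: a 4x4 transform actually has four columns, not six. Columns 0–2 are the anchor's local x/y/z axes expressed in world space (i.e. its rotation), and column 3 is the translation. A sketch of pulling both parts out of an anchor's transform (the function name is hypothetical):

```swift
import simd

func decompose(_ transform: simd_float4x4) -> (position: SIMD3<Float>, rotation: simd_float3x3) {
    // Column 3: where the anchor sits relative to the ARKit world origin.
    let position = SIMD3<Float>(transform.columns.3.x,
                                transform.columns.3.y,
                                transform.columns.3.z)
    // Columns 0-2: the orientation (pitch/yaw/roll folded into one 3x3 rotation).
    let rotation = simd_float3x3(
        SIMD3<Float>(transform.columns.0.x, transform.columns.0.y, transform.columns.0.z),
        SIMD3<Float>(transform.columns.1.x, transform.columns.1.y, transform.columns.1.z),
        SIMD3<Float>(transform.columns.2.x, transform.columns.2.y, transform.columns.2.z))
    return (position, rotation)
}
```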

Does ARKit 2.0 consider Lens Distortion in iPhone and iPad?

谁说我不能喝 submitted on 2019-11-30 07:25:50
ARKit 2.0 updates many intrinsic (and extrinsic) parameters of the ARCamera from frame to frame. I'd like to know if it also takes radial lens distortion into consideration (as the AVCameraCalibrationData class, which ARKit doesn't use, does), and corrects the video frames' distortion appropriately (distort/undistort operations) for the rear iPhone and iPad cameras?

    var intrinsics: simd_float3x3 { get }

As we all know, radial lens distortion greatly affects 6-DOF pose estimation accuracy when we place undistorted 3D objects in a real-world scene that is distorted by a lens.

    var lensDistortionLookupTable:
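What ARKit does expose per frame is the pinhole intrinsics matrix; as of ARKit 2 there is no public distortion-coefficient API on ARCamera (the lookup tables quoted above belong to AVCameraCalibrationData). A sketch of reading the intrinsics each frame from the session delegate:

```swift
import ARKit

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let K = frame.camera.intrinsics   // simd_float3x3, column-major
    let fx = K.columns.0.x            // focal length in pixels, x
    let fy = K.columns.1.y            // focal length in pixels, y
    let cx = K.columns.2.x            // principal point, x
    let cy = K.columns.2.y            // principal point, y
    print("fx=\(fx) fy=\(fy) principal point=(\(cx), \(cy))")
}
```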

Understand coordinate spaces in ARKit

北慕城南 submitted on 2019-11-30 07:24:07
I've read all of Apple's guides about ARKit and watched a WWDC video, but I can't understand how the coordinate systems bound to the real world, the device, and the 3D scene connect to each other. I can add an object, for example an SCNPlane:

    let stripe = SCNPlane(width: 0.005, height: 0.1)
    let stripeNode = SCNNode(geometry: stripe)
    scene.rootNode.addChildNode(stripeNode)

This will produce a white stripe, which will be oriented vertically no matter how the device is oriented at that moment. That means the coordinate system is somehow bound to gravity! But if I try to print upAxis
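The gravity binding observed above comes from the session configuration, not from SceneKit itself: the default `worldAlignment` is `.gravity`, so the world y-axis points opposite gravity regardless of how the device is held at launch. A sketch of the three options (assuming an `ARSCNView` named `sceneView`):

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
configuration.worldAlignment = .gravity
// .gravity           — y up (opposite gravity), origin at session start (default)
// .gravityAndHeading — additionally aligns x/z to compass heading
// .camera            — world space locked to the device's initial pose
sceneView.session.run(configuration)
```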

ARKit: How can I add a UIView to ARKit Scene?

倾然丶 夕夏残阳落幕 submitted on 2019-11-30 07:12:57
I am working on an AR project using ARKit. I want to add a UIView to the ARKit scene. When I tap on an object, I want to get information as a "pop-up" next to the object. This information is in a UIView. Is it possible to add this UIView to the ARKit scene? I set up this UIView as a scene; what can I do then? Can I give it a node and then add it to the ARKit scene? If so, how does it work? Or is there another way? Thank you!

EDIT: Code of my SecondViewController:

    class InformationViewController: UIViewController {
        @IBOutlet weak var secondView: UIView!
        override func viewDidLoad() {
            super.viewDidLoad()
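One common approach (a sketch, not the only option): render the UIView's layer into the diffuse contents of an `SCNPlane` and attach that plane near the tapped node — SceneKit accepts a `CALayer` (or a rendered `UIImage`) as material contents. The function name and the 0.2 m default width are illustrative:

```swift
import SceneKit
import UIKit

func makePopupNode(from view: UIView, width: CGFloat = 0.2) -> SCNNode {
    // Preserve the view's aspect ratio in scene units (meters).
    let aspect = view.bounds.height / max(view.bounds.width, 1)
    let plane = SCNPlane(width: width, height: width * aspect)
    plane.firstMaterial?.diffuse.contents = view.layer  // or snapshot to a UIImage
    plane.firstMaterial?.isDoubleSided = true

    let node = SCNNode(geometry: plane)
    // Keep the pop-up facing the camera as the user moves.
    node.constraints = [SCNBillboardConstraint()]
    return node
}
```

After a hit test on the tapped object, position the returned node slightly offset from the hit node and add it as a child.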

iOS11 ARKit: Can ARKit also capture the Texture of the user's face?

ⅰ亾dé卋堺 submitted on 2019-11-30 05:29:39
I read the whole documentation on all ARKit classes up and down, and I don't see any place that describes the ability to actually get the texture of the user's face. ARFaceAnchor contains the ARFaceGeometry (topology and geometry comprised of vertices) and the BlendShapeLocation array (coordinates allowing manipulation of individual facial traits by manipulating geometric math on the user face's vertices). But where can I get the actual texture of the user's face? For example: the actual skin tone / color / texture, facial hair, and other unique traits such as scars or birthmarks? Or is this not possible at
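ARKit itself provides no ready-made face texture; the raw camera pixels are still available on each frame, and in principle one could project the face geometry's vertices into that image (via `ARCamera.projectPoint`) to build a texture — that projection is the hard part and is omitted here. A minimal sketch of the starting point, grabbing the camera image:

```swift
import ARKit

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // The raw camera image (YCbCr) that the face geometry could be
    // projected into to sample skin color — the projection itself is
    // not shown here.
    let pixelBuffer: CVPixelBuffer = frame.capturedImage
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    print("camera image \(width)x\(height)")
}
```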

ARKit and Vuforia - marker recognition

不羁岁月 submitted on 2019-11-30 05:23:10
I'm working on an iOS app. I need to recognize a marker (most likely it will be a QR code) and place some 3D content over it using ARKit. I was thinking about a combination of Vuforia and ARKit. Is it possible to use Vuforia only to recognize the marker and get its position, and then "pass" this data to ARKit? I need to recognize the marker in order to select the corresponding 3D content. I need to get the position of the marker only once, in order to place the 3D content there; after that I want to use ARKit for tracking. Is it possible? Is there another solution for marker recognition which can be
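One Vuforia-free alternative worth noting: Apple's Vision framework can find a QR code in the ARKit camera feed, and a hit test at the code's observed position then yields a world transform for placing content. A sketch under those assumptions (coordinate-flip details and error handling simplified — Vision's origin is bottom-left, ARFrame hit tests use normalized image coordinates with a top-left origin):

```swift
import ARKit
import Vision

func detectQRCode(in frame: ARFrame) {
    let request = VNDetectBarcodesRequest { request, _ in
        guard let code = request.results?.first as? VNBarcodeObservation else { return }
        // Flip y to convert Vision's normalized coordinates for ARFrame.hitTest.
        let center = CGPoint(x: code.boundingBox.midX,
                             y: 1 - code.boundingBox.midY)
        if let hit = frame.hitTest(center, types: [.featurePoint]).first {
            // Column 3 of the world transform is the marker's world position;
            // use the payload to pick which 3D content to place there.
            print("QR '\(code.payloadStringValue ?? "?")' at \(hit.worldTransform.columns.3)")
        }
    }
    try? VNImageRequestHandler(cvPixelBuffer: frame.capturedImage).perform([request])
}
```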

Get camera field of view in iOS 11 ARKit

一世执手 submitted on 2019-11-30 02:19:17
I'm using an ARSCNView from ARKit to display a live video feed from the camera on the iPad. I have the ARSCNView object set up exactly as in Xcode's Augmented Reality App template. I was wondering if there is a way to get the field of view of the camera?

    @IBOutlet var sceneView: ARSCNView!

    func start() {
        sceneView.delegate = self
        sceneView.session.run(ARWorldTrackingConfiguration())
        // Retrieve camera FOV here
    }

There are a couple of ways to go here, and a possible false start to beware of. ⚠️ ARKit + SceneKit (incorrect): If you're already working with ARKit via SceneKit (ARSCNView), you might
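One route that avoids the SceneKit pitfall entirely: derive the horizontal field of view from the camera intrinsics that ARKit publishes on every frame, using the pinhole relation fov = 2 · atan(w / 2fx). A sketch (the function name is illustrative):

```swift
import ARKit

func horizontalFOV(of frame: ARFrame) -> Float {
    let intrinsics = frame.camera.intrinsics            // 3x3, column-major
    let imageWidth = Float(frame.camera.imageResolution.width)
    let fx = intrinsics.columns.0.x                     // focal length in pixels
    return 2 * atan(imageWidth / (2 * fx)) * 180 / .pi  // degrees
}
```

Note this is the FOV of the captured image, which may be cropped relative to what the view displays on screen.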

Convert matrix_float4x4 to x y z space

微笑、不失礼 submitted on 2019-11-30 00:35:09
I'm using ARKit and trying to get the position of the camera as a rotation and (x, y, z) coordinates in real-world space. All I can manage to get is a matrix_float4x4, which I don't really understand, and Euler angles only show the rotation. Here's what I currently have:

    let transform = sceneView.session.currentFrame?.camera.transform
    let eulerAngles = sceneView.session.currentFrame?.camera.eulerAngles

Here's the output I'm getting:

    eulerAngles: float3(-0.694798, -0.0866041, -1.68845)
    transform: __C.simd_float4x4(columns: (float4(-0.171935, -0.762872, 0.623269, 0.0), float4(0.982865, -0.0901692
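Reading that matrix is simpler than it looks: the fourth column holds the camera's (x, y, z) position in world space, while `eulerAngles` already covers the rotation. A sketch (the helper name is illustrative):

```swift
import ARKit

func cameraPosition(in sceneView: ARSCNView) -> SIMD3<Float>? {
    guard let transform = sceneView.session.currentFrame?.camera.transform else {
        return nil
    }
    // Column 3 of the 4x4 transform is the translation: x, y, z in meters
    // relative to the session's world origin.
    return SIMD3<Float>(transform.columns.3.x,
                        transform.columns.3.y,
                        transform.columns.3.z)
}
```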

ARKit save object position and see it in any next session

 ̄綄美尐妖づ submitted on 2019-11-30 00:05:25
问题 (Question): I am working on a project using ARKit. I need to save an object's position, and I want to see it on my next application launch wherever it was. For example, in my office I attached some text to a door, went back home, and the next day I wish to see that text in the place where it was. Is this possible in ARKit?

回答1 (Answer 1): In iOS 12: Yes! "ARKit 2", aka ARKit for iOS 12, adds a set of features Apple calls "world map persistence and sharing". You can take everything ARKit knows about its local
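The persistence flow the answer describes can be sketched with the iOS 12 API: serialize the session's ARWorldMap (saved anchors included) to disk, then hand it to the next session's configuration. The file URL handling here is illustrative, and the anchors only reappear once ARKit relocalizes against the restored map:

```swift
import ARKit

func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: url)
    }
}

func restoreWorldMap(into session: ARSession, from url: URL) {
    guard let data = try? Data(contentsOf: url),
          let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data)
    else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map  // previous anchors return after relocalization
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```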