
ARKit - Projection of ARAnchor to 2D space

Submitted by 早过忘川 on 2019-11-29 21:44:45
Question: I am trying to project an ARAnchor to 2D space, but I am facing an orientation issue. Below is my function to project the top-left, top-right, bottom-left, and bottom-right corner positions to 2D space:

```swift
/// Returns the projection of an `ARImageAnchor` from the 3D world space
/// detected by ARKit into the 2D space of a view rendering the scene.
///
/// - Parameter from: An Anchor instance for projecting.
/// - Returns: An optional `CGRect` corresponding to the `ARImageAnchor` projection.
internal
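A sketch of what such a projection function might look like, assuming an `ARSCNView` is the rendering view (the extension and its names below are illustrative, not the asker's actual code). Note that an image anchor's extent lies in its local x/z plane, which is a frequent source of the orientation issue mentioned above:

```swift
import ARKit
import SceneKit

extension ARSCNView {
    /// Projects the four corners of an `ARImageAnchor`'s physical extent
    /// into the view's 2D coordinate space and returns their bounding box.
    func projection(of anchor: ARImageAnchor) -> CGRect? {
        let size = anchor.referenceImage.physicalSize
        let (w, h) = (Float(size.width) / 2, Float(size.height) / 2)
        // Corner offsets in the anchor's local space: the detected image
        // lies flat in the anchor's x/z plane, not its x/y plane.
        let localCorners: [simd_float4] = [
            simd_float4(-w, 0, -h, 1), simd_float4(w, 0, -h, 1),
            simd_float4(-w, 0,  h, 1), simd_float4(w, 0,  h, 1),
        ]
        // Transform each corner to world space, then project to view space.
        let points = localCorners.map { corner -> CGPoint in
            let world = anchor.transform * corner
            let projected = projectPoint(SCNVector3(world.x, world.y, world.z))
            return CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
        }
        let xs = points.map { $0.x }, ys = points.map { $0.y }
        guard let minX = xs.min(), let maxX = xs.max(),
              let minY = ys.min(), let maxY = ys.max() else { return nil }
        return CGRect(x: minX, y: minY, width: maxX - minX, height: maxY - minY)
    }
}
```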

ARKit 1.5 how to get the rotation of a vertical plane

Submitted by 依然范特西╮ on 2019-11-29 19:19:56
Question: I'm experimenting with the vertical plane, and I'm trying to place a node on a wall with the correct rotation based on that vertical plane. Here is the ARHitTestResult of the vertical plane that gets tapped:

```swift
let hitLocation = sceneView.hitTest(touchPoint, types: .existingPlaneUsingExtent)
```

I've tried the following:

```swift
let hitRotation = hitLocation.first?.worldTransform.columns.2
```

and

```swift
let anchor = hitLocation.first?.anchor
let hitRotation = anchor?.transform.columns.2
```

Neither one of them seems to
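One commonly suggested approach (a sketch, assuming the hit test succeeded) is to adopt the hit's full world transform rather than a single matrix column, then tip the node so its geometry faces out of the wall:

```swift
import ARKit
import SceneKit

func place(_ node: SCNNode, on hit: ARHitTestResult) {
    // Use the full 4x4 world transform of the hit. A single column
    // (e.g. columns.2) is only one basis vector, not a usable rotation.
    node.simdTransform = hit.worldTransform
    // Plane anchors orient their local y-axis along the plane normal,
    // so for an SCNPlane (which faces +z) rotate back by 90 degrees.
    node.eulerAngles.x += -.pi / 2
}
```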

ARKit –Drop a shadow of 3D object on the plane surface

Submitted by 别等时光非礼了梦想. on 2019-11-29 16:42:59
This is the function I use to display an object on the plane surface:

```swift
private func loadScene(path: String) -> SCNNode {
    let spotLight = SCNLight()
    spotLight.type = .spot   // was .probe, which ignores the spot angles and casts no shadows
    spotLight.spotInnerAngle = 30.0
    spotLight.spotOuterAngle = 80.0
    spotLight.castsShadow = true

    let result = SCNNode()
    let lightNode = SCNNode()
    lightNode.light = spotLight
    lightNode.position = SCNVector3(-10.0, 20.0, 10.5)
    result.addChildNode(lightNode)   // was `result.addChildNode(result)`, which adds a node to itself

    let scene = SCNScene(named: path)!
    for node in scene.rootNode.childNodes {
        result.addChildNode(node)
    }
    return result
}
```

I want to display a shadow on the plane surface like this image. When I set
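Beyond fixing the light type, the shadow's look usually needs tuning on the light itself. A sketch of commonly adjusted settings (the values are illustrative, not from the question):

```swift
import SceneKit
import UIKit

func makeShadowCastingSpot() -> SCNNode {
    let light = SCNLight()
    light.type = .spot
    light.spotInnerAngle = 30
    light.spotOuterAngle = 80
    light.castsShadow = true
    light.shadowRadius = 8                                    // blur for a softer shadow edge
    light.shadowColor = UIColor.black.withAlphaComponent(0.6) // lighter than solid black
    light.shadowSampleCount = 16                              // reduces banding artifacts

    let node = SCNNode()
    node.light = light
    node.position = SCNVector3(0, 2, 0)
    node.eulerAngles.x = -.pi / 2   // aim the light straight down at the plane
    return node
}
```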

Understand coordinate spaces in ARKit

Submitted by 廉价感情. on 2019-11-29 09:38:05
Question: I've read all of Apple's guides about ARKit and watched a WWDC video, but I can't understand how the coordinate systems bound to the real world, the device, and a 3D scene connect to each other. I can add an object, for example an SCNPlane:

```swift
let stripe = SCNPlane(width: 0.005, height: 0.1)
let stripeNode = SCNNode(geometry: stripe)
scene.rootNode.addChildNode(stripeNode)
```

This will produce a white stripe, which will be oriented vertically, no matter how the device is oriented at that moment.
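The three spaces are linked by transforms: the session fixes a world origin where tracking starts, each frame's camera transform locates the device within that world, and an ARSCNView maps the scene's root node onto world space one-to-one. A sketch of using those transforms to place a node half a metre in front of the device (assuming a running session):

```swift
import ARKit
import SceneKit

func placeStripe(in sceneView: ARSCNView) {
    guard let frame = sceneView.session.currentFrame else { return }
    let stripe = SCNPlane(width: 0.005, height: 0.1)
    let stripeNode = SCNNode(geometry: stripe)
    // An offset in camera-local coordinates: -z is the direction the
    // device's camera is looking, so this is 0.5 m straight ahead.
    var offset = matrix_identity_float4x4
    offset.columns.3.z = -0.5
    // camera.transform maps device/camera space into ARKit world space;
    // the scene's rootNode shares that same world space.
    stripeNode.simdTransform = frame.camera.transform * offset
    sceneView.scene.rootNode.addChildNode(stripeNode)
}
```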

Xcode simd - issue with Translation and Rotation Matrix Example

Submitted by 十年热恋 on 2019-11-29 08:58:46
Not only is using column-major rather than row-major order counter-intuitive; Apple's documentation on "Working with Matrices" further exacerbates the confusion with its examples of "constructing" a "Translate Matrix" and a "Rotation Matrix" in 2D.

Translate Matrix

Per Apple's documentation:

Translate

A translate matrix takes the following form:

```
 1   0   0
 0   1   0
 tx  ty  1
```

The simd library provides constants for identity matrices (matrices with ones along the diagonal and zeros elsewhere). The 3 x 3 Float identity matrix is matrix_identity_float3x3. The following function returns a simd_float3x3 matrix using the
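The key point is that simd stores a matrix as an array of column vectors, so the row-by-row form printed in the documentation is easy to misread. A sketch of building the translate matrix column by column (the helper name is illustrative); with simd's column-vector convention (p' = M * p) the translation components live in the last column, which is why the printed form above, with tx and ty in the bottom row, looks transposed:

```swift
import simd

/// Builds the 3x3 homogeneous 2D translate matrix column by column.
func makeTranslationMatrix(tx: Float, ty: Float) -> simd_float3x3 {
    var matrix = matrix_identity_float3x3
    // `columns.2` is the third *column*, not the third row: writing
    // (tx, ty, 1) here puts the translation where M * p expects it.
    matrix.columns.2 = simd_float3(tx, ty, 1)
    return matrix
}
```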

ARKit: How can I add a UIView to ARKit Scene?

Submitted by 只谈情不闲聊 on 2019-11-29 08:47:03
Question: I am working on an AR project using ARKit. I want to add a UIView to the ARKit scene: when I tap on an object, I want to get information as a "pop-up" next to the object. This information is in a UIView. Is it possible to add this UIView to the ARKit scene? I set up this UIView as a scene, and what can I do then? Can I give it a node and then add it to the ARKit scene? If so, how does that work? Or is there another way? Thank you!

EDIT: Code of my SecondViewController:

```swift
class InformationViewController:
```
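One common workaround (a sketch, not the only option) is to snapshot the UIView into an image and use that image as the diffuse contents of an SCNPlane, which can then be attached as a node next to the tapped object:

```swift
import SceneKit
import UIKit

/// Renders a UIView into an image and returns a plane node showing it.
/// `width` is the plane's width in metres; height follows the view's aspect ratio.
func makePopupNode(from view: UIView, width: CGFloat = 0.2) -> SCNNode {
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    let image = renderer.image { context in
        view.layer.render(in: context.cgContext)
    }
    let aspect = view.bounds.height / view.bounds.width
    let plane = SCNPlane(width: width, height: width * aspect)
    plane.firstMaterial?.diffuse.contents = image
    plane.firstMaterial?.isDoubleSided = true
    return SCNNode(geometry: plane)
}
```

Adding an `SCNBillboardConstraint` to the returned node keeps the pop-up facing the camera; the snapshot is static, so it must be re-rendered if the view's contents change.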

Reliable access and modify captured camera frames under SceneKit

Submitted by 六月ゝ 毕业季﹏ on 2019-11-29 08:21:28
I am trying to add a B&W filter to the camera images of an ARSCNView, then render colored AR objects over it. I'm almost there with the following code, added to the beginning of `- (void)renderer:(id<SCNSceneRenderer>)aRenderer updateAtTime:(NSTimeInterval)time`:

```objectivec
CVPixelBufferRef bg = self.sceneView.session.currentFrame.capturedImage;
if (bg) {
    char *k1 = CVPixelBufferGetBaseAddressOfPlane(bg, 1);
    if (k1) {
        size_t x1 = CVPixelBufferGetWidthOfPlane(bg, 1);
        size_t y1 = CVPixelBufferGetHeightOfPlane(bg, 1);
        memset(k1, 128, x1 * y1 * 2);
    }
}
```

This works really fast on mobile, but here's the thing: sometimes a colored
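Writing into the captured buffer without locking it is one likely source of intermittent artifacts, since the buffer is shared with the renderer. A sketch of the same grey-out with locking and stride-safe sizing, in Swift:

```swift
import ARKit

/// Neutralises the chroma plane of an ARKit camera frame, turning it B&W.
func desaturate(_ frame: ARFrame) {
    let buffer = frame.capturedImage
    // Lock before touching plane memory; unlock on every exit path.
    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    if let chroma = CVPixelBufferGetBaseAddressOfPlane(buffer, 1) {
        let height = CVPixelBufferGetHeightOfPlane(buffer, 1)
        // Use bytes-per-row rather than width * 2: rows may be padded.
        let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(buffer, 1)
        // 128 is the neutral value for both Cb and Cr in the bi-planar
        // YCbCr format ARKit delivers, which removes all colour.
        memset(chroma, 128, bytesPerRow * height)
    }
}
```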

ARKit Unable to run the session, configuration is not supported on this device

Submitted by ﹥>﹥吖頭↗ on 2019-11-29 06:06:26
Using ARWorldTrackingSessionConfiguration, I get an error on an iPhone 6 Plus:

Unable to run the session, configuration is not supported on this device

What can it be?

Krunal: Here is Apple's documentation for ARKit and device support:

ARKit requires an iOS device with an A9 or later processor. To make your app available only on devices supporting ARKit, use the arkit key in the UIRequiredDeviceCapabilities section of your app's Info.plist. If augmented reality is a secondary feature of your app, use the isSupported property to determine whether the current device supports the session configuration you
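The iPhone 6 Plus has an A8 processor, which is below ARKit's A9 minimum, so world tracking is simply unavailable there. A sketch of the recommended runtime check (using ARWorldTrackingConfiguration, the shipping name of the class that was called ARWorldTrackingSessionConfiguration in early betas):

```swift
import ARKit

func startSession(on sceneView: ARSCNView) {
    guard ARWorldTrackingConfiguration.isSupported else {
        // On A8-and-older devices: fall back to a non-AR experience,
        // or to a 3-DoF AROrientationTrackingConfiguration if that suffices.
        return
    }
    sceneView.session.run(ARWorldTrackingConfiguration())
}
```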

SceneKit shadow on a transparent SCNFloor()

Submitted by 时光怂恿深爱的人放手 on 2019-11-29 04:28:32
I have a floor node on which I need to cast a shadow from a directional light. This node needs to be transparent (it is used in an AR environment). This works fine when I use ARKit, but the same setup using plain SceneKit shows no shadow or reflection. How can I cast a shadow in SceneKit like this? The problem in SceneKit is caused by the fact that I set sceneView.backgroundColor = .clear, but I need this behaviour in this app. Can this somehow be avoided? Sample code demonstrating the issue (works only on a device, not in the Simulator):

```swift
@IBOutlet weak var sceneView: SCNView! {
    didSet {
        sceneView.scene
```
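One approach that is often suggested for this (a sketch, with illustrative values) is to stop the floor from writing colour at all and switch the light to deferred shadows, so the shadow is composited in a separate pass even though the surface itself is invisible:

```swift
import SceneKit
import UIKit

func makeTransparentShadowFloor() -> (floorNode: SCNNode, lightNode: SCNNode) {
    let floor = SCNFloor()
    floor.reflectivity = 0
    // Invisible to the colour pass, but still rendered into the depth pass.
    floor.firstMaterial?.colorBufferWriteMask = []
    let floorNode = SCNNode(geometry: floor)

    let light = SCNLight()
    light.type = .directional
    light.castsShadow = true
    // Deferred shadows are composited separately, which is what allows
    // them to appear on a surface that writes no colour of its own.
    light.shadowMode = .deferred
    light.shadowColor = UIColor.black.withAlphaComponent(0.5)
    let lightNode = SCNNode()
    lightNode.light = light
    lightNode.eulerAngles.x = -.pi / 2   // shine straight down onto the floor
    return (floorNode, lightNode)
}
```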

iOS11 ARKit: Can ARKit also capture the Texture of the user's face?

Submitted by 谁说我不能喝 on 2019-11-29 04:07:12
Question: I have read the whole documentation on all the ARKit classes, up and down. I don't see any place that describes the ability to actually get the texture of the user's face. ARFaceAnchor contains the ARFaceGeometry (topology and geometry comprised of vertices) and the BlendShapeLocation array (coordinates allowing manipulation of individual facial traits by applying geometric math to the vertices of the user's face). But where can I get the actual texture of the user's face? For example: the actual skin tone / color
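ARKit exposes no dedicated face-texture API; a frequently discussed approximation is to texture the face mesh with the camera image itself. A rough sketch (names illustrative; correctly mapping the image onto the mesh still requires projecting each vertex into the image, which this sketch leaves out):

```swift
import ARKit
import SceneKit

/// Paints the camera image onto an ARSCNFaceGeometry-backed node.
func textureFace(node: SCNNode, with frame: ARFrame) {
    guard let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }
    let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
    let context = CIContext()
    // Convert the captured YCbCr buffer to a CGImage SceneKit can consume.
    if let cgImage = context.createCGImage(ciImage, from: ciImage.extent) {
        faceGeometry.firstMaterial?.diffuse.contents = cgImage
    }
}
```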