arkit

ARKit: placing an object on a plane doesn't work properly

这一生的挚爱 submitted on 2019-12-10 11:07:52
Question: I am learning ARKit and trying to place an object on a detected plane, but it doesn't work properly and there is a gap between the plane and the 3D object. Here's my code for the plane detection:

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        position = SCNVector3Make(anchor.transform.columns.3.x,
                                  anchor.transform.columns.3.y,
                                  anchor.transform.columns.3.z)
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        let plane = SCNPlane(width:
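The gap usually comes from placing content at the anchor's raw transform while the model's pivot sits at its center. A minimal sketch of one common fix, assuming the object is added under the anchor's node (objectNode and its geometry are illustrative):

    import ARKit

    // Sketch: position content on the detected plane, not at the raw anchor transform.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        let objectNode = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                                  length: 0.1, chamferRadius: 0))

        // Place the object at the plane's center, in the anchor node's coordinate space.
        objectNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)

        // Lift the object by half its height so its base, not its center, rests on the plane.
        let (minBound, maxBound) = objectNode.boundingBox
        objectNode.position.y += (maxBound.y - minBound.y) / 2

        // Adding to `node` (the anchor's node) keeps the object attached to the
        // plane as ARKit refines the anchor.
        node.addChildNode(objectNode)
    }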

Can ARKit display a WKWebView?

让人想犯罪 __ submitted on 2019-12-10 11:07:51
Question: I tried using ARKit to show a WKWebView, and all I got is a page that shows the background but no foreground. The page can be scrolled, though. I attached the Apple web page as a screenshot. Here's the code:

    DispatchQueue.main.async {
        let webView = WKWebView(frame: CGRect(x: 0, y: 0, width: 640, height: 480))
        let request = URLRequest(url: URL(string: "https://www.apple.com")!)
        webView.load(request)
        plane.firstMaterial?.diffuse.contents = webView
    }

Does that mean ARKit cannot display a web page? I am using iOS 12
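SceneKit does not reliably render a live UIView such as WKWebView as material contents. A commonly used workaround is to snapshot the web view into a UIImage and use the image as the texture; a sketch, where the timer interval and function name are assumptions (note that this sacrifices interactivity):

    import WebKit
    import SceneKit

    // Sketch: periodically snapshot a WKWebView into a UIImage and use it as the
    // plane's texture. Assumes `webView` has loaded content and `plane` is the
    // SCNPlane from the question.
    func startMirroring(webView: WKWebView, onto plane: SCNPlane) {
        Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { _ in
            webView.takeSnapshot(with: nil) { image, _ in
                guard let image = image else { return }
                // A UIImage (unlike a live view) is well supported as material contents.
                plane.firstMaterial?.diffuse.contents = image
            }
        }
    }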

Difficulty getting depth of face landmark points from 2D regions on iPhone X (SceneKit/ARKit app)

岁酱吖の submitted on 2019-12-10 11:03:44
Question: I'm running face landmark detection using the front-facing camera on the iPhone X, and am trying very hard to get the 3D points of the face landmarks (VNFaceLandmarkRegion2D gives image coordinates X and Y only). I've been trying to use either ARSCNView.hitTest or ARFrame.hitTest, but have so far been unsuccessful. I think my error may be in converting the initial landmark points to the correct coordinate system. I've tried quite a few permutations, but currently, based on my research, this is what I've come up
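One plausible conversion path is sketched below, under the assumption that the landmark points are normalized to the face observation's bounding box with Vision's lower-left origin; the coordinate flips are exactly where such code tends to go wrong, so treat this as a starting point rather than a definitive implementation:

    import ARKit
    import Vision

    // Sketch: map a Vision landmark point into view coordinates, then hit-test
    // the scene for a 3D position. Assumes portrait orientation and that
    // `faceBoundingBox` is the observation's normalized boundingBox.
    func worldPosition(for landmarkPoint: CGPoint,
                       faceBoundingBox: CGRect,
                       in sceneView: ARSCNView) -> SCNVector3? {
        guard let frame = sceneView.session.currentFrame else { return nil }

        // Landmark points are normalized to the face bounding box; convert to
        // image-normalized coordinates, flipping y (Vision uses a lower-left origin).
        let imagePoint = CGPoint(
            x: faceBoundingBox.origin.x + landmarkPoint.x * faceBoundingBox.width,
            y: 1 - (faceBoundingBox.origin.y + landmarkPoint.y * faceBoundingBox.height))

        // Map image-normalized coordinates into the view's coordinate space.
        let viewportSize = sceneView.bounds.size
        let transform = frame.displayTransform(for: .portrait, viewportSize: viewportSize)
        let normalized = imagePoint.applying(transform)
        let viewPoint = CGPoint(x: normalized.x * viewportSize.width,
                                y: normalized.y * viewportSize.height)

        // Feature-point hit tests are sparse on faces; the result may be empty.
        return sceneView.hitTest(viewPoint, types: .featurePoint).first.map {
            SCNVector3($0.worldTransform.columns.3.x,
                       $0.worldTransform.columns.3.y,
                       $0.worldTransform.columns.3.z)
        }
    }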

Take a video with ARKit

ぐ巨炮叔叔 submitted on 2019-12-10 10:20:57
Question: Hello community, I am trying to build an app with Swift 4 and the great upcoming ARKit framework, but I am stuck. I need to record a video with the framework, or at least capture a UIImage sequence, but I don't know how. This is what I've tried: in ARKit you have a session which tracks your world. This session has a capturedImage instance where you can get the current image. So I created a Timer which appends the capturedImage every 0.1 s to a list. This would work for me, but if I start the Timer by clicking a
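Instead of polling with a Timer, ARKit hands you every camera frame through ARSessionDelegate. A minimal sketch that collects the frames (writing them out with AVAssetWriter is omitted for brevity; note that holding on to many capturedImage buffers can starve the camera's pixel-buffer pool):

    import ARKit

    // Sketch: receive every camera frame via the session delegate instead of a Timer.
    // Assumes an instance of this class is set as `sceneView.session.delegate`.
    class FrameRecorder: NSObject, ARSessionDelegate {
        private(set) var recordedBuffers: [CVPixelBuffer] = []
        var isRecording = false

        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            guard isRecording else { return }
            // capturedImage is the raw camera pixel buffer for this frame. For a
            // real recording, feed these into an AVAssetWriterInputPixelBufferAdaptor
            // instead of accumulating them in memory.
            recordedBuffers.append(frame.capturedImage)
        }
    }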

ARKit estimatedVerticalPlane hit test get plane rotation

杀马特。学长 韩版系。学妹 submitted on 2019-12-10 10:17:01
Question: I am using ARKit to detect walls at runtime; I use a hit test of type .estimatedVerticalPlane when some point of the screen is touched. I am trying to apply a Y rotation to the node corresponding to the detected plane's orientation. I want to compute the rotation in:

    private func computeYRotationForHitLocation(hitTestResult: ARHitTestResult) -> Float {
        guard hitTestResult.type == .estimatedVerticalPlane else { return 0.0 }
        // guard let planeAnchor = hitTestResult.anchor as? ARPlaneAnchor else {
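For an estimated plane there is no anchor to read, but the hit result's worldTransform orientation encodes the estimated plane's orientation. A sketch that extracts the yaw from the rotation columns, assuming the transform is a pure Y rotation relative to gravity:

    import ARKit

    // Sketch: derive the rotation around the world Y axis from the hit result's
    // world transform.
    private func computeYRotationForHitLocation(hitTestResult: ARHitTestResult) -> Float {
        guard hitTestResult.type == .estimatedVerticalPlane else { return 0.0 }

        let transform = hitTestResult.worldTransform
        // For a rotation about Y, the local z column is (sin θ, 0, cos θ).
        return atan2(transform.columns.2.x, transform.columns.2.z)
    }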

ARKit - place a SCNPlane between 2 vector points on a plane in Swift 3 [duplicate]

﹥>﹥吖頭↗ submitted on 2019-12-10 10:14:17
Question: This question already has an answer here: Scenekit shape between 4 points (1 answer). Closed last year. Similar to some of the measuring apps you can see being demonstrated in ARKit, I have a plane with 2 marker nodes on it and a line drawn between the 2. What I need, though, is an SCNPlane between the 2. So, if your original was the floor and you put a marker on either side of a wall, you could represent the physical wall with an SCNPlane in your AR world. Currently I'm placing the line with the
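A sketch of one way to span two floor points with a vertical SCNPlane: use the distance as the width, the midpoint as the position, and a Y rotation derived from the direction between the points (the height parameter is an assumption):

    import SceneKit

    // Sketch: build a vertical SCNPlane spanning two points on the floor.
    // `height` (the wall height) is an assumed parameter.
    func wallPlane(from start: SCNVector3, to end: SCNVector3, height: CGFloat) -> SCNNode {
        let dx = end.x - start.x
        let dz = end.z - start.z
        let width = CGFloat(sqrt(dx * dx + dz * dz))

        let plane = SCNPlane(width: width, height: height)
        plane.firstMaterial?.isDoubleSided = true

        let node = SCNNode(geometry: plane)
        // Center the plane between the two markers, raised by half its height.
        node.position = SCNVector3((start.x + end.x) / 2,
                                   start.y + Float(height) / 2,
                                   (start.z + end.z) / 2)
        // Rotate around Y so the plane's width axis runs along the start-to-end direction.
        node.eulerAngles.y = atan2(-dz, dx)
        return node
    }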

Why is SCNNode “jiggling” when dropped onto SCNPlane?

眉间皱痕 submitted on 2019-12-10 07:39:08
Question: I have an SCNPlane that is added to the scene when a sufficient area is detected for a horizontal surface. The plane appears to be placed in a correct spot, according to the floor/table it's being placed on. The problem is that when I drop an SCNNode (this has been consistent whether it was a box, a pyramid, a 3D model, etc.) onto the plane, it will eventually find a spot to land and 99% of the time start jiggling all crazy. Very few times has it just landed and not moved at all. I also think this may be caused by
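The jitter is commonly attributed to resting dynamic bodies on an infinitely thin SCNPlane. A sketch of one frequently suggested fix: back the visual plane with a thin static box for physics and damp the bounce (the extents here are illustrative):

    import SceneKit

    // Sketch: give the physics solver a real volume to rest objects on.
    func makeFloorPhysics(for planeNode: SCNNode, width: CGFloat, length: CGFloat) {
        // An SCNPlane has zero thickness, which the solver handles poorly;
        // a thin box behaves much more stably as a static floor.
        let boxShape = SCNBox(width: width, height: 0.01, length: length, chamferRadius: 0)
        let shape = SCNPhysicsShape(geometry: boxShape, options: nil)
        planeNode.physicsBody = SCNPhysicsBody(type: .static, shape: shape)
        planeNode.physicsBody?.restitution = 0.0   // no bounce
        planeNode.physicsBody?.friction = 1.0      // resist sliding
    }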

Render a 3D model (hair) with semi-transparent texture in SceneKit?

微笑、不失礼 submitted on 2019-12-10 00:19:40
Question: I'm trying to render a 3D model in SceneKit, but it looks incorrect. For example, this model (it's an SCN file with a texture, and you can reproduce it in your Xcode). In the Xcode Scene Editor it is rendered with Transparency -> Mode -> Dual Layer and Double Sided = true. If I turn off the "Write depth" option there are still some issues, because I see only "the lowest layer" of the haircut. I think this should be possible. How do I do it right?

Answer 1: The reason that in your
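A sketch of material settings that often work for layered hair cards, assuming hairNode holds the haircut geometry; the right combination depends on the asset, so treat these values as a starting point:

    import SceneKit

    // Sketch: common SCNMaterial setup for semi-transparent hair.
    func configureHairMaterial(on hairNode: SCNNode) {
        guard let material = hairNode.geometry?.firstMaterial else { return }

        material.isDoubleSided = true
        // Dual-layer draws back faces before front faces, which helps layered hair cards.
        material.transparencyMode = .dualLayer
        // Writing depth for mostly-opaque strands avoids the "lowest layer only" look,
        // at the cost of harder edges where alpha is partial.
        material.writesToDepthBuffer = true
        material.blendMode = .alpha

        // Drawing hair after the opaque geometry reduces sorting artifacts against the head.
        hairNode.renderingOrder = 100
    }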

Color keying video with GPUImage on an SCNPlane in ARKit

南楼画角 submitted on 2019-12-10 00:02:54
Question: I am trying to play a video showing transparency in an ARSCNView. An SCNPlane is used as a projection surface for the video, and I am trying to color-key this video with GPUImage. I followed this example here. Unfortunately, I have not found a way to project that video back onto my videoSpriteKitNode, because the filter is rendered in a GPUImageView, and the SKVideoNode takes an AVPlayer. I am not sure whether what I am trying to do is possible at all, so if anyone could share their insight I'd
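One alternative worth considering is to skip GPUImage and do the keying inside SceneKit with a shader modifier, so the AVPlayer can be assigned directly as the material's diffuse contents. A sketch, where the key color and thresholds are assumptions to tune per clip:

    import SceneKit
    import AVFoundation

    // Sketch: chroma-key the video in a fragment shader modifier, bypassing GPUImage.
    let chromaKeyFragment = """
    #pragma transparent
    #pragma body
    vec3 keyColor = vec3(0.0, 1.0, 0.0);              // green screen
    float dist = distance(_output.color.rgb, keyColor);
    float alpha = smoothstep(0.2, 0.4, dist);          // soft edge around the key color
    _output.color = vec4(_output.color.rgb * alpha, alpha); // premultiplied alpha
    """

    func applyChromaKey(to plane: SCNPlane, player: AVPlayer) {
        // SceneKit accepts an AVPlayer directly as material contents.
        plane.firstMaterial?.diffuse.contents = player
        plane.firstMaterial?.shaderModifiers = [.fragment: chromaKeyFragment]
    }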

How do you play a video with alpha channel using AVFoundation?

家住魔仙堡 submitted on 2019-12-09 12:11:44
Question: I have an AR application which uses SceneKit and imports a video onto the scene using AVPlayer, adding it as a child node of an SKVideoNode. The video is visible as it is supposed to be, but the transparency in the video is not achieved. Code as follows:

    let spriteKitScene = SKScene(size: CGSize(width: self.sceneView.frame.width,
                                              height: self.sceneView.frame.height))
    spriteKitScene.scaleMode = .aspectFit
    guard let fileURL = Bundle.main.url(forResource: "Triple_Tap_1", withExtension:
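H.264 carries no alpha channel, so any transparency in the source file is discarded at encode time. If the asset can be re-encoded, HEVC with alpha (iOS 13+) plays through AVPlayer with transparency intact; a sketch, where "video_alpha.mov" is a hypothetical asset name:

    import SceneKit
    import SpriteKit
    import AVFoundation

    // Sketch: play a video whose encoding actually carries alpha (e.g. HEVC with
    // alpha). The returned SKScene can be assigned as SceneKit material contents.
    func makeTransparentVideoScene(sceneSize: CGSize) -> SKScene? {
        guard let url = Bundle.main.url(forResource: "video_alpha",
                                        withExtension: "mov") else { return nil }

        let player = AVPlayer(url: url)
        let videoNode = SKVideoNode(avPlayer: player)
        videoNode.position = CGPoint(x: sceneSize.width / 2, y: sceneSize.height / 2)
        videoNode.size = sceneSize

        let scene = SKScene(size: sceneSize)
        scene.scaleMode = .aspectFit
        // A clear background lets the video's alpha show through when the scene
        // is used as material contents on an SCNPlane.
        scene.backgroundColor = .clear
        scene.addChild(videoNode)
        player.play()
        return scene
    }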