SceneKit

How to draw camera video as a background on iOS using Swift SceneKit?

你离开我真会死。 · Submitted on 2020-01-01 03:20:12
Question: I am trying to develop an augmented reality app using Swift and SceneKit on iOS. Is there a way to draw the video captured by the device camera as the background of the scene?

Answer 1: This worked for me; I used AVFoundation to capture the video input of the device camera:

    let captureSession = AVCaptureSession()
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    if let videoDevice = AVCaptureDevice…
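On iOS 11 and later there is also a shorter route that skips the preview layer entirely: SceneKit's `SCNMaterialProperty.contents` accepts an `AVCaptureDevice`, and the framework drives the capture session itself. A minimal sketch, assuming a view controller that owns an `SCNView` named `sceneView`:

```swift
import SceneKit
import AVFoundation

// Sketch: use the rear camera feed as the SCNScene background (iOS 11+).
// The function name and sceneView parameter are illustrative, not from the post.
func installCameraBackground(on sceneView: SCNView) {
    let scene = sceneView.scene ?? SCNScene()
    sceneView.scene = scene
    // scene.background accepts an AVCaptureDevice directly;
    // SceneKit manages the underlying capture session.
    if let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                            for: .video,
                                            position: .back) {
        scene.background.contents = camera
    }
}
```

The AVFoundation approach in the answer above still works and gives more control (e.g. access to the pixel buffers), but for a plain live background the one-liner is usually enough.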

How to use SceneKit to display a colorful point cloud using custom SCNGeometry?

﹥>﹥吖頭↗ · Submitted on 2020-01-01 03:10:10
Question: I'm currently working on OS X software, written in Swift, that displays a point cloud read from a PLY file. The file is converted to an array of PointCloudVertex, which is defined as follows:

    struct PointCloudVertex {
        let x: Double, y: Double, z: Double
        let r: Double, g: Double, b: Double
    }

Then I try to build an SCNGeometry from the array:

    var vertices = [PointCloudVertex]()
    for p in points {
        let v = PointCloudVertex(x: p[0], y: p[1], z: p[2], r: p[3], g: p[4], b: p[5])
        vertices.append(v)
    }
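The excerpt is cut off before the geometry is actually built. A common way to finish the job is to put positions and colors in one interleaved buffer and describe each with its own `SCNGeometrySource`; the sketch below assumes `Float` components (which SceneKit renders directly) rather than the `Double`s in the question, and all names are illustrative:

```swift
import SceneKit

// Sketch: a colored point cloud from interleaved per-vertex data.
struct Vertex {
    var x, y, z: Float    // position
    var r, g, b: Float    // color
}

func pointCloudGeometry(from vertices: [Vertex]) -> SCNGeometry {
    let data = vertices.withUnsafeBufferPointer { Data(buffer: $0) }
    let stride = MemoryLayout<Vertex>.stride

    let positions = SCNGeometrySource(
        data: data, semantic: .vertex, vectorCount: vertices.count,
        usesFloatComponents: true, componentsPerVector: 3,
        bytesPerComponent: MemoryLayout<Float>.size,
        dataOffset: 0, dataStride: stride)

    let colors = SCNGeometrySource(
        data: data, semantic: .color, vectorCount: vertices.count,
        usesFloatComponents: true, componentsPerVector: 3,
        bytesPerComponent: MemoryLayout<Float>.size,
        dataOffset: MemoryLayout<Float>.size * 3, dataStride: stride)

    // .point elements need no index data; one primitive per vertex.
    let element = SCNGeometryElement(
        data: nil, primitiveType: .point,
        primitiveCount: vertices.count, bytesPerIndex: 0)

    return SCNGeometry(sources: [positions, colors], elements: [element])
}
```

Because both sources share one `Data` buffer, the `dataOffset`/`dataStride` pair is what tells SceneKit where each attribute lives inside a vertex.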

How to improve People Occlusion in ARKit 3.0

爷,独闯天下 · Submitted on 2020-01-01 03:04:07
Question: We are working on a demo app that uses people occlusion in ARKit. Because we want to add videos to the final scene, we use SCNPlanes to render the video, with an SCNBillboardConstraint to ensure they face the right way. These videos are also partially transparent, using a custom shader on the SCNMaterial we apply (thus playing two videos at once). Now we have some issues where the people occlusion is very iffy (see image). The video we are using to test shows a woman with dark pants and a skirt…
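For context, the billboarded video plane described in the question can be set up roughly as follows; the URL, plane size, and function name are placeholders, and the custom transparency shader from the question is omitted:

```swift
import SceneKit
import AVFoundation

// Sketch: a video plane that always faces the camera.
func makeVideoPlane(url: URL) -> SCNNode {
    let player = AVPlayer(url: url)
    let plane = SCNPlane(width: 1.0, height: 0.56)
    plane.firstMaterial?.diffuse.contents = player   // AVPlayer is valid material contents
    plane.firstMaterial?.isDoubleSided = true

    let node = SCNNode(geometry: plane)
    let billboard = SCNBillboardConstraint()
    billboard.freeAxes = .Y   // rotate about Y only, stays upright
    node.constraints = [billboard]
    player.play()
    return node
}
```

People occlusion itself is switched on separately, on the session configuration, via `frameSemantics = .personSegmentationWithDepth`; the segmentation quality (the "iffy" edges the question mentions) is largely determined by ARKit's person matting, not by the plane setup.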

Set CALayer as SCNMaterial's diffuse contents

泪湿孤枕 · Submitted on 2020-01-01 02:44:10
Question: I've been searching all over the internet over the past couple of days, to no avail. Unfortunately, the Apple documentation on this specific issue is vague, and no sample code is available (at least, that's what I found). What seems to be the issue, you may ask? I'm trying to set a UIView's layer as the contents of the material that is used to render an iPhone model's screen (yep, trippy :P). The iPhone screen's UV mapping is set from 0 to 1, so there is no issue in mapping the…
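`SCNMaterialProperty.contents` does accept a `CALayer`, so the basic assignment is short; the sketch below is an assumed setup (function and parameter names are illustrative), and the `contentsTransform` flip is only needed if the texture comes out upside down due to the UV origin mismatch:

```swift
import SceneKit
import UIKit

// Sketch: render a UIView's backing layer onto a model's screen material.
func showView(_ view: UIView, on screenMaterial: SCNMaterial) {
    // Rasterize so SceneKit samples an up-to-date bitmap of the layer.
    view.layer.shouldRasterize = true
    screenMaterial.diffuse.contents = view.layer
    // Optional: flip vertically if the texture appears upside down.
    screenMaterial.diffuse.contentsTransform =
        SCNMatrix4Translate(SCNMatrix4MakeScale(1, -1, 1), 0, 1, 0)
}
```

One caveat worth hedging: SceneKit snapshots the layer rather than live-compositing it, so dynamic views may need periodic invalidation (e.g. `setNeedsDisplay()`) to stay current on the 3D screen.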

Check whether the ARReferenceImage is no longer visible in the camera's view

非 Y 不嫁゛ · Submitted on 2019-12-31 08:51:33
Question: I would like to check whether an ARReferenceImage is no longer visible in the camera's view. At the moment I can check whether the image's node is in the camera's frustum, but that test still passes when the ARReferenceImage is covered by another image or has been removed:

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let node = self.currentImageNode else { return }
        if let pointOfView = sceneView.pointOfView {
            let isVisible =…
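Two complementary checks can be combined here. The frustum test only says the node is inside the camera's viewing volume; `ARImageAnchor.isTracked` (available from iOS 12, when image tracking is enabled via `maximumNumberOfTrackedImages`) reports whether ARKit still actually sees the image, which catches the covered/removed case. A sketch, with illustrative names:

```swift
import ARKit
import SceneKit

// Sketch: the image counts as visible only if its node is in the frustum
// AND ARKit is still actively tracking the image anchor.
func imageStillVisible(in sceneView: ARSCNView,
                       node: SCNNode,
                       anchor: ARImageAnchor) -> Bool {
    guard let pointOfView = sceneView.pointOfView else { return false }
    let inFrustum = sceneView.isNode(node, insideFrustumOf: pointOfView)
    return inFrustum && anchor.isTracked
}
```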

CIFilter on SCNNode only works in Simulator

耗尽温柔 · Submitted on 2019-12-31 03:14:08
Question: Having a bit of a head-scratcher with the new filter features in SceneKit in iOS 9. To simplify the case as much as possible: if I add the line

    ship.filters = [CIFilter(name: "CIPixellate", withInputParameters: [kCIInputScaleKey: 30])!]

to the default SceneKit project, then build and run in the simulator, the ship pixellates as expected. If I build and run the same project on-device, the ship disappears. I've tried various permutations, multiple devices, and even several SceneKit demonstrations on…
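Regardless of the simulator/device discrepancy, the force-unwrapped one-liner in the question is fragile; a defensive version that checks filter creation and sets parameters separately looks like this (the function name and the `name` key choice are assumptions):

```swift
import SceneKit
import CoreImage

// Sketch: apply a Core Image filter to an SCNNode without force-unwrapping.
func pixellate(_ node: SCNNode, scale: Double = 30) {
    guard let filter = CIFilter(name: "CIPixellate") else {
        print("CIPixellate unavailable")
        return
    }
    filter.setValue(scale, forKey: kCIInputScaleKey)
    filter.name = "pixellate"   // key path used when animating filter parameters
    node.filters = [filter]
}
```

If the node still vanishes on-device with a valid filter, the usual suspects are GPU memory limits and filters that are not GPU-backed on that hardware; the simulator software-renders Core Image and so behaves differently.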

SceneKit: detecting that the user tapped an object

♀尐吖头ヾ · Submitted on 2019-12-30 11:47:47
Question: I recently started using SceneKit in iOS 8. I am having difficulty detecting whether the user has tapped or pressed on an object. Is there any way to do that?

Answer 1: See the documentation for the hitTest method. Call it from wherever you handle touch events to get a list of 3D scene objects/locations "under" a 2D screen point.

Answer 2: An easy way to get sample code that shows hitTest in action is to create a sample app using the Game template in Xcode 6. Create a new…
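The hit-test approach from Answer 1 can be sketched in a few lines; this assumes a view controller whose view is an `SCNView` and a tap gesture recognizer wired to this selector:

```swift
import SceneKit
import UIKit

// Sketch: resolve a tap to an SCNNode via SCNView.hitTest(_:options:).
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    guard let scnView = gesture.view as? SCNView else { return }
    let point = gesture.location(in: scnView)
    // Results are sorted front-to-back; .first is the nearest hit.
    if let first = scnView.hitTest(point, options: nil).first {
        print("Tapped node:", first.node.name ?? "<unnamed>")
    }
}
```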

Throttle CPU usage on background thread

好久不见. · Submitted on 2019-12-30 07:59:08
Question: I have a CPU-intensive task, and I want it to use less CPU even if that means taking more time. I'm loading a massive number of SCNNodes into a scene at start-up. It takes a lot of memory, and I'd like the work to proceed at a safe rate instead of lagging, or potentially crashing, my system. Here's the code I'm using to load the nodes:

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^(void) {
        NSLog(@"Start Loading Level");
        SCNNode *cameraNode = [SCNNode node];
        cameraNode…
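One way to throttle this kind of work, sketched in Swift rather than the question's Objective-C, is to load in small batches on a background-QoS queue and pause briefly between batches; batch size and delay are tuning knobs, and all names here are illustrative:

```swift
import SceneKit

// Sketch: batched, throttled node loading. Scene mutation happens on the
// main queue; the pause between batches bounds sustained CPU usage.
func loadNodes(_ makeNode: @escaping () -> SCNNode,
               count: Int,
               into parent: SCNNode,
               batchSize: Int = 50) {
    let queue = DispatchQueue.global(qos: .background)
    func loadBatch(from start: Int) {
        guard start < count else { return }
        let end = min(start + batchSize, count)
        let nodes = (start..<end).map { _ in makeNode() }
        DispatchQueue.main.async {
            nodes.forEach { parent.addChildNode($0) }
        }
        queue.asyncAfter(deadline: .now() + 0.05) { loadBatch(from: end) }
    }
    queue.async { loadBatch(from: 0) }
}
```

Using `.background` QoS already tells the scheduler to deprioritize the work; the explicit inter-batch delay additionally caps throughput, trading start-up time for responsiveness.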

Converting matrix_float3x3 rotation to SceneKit

随声附和 · Submitted on 2019-12-30 04:52:05
Question: I was experimenting with GameplayKit's GKAgent3D class to move an SCNNode within a scene. I was able to update the SCNNode with the agent's position, but not its rotation. The issue is that the agent's rotation is stored as a matrix_float3x3, which doesn't match any of the data types SceneKit uses for rotation. So what I'd like to know is: is there a simple function or method that can convert a rotation stored as matrix_float3x3 to one of SceneKit's data types?

Answer 1: To expand on…
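The simd library bridges this gap directly: `simd_quatf` has an initializer that takes a `simd_float3x3` rotation matrix, and `SCNNode` exposes `simdOrientation` (iOS 11+/macOS 10.13+). A minimal sketch:

```swift
import SceneKit
import simd

// Sketch: rotation matrix -> quaternion -> node orientation.
// Assumes the 3x3 matrix is a pure rotation (as GKAgent3D's is).
func apply(rotation: simd_float3x3, to node: SCNNode) {
    node.simdOrientation = simd_quatf(rotation)
}
```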

How to create a border for an SCNNode to indicate its selection in iOS 11 ARKit/SceneKit?

假如想象 · Submitted on 2019-12-30 04:24:08
Question: How do I draw a border to highlight an SCNNode and indicate to the user that the node is selected? In my project the user can place multiple virtual objects and can select any object at any time. Upon selection I should show the user a highlighted 3D object. Is there a way to achieve this directly, or to draw a border over the SCNNode?

Answer 1: You need to add a tap gesture recognizer to the sceneView:

    // add a tap gesture recognizer
    let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap(…
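Once the tapped node is known, one lightweight stand-in for a border is to tint the node's emission; a true outline generally requires an extra render pass (e.g. an `SCNTechnique`), which is beyond this sketch. Names below are illustrative:

```swift
import SceneKit
import UIKit

// Sketch: highlight a selected node by tinting the emission of every
// material in its hierarchy; reset to black to deselect.
func setHighlighted(_ node: SCNNode, _ highlighted: Bool) {
    node.enumerateHierarchy { child, _ in
        for material in child.geometry?.materials ?? [] {
            material.emission.contents = highlighted ? UIColor.systemYellow
                                                     : UIColor.black
        }
    }
}
```

Emission survives lighting changes (it glows independently of scene lights), which is why it reads clearly as a selection cue on placed AR objects.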