ARKit

SCNText - background “speech bubble”

Posted by 廉价感情. on 2019-12-11 05:07:02
Question: How can I insert a background (e.g. a "speech bubble" or a rectangle) behind an SCNText? Specifically, if I insert "Hello World" as SCNText (and then, obviously, as an SCNNode in the scene), how can I add a background for that text only? Would it be a UIImage inserted as an SCNNode at the same position as the "Hello World"? (Keep in mind that there is no such thing as a background of an SCNNode in SceneKit.) Answer 1: You could use an SCNPlane as another SCNNode, assign an SCNMaterial to its geometry, …
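
A minimal sketch of what that answer suggests, with a plain-color bubble and illustrative sizes; the plane is sized from the text's bounding box and parented to the text node so the two move together:

import SceneKit
import UIKit

let text = SCNText(string: "Hello World", extrusionDepth: 0.5)
text.font = UIFont.systemFont(ofSize: 4)
let textNode = SCNNode(geometry: text)

// Size the plane from the text's bounding box, plus some padding.
let (minVec, maxVec) = textNode.boundingBox
let padding: CGFloat = 1.0
let plane = SCNPlane(width: CGFloat(maxVec.x - minVec.x) + 2 * padding,
                     height: CGFloat(maxVec.y - minVec.y) + 2 * padding)
plane.cornerRadius = 1.0 // rounded corners for a "bubble" look
plane.firstMaterial?.diffuse.contents = UIColor.white

// Center the plane just behind the text (SCNText's origin is its lower left).
let planeNode = SCNNode(geometry: plane)
planeNode.position = SCNVector3((minVec.x + maxVec.x) / 2,
                                (minVec.y + maxVec.y) / 2,
                                minVec.z - 0.1)
textNode.addChildNode(planeNode)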

How to add a CIFilter to MTLTexture Using ARMatteGenerator?

Posted by ∥☆過路亽.° on 2019-12-11 04:46:43
Question: I am working off of Apple's sample project that uses the ARMatteGenerator to generate an MTLTexture that can be used as an occlusion matte in the people occlusion technology. I would like to determine how I could run the generated matte through a CIFilter. In my code, I am "filtering" the matte like so: func updateMatteTextures(commandBuffer: MTLCommandBuffer) { guard let currentFrame = session.currentFrame else { return } var targetImage: CIImage? alphaTexture = matteGenerator …
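
The excerpt is cut off, but the general shape of running a Metal texture through Core Image looks something like the hedged sketch below. The `device` and the writable `output` texture (created with .shaderWrite usage) are assumptions, and CIGaussianBlur merely stands in for whatever CIFilter is wanted:

import CoreImage
import Metal

let ciContext = CIContext(mtlDevice: device) // `device` is assumed

func filterMatte(_ matte: MTLTexture,
                 into output: MTLTexture,
                 commandBuffer: MTLCommandBuffer) {
    guard var image = CIImage(mtlTexture: matte, options: nil) else { return }
    // Any CIFilter can slot in here; Gaussian blur is just an illustration.
    image = image.applyingFilter("CIGaussianBlur",
                                 parameters: [kCIInputRadiusKey: 5.0])
    // Core Image encodes its work onto the same command buffer ARKit uses,
    // so the filtered matte is ready before the frame is composited.
    ciContext.render(image,
                     to: output,
                     commandBuffer: commandBuffer,
                     bounds: image.extent,
                     colorSpace: CGColorSpaceCreateDeviceGray())
}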

ARKit – Render objects further away than 1000 meters

Posted by ℡╲_俬逩灬. on 2019-12-11 04:08:19
Question: I'm trying to render objects further away than 1000 units. let box = SCNBox(width: 500, height: 500, length: 500, chamferRadius: 0) let boxNode = SCNNode(geometry: box) boxNode.position = SCNVector3(0, 0, -2000) sceneView.scene.rootNode.addChildNode(boxNode) From this answer I know that ARKit directly sets SCNCamera's projectionTransform. So is there any way to change this projectionTransform in order to render objects further away? Answer 1: In ARKit 2.0 / SceneKit (2018), if the distance from the ARCamera …
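
One possible workaround, sketched below under the assumption that `sceneView` is an ARSCNView and that this method sits in your ARSessionDelegate: rebuild the projection matrix each frame with a larger far clipping plane and hand it back to the camera.

import ARKit

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Recompute the projection with the far plane pushed past the default ~1000.
    let projection = frame.camera.projectionMatrix(
        for: .portrait,
        viewportSize: sceneView.bounds.size,
        zNear: 0.01,
        zFar: 10_000)
    sceneView.pointOfView?.camera?.projectionTransform = SCNMatrix4(projection)
}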

GPUImageView inside SKScene as SKNode material - Playing transparent video on ARKit

Posted by a 夏天 on 2019-12-11 01:58:15
Question: In project A I used a GPUImageView to display video (recorded on a green screen) with transparency, using the GPUImageChromaKeyBlendFilter and so on, and it works superbly. Another project, B, based on ARKit, shows me a plane with video in space, and it also works fine using SKVideoNode and AVPlayer. Now the question is how to combine it all into one: in space, I want to display video, but with transparency. Unfortunately, I cannot render a GPUImageView on any SpriteKit element, and then …
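
GPUImage aside, one way to get keyed video onto a SceneKit plane is to chroma-key inside SpriteKit itself. The sketch below assumes a `videoURL` and a hypothetical `chromaKeyFilter()` helper (e.g. a CIColorCube that zeroes out green, along the lines of Apple's chroma-key sample):

import SpriteKit
import SceneKit
import AVFoundation

let player = AVPlayer(url: videoURL) // `videoURL` is assumed
let videoNode = SKVideoNode(avPlayer: player)
videoNode.play()

// Key out the green screen with a Core Image filter applied in SpriteKit.
let keyed = SKEffectNode()
keyed.filter = chromaKeyFilter() // hypothetical helper, e.g. a CIColorCube
keyed.addChild(videoNode)

let skScene = SKScene(size: CGSize(width: 1280, height: 720))
skScene.backgroundColor = .clear
keyed.position = CGPoint(x: skScene.size.width / 2, y: skScene.size.height / 2)
skScene.addChild(keyed)

// The SKScene then becomes the material of an ordinary SceneKit plane.
let plane = SCNPlane(width: 0.4, height: 0.225)
plane.firstMaterial?.diffuse.contents = skScene
plane.firstMaterial?.isDoubleSided = true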

Using ARKit to capture high quality photos

Posted by 六月ゝ 毕业季﹏ on 2019-12-10 22:28:03
Question: I am interested in using ARKit's ability to track the phone's position to automatically take photos using the camera. My initial investigation led me to understand that while ARKit is using the camera, it is not possible to get high-quality images using the standard AVFoundation methods (due to the camera being in use). I understand I can use sceneView.snapshot(), but the best quality this can provide is 1080p, which isn't high enough for my application. My question is: are …
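
The excerpt ends mid-question, but one commonly suggested compromise is to read the session's own capturedImage, which is the raw camera buffer (its resolution depends on the configuration's videoFormat) rather than a snapshot of the rendered view. A sketch:

import ARKit
import CoreImage
import UIKit

func photo(from session: ARSession) -> UIImage? {
    guard let pixelBuffer = session.currentFrame?.capturedImage else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    // The buffer arrives in sensor (landscape) orientation; rotate for portrait.
    return UIImage(cgImage: cgImage, scale: 1.0, orientation: .right)
}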

Only able to detect and track up to 4 images at a time with ARKit 3.0

Posted by 这一生的挚爱 on 2019-12-10 13:27:22
Question: Using the code below, I'm only able to detect and track up to 4 images at any one time when using ARKit. ARImageTrackingConfiguration *configuration = [ARImageTrackingConfiguration new]; configuration.trackingImages = [ARReferenceImage referenceImagesInGroupNamed:@"AR Resources" bundle:nil]; configuration.maximumNumberOfTrackedImages = 100; [self.sceneView.session runWithConfiguration:configuration]; Is anyone able to confirm what I'm seeing? I need to be able to track a larger number of …
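
If continuous tracking of every image isn't strictly required, a hedged Swift alternative is world tracking with detection images: ARKit can then recognize a much larger set of images, even though only maximumNumberOfTrackedImages of them are tracked from frame to frame. A sketch, assuming `sceneView` is an ARSCNView:

import ARKit

guard let referenceImages = ARReferenceImage.referenceImages(
    inGroupNamed: "AR Resources", bundle: nil) else { fatalError() }

let configuration = ARWorldTrackingConfiguration()
configuration.detectionImages = referenceImages
configuration.maximumNumberOfTrackedImages = 4 // the ceiling observed above
sceneView.session.run(configuration)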

Add UIView (from xib) with transparency to SceneKit

Posted by 孤街浪徒 on 2019-12-10 13:19:50
Question: I'm trying to load a UIView into SceneKit with a translucent background, but currently it just fades to white as I decrease the opacity. I have a very simple UIView layout in a .xib file that I want to load into SceneKit. So far I can display the UIView in the SCNMaterial and change any text fields, images, etc. inside the view without a problem. However, I cannot give it transparency: if I change the alpha of the view, it just fades to white. Most of the code is below: if let cardView = …
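
One hedged workaround: rather than assigning the UIView itself as the material contents, render the view into a UIImage that preserves its alpha channel and use that image instead. A sketch:

import SceneKit
import UIKit

func material(from view: UIView) -> SCNMaterial {
    // UIGraphicsImageRenderer's default format is non-opaque, so the
    // view's transparent regions survive into the rendered image.
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    let image = renderer.image { context in
        view.layer.render(in: context.cgContext)
    }
    let material = SCNMaterial()
    material.diffuse.contents = image
    material.isDoubleSided = true
    material.blendMode = .alpha // composite using the image's alpha
    return material
}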

Run and Pause an ARSession in a specified period of time

Posted by |▌冷眼眸甩不掉的悲伤 on 2019-12-10 12:27:52
Question: I'm developing an ARKit / Vision iOS app with gesture recognition. My app has a simple UI containing a single UIView; there's no ARSCNView / ARSKView at all. I'm putting a sequence of captured ARFrames into a CVPixelBuffer, which I then use for VNRecognizedObjectObservation. I don't need any tracking data from the session; I just need currentFrame.capturedImage as a CVPixelBuffer. And I need to capture ARFrames at 30 fps; 60 fps is an excessive frame rate. The preferredFramesPerSecond instance property is …
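
Since there is no ARSCNView whose preferredFramesPerSecond could be set, one simple sketch is to let the session run at its native rate and skip frames in the delegate:

import ARKit

final class FrameThrottler: NSObject, ARSessionDelegate {
    private var lastProcessed: TimeInterval = 0
    private let minimumInterval: TimeInterval = 1.0 / 30.0 // target ~30 fps

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Drop frames that arrive sooner than the target interval.
        guard frame.timestamp - lastProcessed >= minimumInterval else { return }
        lastProcessed = frame.timestamp
        let pixelBuffer: CVPixelBuffer = frame.capturedImage
        // ... hand `pixelBuffer` to the Vision request here ...
        _ = pixelBuffer
    }
}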

ARKit: Plot a Node at a specific pixel at a specific Z distance from Camera

Posted by 浪尽此生 on 2019-12-10 12:07:11
Question: Referring to the image above (not included in this excerpt): I have a Red Node at the center of the screen at a distance of 1.0 unit (1 meter away) [see iPhone portrait top view]. What I do is capture a screenshot of the iPhone screen with sceneView.snapshot(), and the resulting image is 750 x 1334 pixels [see iPhone portrait front view]. What I want to do is put 4 Red Square Nodes located on the four sides of the iPhone screen, relative to the Red Circle at the dead center of the screen. I am making this to mark where I did …
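
One way to approach this, sketched under the assumption that `sceneView` is an ARSCNView: project a reference point the desired distance in front of the camera to learn its normalized depth, then unproject the desired screen points at that same depth. (projectPoint/unprojectPoint work in view points, so snapshot pixel coordinates would first be divided by the screen scale.)

import ARKit
import SceneKit
import simd

func addMarker(atScreenPoint point: CGPoint,
               metersFromCamera distance: Float,
               in sceneView: ARSCNView) {
    guard let cameraNode = sceneView.pointOfView else { return }
    // A world-space point `distance` meters straight ahead of the camera.
    let ahead = cameraNode.simdConvertPosition(
        simd_float3(0, 0, -distance), to: nil)
    // Projecting it yields the normalized depth (z) for that distance.
    let projected = sceneView.projectPoint(
        SCNVector3(ahead.x, ahead.y, ahead.z))
    // Unproject the requested screen point at that depth into world space.
    let world = sceneView.unprojectPoint(
        SCNVector3(Float(point.x), Float(point.y), projected.z))
    let marker = SCNNode(geometry: SCNSphere(radius: 0.01))
    marker.position = world
    sceneView.scene.rootNode.addChildNode(marker)
}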

Set text programmatically of an Entity in Reality Composer - iOS 13

Posted by 旧巷老猫 on 2019-12-10 11:09:16
Question: In my iOS app I want to introduce an AR part using the new Reality Composer. In my project I load a scene with this code: let arView = ARView.init(frame: frame) // Configure the AR session for horizontal plane tracking. let arConfiguration = ARWorldTrackingConfiguration() arConfiguration.planeDetection = .horizontal arView.session.run(arConfiguration) arView.session.delegate = self self.view.addSubview(arView) Experience.loadSceneAsync{ [weak self] scene, error in print("Error \(String …
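
The excerpt stops mid-load, but once the scene is available, one hedged way to change a text entity is to regenerate its mesh. "textEntity" below is an assumed name from the Reality Composer scene, and which entity actually holds the ModelComponent can vary between scenes:

import RealityKit
import UIKit

func setText(_ string: String, in sceneRoot: Entity) {
    // The text's ModelEntity may be the named entity itself or a child.
    guard let entity = sceneRoot.findEntity(named: "textEntity"),
          let model = (entity as? ModelEntity)
              ?? entity.children.compactMap({ $0 as? ModelEntity }).first
    else { return }
    model.model?.mesh = .generateText(
        string,
        extrusionDepth: 0.005,          // meters
        font: .systemFont(ofSize: 0.1), // font size is in meters here
        containerFrame: .zero,
        alignment: .center,
        lineBreakMode: .byWordWrapping)
}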