scenekit

How to render a UIView with transparent background on an SCNPlane in ARKit?

Submitted by 风格不统一 on 2020-01-02 04:16:09
Question: My UIView has a UIColor.clear background. I am instantiating the view controller from a storyboard. When I set the SCNPlane geometry's diffuse contents to the view controller's view, the transparent background appears solid white on the plane. Here is how I set it:

    let material = SCNMaterial()
    material.diffuse.contents = viewController.view
    planeGeometry.materials = [material]

I can see the view, just the background is not transparent. I saw suggestions on other Stack Overflow posts where they…
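A common workaround (a hedged sketch, not from the truncated post above) is to rasterize the view into a UIImage that keeps its alpha channel and use that image as the diffuse contents, since SceneKit composites an image's transparency reliably:

    // Sketch: snapshot the transparent view into a UIImage and use that
    // as the material's contents. `viewController` and `planeGeometry`
    // are assumed to exist as in the question.
    let format = UIGraphicsImageRendererFormat()
    format.opaque = false // preserve the alpha channel
    let renderer = UIGraphicsImageRenderer(bounds: viewController.view.bounds, format: format)
    let image = renderer.image { _ in
        viewController.view.drawHierarchy(in: viewController.view.bounds, afterScreenUpdates: true)
    }

    let material = SCNMaterial()
    material.diffuse.contents = image
    material.blendMode = .alpha // respect the image's transparency
    planeGeometry.materials = [material]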

Adding a spark particle sprite inside a view controller

Submitted by 微笑、不失礼 on 2020-01-01 18:24:30
Question: I created an .sks particle emitter based on the spark template. My app is a normal app (not a game). When a user taps a button, I present a new view controller modally over full screen so that I can blur the background. In this modal I created a view and gave it a class of SCNView. How can I load the particle .sks file to play the animation on that view controller's particles view? Update: How do I load a SceneKit particle system in a view controller? Answer 1: As mentioned…
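One straightforward approach (a sketch; it assumes the particles view can be an SKView rather than an SCNView, since .sks emitters are SpriteKit objects, and that the emitter file is named "Spark"):

    import SpriteKit
    import UIKit

    final class ParticleViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            // Host the SpriteKit emitter in a transparent SKView so the
            // blurred background behind the modal stays visible.
            let skView = SKView(frame: view.bounds)
            skView.backgroundColor = .clear
            skView.allowsTransparency = true
            view.addSubview(skView)

            let scene = SKScene(size: skView.bounds.size)
            scene.backgroundColor = .clear
            if let emitter = SKEmitterNode(fileNamed: "Spark") {
                emitter.position = CGPoint(x: scene.size.width / 2, y: scene.size.height / 2)
                scene.addChild(emitter)
            }
            skView.presentScene(scene)
        }
    }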

Changing the volume of an SCNAudioPlayer in real time - Swift

Submitted by 别说谁变了你拦得住时间么 on 2020-01-01 14:49:13
Question: I am trying to work out how I can change the volume of an SCNAudioPlayer in real time. Currently, I have an SCNAudioSource connected to an SCNAudioPlayer. This audio player is then assigned to an SCNNode so that my sound makes use of SceneKit's spatial audio processing. As it stands, I am able to change the volume of each SCNNode using SCNAudioSource.volume, triggered by the boolean variable vol. An extract of my code for this is shown below: (audioSource, audioCount) = soundFileSelect…
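SCNAudioSource.volume is only read when the source is loaded, so it doesn't help mid-playback. A known workaround (a sketch, not from the truncated post) is to reach the player's underlying AVAudioNode, which conforms to AVAudioMixing and exposes a live volume:

    import SceneKit
    import AVFoundation

    // Sketch: `node` is assumed to be the SCNNode the SCNAudioPlayer
    // was added to, as in the question.
    if let player = node.audioPlayers.first,
       let mixer = player.audioNode as? AVAudioMixing {
        mixer.volume = 0.25 // 0.0 ... 1.0, takes effect immediately
    }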

Metal SCNProgram - can't render a SpriteKit scene that has video content

Submitted by 不问归期 on 2020-01-01 12:08:10
Question: I'm (desperately) trying to use a video as a texture in an SCNScene with some fancy shader modifiers. I'd like to use an SCNProgram for that part. I've just taken the one from here:

    #include <metal_stdlib>
    using namespace metal;
    #include <SceneKit/scn_metal>

    struct MyNodeBuffer {
        float4x4 modelTransform;
        float4x4 modelViewTransform;
        float4x4 normalTransform;
        float4x4 modelViewProjectionTransform;
    };

    typedef struct {
        float3 position [[ attribute(SCNVertexSemanticPosition) ]];
        float2 texCoords [[…
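A note on the likely snag (hedged, not from the truncated post): a complete SCNProgram replaces SceneKit's built-in shading, including the machinery that keeps an SKScene or AVPlayer texture updated, so the video renders blank. One workaround is to keep the default program and express the effect as a shader modifier instead, letting SceneKit manage the video texture; names and the URL below are placeholders:

    import SceneKit
    import SpriteKit
    import AVFoundation

    // Sketch: play a video through a SpriteKit scene used as the diffuse
    // contents, and apply the custom effect as a fragment shader modifier.
    let player = AVPlayer(url: URL(fileURLWithPath: "/path/to/video.mp4"))
    let videoNode = SKVideoNode(avPlayer: player)
    let skScene = SKScene(size: CGSize(width: 1024, height: 576))
    videoNode.position = CGPoint(x: skScene.size.width / 2, y: skScene.size.height / 2)
    videoNode.size = skScene.size
    skScene.addChild(videoNode)

    let material = SCNMaterial()
    material.diffuse.contents = skScene
    material.shaderModifiers = [
        .fragment: "_output.color.rgb *= vec3(1.0, 0.9, 0.8);" // stand-in effect
    ]
    player.play()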

Render off-screen SCNScene into UIImage

Submitted by 廉价感情. on 2020-01-01 10:12:29
Question: How can I render an off-screen SCNScene into a UIImage? I know that SCNView provides a -snapshot method, but unfortunately that doesn't work for off-screen views. A similar question has been asked before, where one of the answers suggests reading the bitmap data from OpenGL using glReadPixels, but that approach doesn't work for me with an off-screen scene. I tried rendering into the context of a GLKView using SCNRenderer without success. Answer 1: Swift 4 with SCNRenderer: You can use…
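The direction the answer takes is SCNRenderer's own snapshot API (iOS 11+/macOS 10.13+), which renders entirely off-screen. A minimal sketch, with the size and the scene as placeholders:

    import SceneKit
    import Metal

    let renderer = SCNRenderer(device: MTLCreateSystemDefaultDevice(), options: nil)
    renderer.scene = scene // the off-screen SCNScene to render
    let image = renderer.snapshot(atTime: 0,
                                  with: CGSize(width: 1024, height: 768),
                                  antialiasingMode: .multisampling4X)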

Swift Scenekit - Centering SCNText - the getBoundingBoxMin:Max issue

Submitted by 左心房为你撑大大i on 2020-01-01 10:06:16
Question: Having fun with the alignmentMode option on SCNText. Been googling around, and it looks like there is a problem with alignmentMode and containerFrame. The alternatives I've found suggest using the get-bounding-box function to find the text size and then manually adjusting accordingly. Great, except I can't get the function to work. When I try to get the two vectors I get an error: 'SCNVector3' is not convertible to 'UnsafeMutablePointer<SCNVector3>'. I get that both on the geometry and the node.
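The error comes from calling the Objective-C getBoundingBoxMin:max: signature, which takes pointers; in Swift the same data is exposed as the boundingBox property, a (min, max) tuple. A sketch of centering via the node's pivot (names assumed):

    import SceneKit

    let textGeometry = SCNText(string: "Hello", extrusionDepth: 1)
    let textNode = SCNNode(geometry: textGeometry)
    // boundingBox returns the two corner vectors directly, no pointers needed.
    let (minVec, maxVec) = textNode.boundingBox
    textNode.pivot = SCNMatrix4MakeTranslation(
        (maxVec.x + minVec.x) / 2,
        (maxVec.y + minVec.y) / 2,
        0
    )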

Simulating refraction in SceneKit

Submitted by 不想你离开。 on 2020-01-01 09:38:07
Question: I am trying to create an iOS 9 app for a project which will visualise 3D scenes with a special kind of theoretical lens called a glens. A ray-tracing program called TIM has already been written from the ground up for simulating these glenses and more, but it's infeasible to simply port it to iOS. My understanding from searching the site (i.e. this answer and many others on shaders) is that it should be possible, but I'm having a hard time getting the desired effect. I decided that I…
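For orientation, SceneKit's usual hook for this kind of per-pixel effect is a fragment shader modifier on the material. A minimal attachment sketch (the GLSL-style body below just tints the output; it is a placeholder, not TIM's glens math):

    import SceneKit

    let material = SCNMaterial()
    material.shaderModifiers = [
        .fragment: "_output.color.rgb = mix(_output.color.rgb, vec3(0.0, 0.4, 1.0), 0.2);"
    ]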

ARKit project point with previous device position

Submitted by 荒凉一梦 on 2020-01-01 07:22:22
Question: I'm combining ARKit with a CNN to constantly update ARKit nodes when they drift. So:

1. Get an estimate of the node position with ARKit and place a virtual object in the world
2. Use the CNN to get its estimated 2D location of the object
3. Update the node position accordingly (to refine its location in 3D space)

The problem is that #2 takes 0.3 s or so. Therefore I can't use sceneView.unprojectPoint, because the point will correspond to a 3D point from the device's world position from #1. How do I calculate the 3D…
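One known approach (a sketch, not from the truncated post): store the ARCamera from the frame that was handed to the CNN, then unproject against that stored camera instead of the current one. ARCamera.unprojectPoint(_:ontoPlane:orientation:viewportSize:) (iOS 12+) does exactly this against an arbitrary plane; the horizontal plane at y = 0 below is an assumption:

    import ARKit

    var capturedCamera: ARCamera?

    // Call when the frame is captured for the CNN (step #2's input).
    func captureForCNN(frame: ARFrame) {
        capturedCamera = frame.camera // snapshot of pose + intrinsics
    }

    // Call ~0.3 s later with the CNN's 2D point.
    func worldPoint(for cnnPoint: CGPoint, viewportSize: CGSize) -> simd_float3? {
        guard let camera = capturedCamera else { return nil }
        let plane = matrix_identity_float4x4 // xz-plane at y = 0, assumed
        return camera.unprojectPoint(cnnPoint,
                                     ontoPlane: plane,
                                     orientation: .portrait,
                                     viewportSize: viewportSize)
    }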

SceneKit - Get the rendered scene from a SCNView as a MTLTexture without using a separate SCNRenderer

Submitted by 左心房为你撑大大i on 2020-01-01 05:35:09
Question: My SCNView is using Metal as the rendering API and I would like to know if there's a way to grab the rendered scene as an MTLTexture without having to use a separate SCNRenderer. Performance drops when I'm trying to both display the scene via the SCNView and re-render the scene off-screen to an MTLTexture via an SCNRenderer (I'm trying to grab the output every frame). SCNView gives me access to the MTLDevice, MTLRenderCommandEncoder, and MTLCommandQueue that it uses, but not to the…
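On newer systems there is a direct hook (a sketch, assuming iOS 13+, where SCNSceneRenderer gained currentRenderPassDescriptor): read the color attachment's texture from the render-loop delegate after the frame is drawn, with no second renderer:

    import SceneKit

    final class FrameGrabber: NSObject, SCNSceneRendererDelegate {
        func renderer(_ renderer: SCNSceneRenderer,
                      didRenderScene scene: SCNScene,
                      atTime time: TimeInterval) {
            guard let texture = renderer.currentRenderPassDescriptor.colorAttachments[0].texture
            else { return }
            // `texture` is the MTLTexture the frame was just rendered into.
            // It may be framebuffer-only; blit it to your own texture before
            // reading it on the CPU.
        }
    }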
