metal

Efficiently copying Swift Array to memory buffer for iOS Metal

谁说我不能喝 submitted on 2019-12-20 15:43:08
Question: I am writing an iOS application using Apple's new Metal framework. I have an array of Matrix4 objects (see Ray Wenderlich's tutorial) that I need to pass in to a shader via the MTLDevice.newBufferWithLength() method. The Matrix4 object leverages Apple's GLKit (it contains a GLKMatrix4 object). I'm using instancing with the GPU calls, and will later change this to a struct which includes more data per instance (beyond just the Matrix4 object). How can I efficiently copy the array of …
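One possible approach (a sketch, not part of the original question; float4x4 stands in for the tutorial's Matrix4 and the function names are illustrative) is to copy the array's contiguous storage straight into the buffer:

import Metal
import simd

// Create a buffer initialized with the matrix data.
func makeInstanceBuffer(device: MTLDevice, matrices: [float4x4]) -> MTLBuffer? {
    guard !matrices.isEmpty else { return nil }
    let length = MemoryLayout<float4x4>.stride * matrices.count
    return matrices.withUnsafeBytes { raw in
        device.makeBuffer(bytes: raw.baseAddress!, length: length, options: [])
    }
}

// Or, to reuse a buffer allocated once with makeBuffer(length:options:),
// copy the array into its contents each frame.
func update(buffer: MTLBuffer, with matrices: [float4x4]) {
    guard !matrices.isEmpty else { return }
    matrices.withUnsafeBytes { raw in
        buffer.contents().copyMemory(from: raw.baseAddress!,
                                     byteCount: MemoryLayout<float4x4>.stride * matrices.count)
    }
}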

Screen tearing and camera capture with Metal

╄→гoц情女王★ submitted on 2019-12-20 09:40:17
Question: To avoid writing to a constant buffer from both the GPU and CPU at the same time, Apple recommends using a triple-buffered system with the help of a semaphore to prevent the CPU getting too far ahead of the GPU (this is fine and covered in at least three Metal videos by now). However, when the constant resource is an MTLTexture and the AVCaptureVideoDataOutput delegate runs separately from the rendering loop (CADisplayLink), how can a similar triple-buffered system (as used in …
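One way this is commonly handled (a sketch only; the class and method names below are illustrative, not from the question) is to give the capture delegate its own small ring of writable textures, guarded by a semaphore that the render loop signals from the command buffer's completion handler:

import Metal
import Foundation

final class CameraTextureRing {
    private let inFlight = DispatchSemaphore(value: 3)   // three textures in flight
    private let lock = NSLock()
    private var textures: [MTLTexture] = []
    private var writeIndex = 0
    private var latest: MTLTexture?

    init(device: MTLDevice, descriptor: MTLTextureDescriptor) {
        textures = (0..<3).compactMap { _ in device.makeTexture(descriptor: descriptor) }
    }

    // Capture side (AVCaptureVideoDataOutput delegate queue): claim a slot,
    // fill it from the pixel buffer, then publish it.
    func dequeueWritableTexture() -> MTLTexture {
        inFlight.wait()                       // blocks if the GPU still owns every slot
        defer { writeIndex = (writeIndex + 1) % textures.count }
        return textures[writeIndex]
    }

    func publish(_ texture: MTLTexture) {
        lock.lock(); latest = texture; lock.unlock()
    }

    // Render side (CADisplayLink): read the most recently published frame, and
    // call finished() from the command buffer's addCompletedHandler.
    func latestTexture() -> MTLTexture? {
        lock.lock(); defer { lock.unlock() }
        return latest
    }

    func finished() { inFlight.signal() }
}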

how to display MTKView with rgba16Float MTLPixelFormat

可紊 submitted on 2019-12-20 04:54:25
Question: I have an MTKView set to use MTLPixelFormat.rgba16Float. I'm having display issues which are best described by the graphic in the original question: the intended UIColor becomes washed out, but only while it is being displayed in the MTKView. When I convert the drawable texture back to an image for display in a UIView via CIImage, I get back the original color. Here is how I create that output: let colorSpace = CGColorSpaceCreateDeviceRGB() let kciOptions = [kCIImageColorSpace: colorSpace, …
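A sketch of one common remedy, assuming the wash-out comes from sRGB-encoded component values being written into a linear rgba16Float drawable (the helper names are illustrative and this is only one possible cause):

import MetalKit
import QuartzCore
import Foundation

// Tag the drawable so its half-float values are interpreted as linear sRGB.
func configure(_ view: MTKView) {
    view.colorPixelFormat = .rgba16Float
    (view.layer as? CAMetalLayer)?.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearSRGB)
}

// Convert an sRGB-encoded component to linear before writing it into the texture.
func srgbToLinear(_ c: Float) -> Float {
    return c <= 0.04045 ? c / 12.92 : powf((c + 0.055) / 1.055, 2.4)
}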

Scenekit SCNShadable accessing _surface.diffuseTexcoord paints the object white

我的未来我决定 submitted on 2019-12-20 04:12:30
Question: I'm experiencing odd behaviour when trying to extend a PBR material from SceneKit. All I want to do is read a texture and map it using the first UV channel (the same as the normal map). As soon as I mention _surface.diffuseTexcoord, _surface.diffuse seems to turn white. It doesn't seem to be a constant ( _output.rgb = vec3(1.) ) but rather the color white is passed through the lighting pipeline. let myShader = "#pragma arguments\n" + "sampler uMaskTex;\n" + "uniform sampler2D uMaskTex;\n" + "#pragma body\n" + …
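For reference, a sketch of how a texture is usually declared and sampled in a Metal-style surface shader modifier. Only uMaskTex comes from the question; the material setup and the mask image name are assumptions:

import SceneKit
import UIKit

let material = SCNMaterial()
material.lightingModel = .physicallyBased

let myShader = """
#pragma arguments
texture2d<float, access::sample> uMaskTex;

#pragma body
constexpr sampler maskSampler(coord::normalized, filter::linear, address::repeat);
float4 mask = uMaskTex.sample(maskSampler, _surface.diffuseTexcoord);
_surface.diffuse.rgb *= mask.rgb;
"""

// The texture is bound by key-value coding with an SCNMaterialProperty whose
// key matches the argument name declared above.
material.shaderModifiers = [.surface: myShader]
let maskProperty = SCNMaterialProperty(contents: UIImage(named: "mask")!)
material.setValue(maskProperty, forKey: "uMaskTex")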

How to convert an MTLTexture to a CVPixelBuffer to write into an AVAssetWriter?

∥☆過路亽.° submitted on 2019-12-19 09:25:07
Question: I have a requirement to apply filters to live video and I'm trying to do it in Metal, but I have encountered a problem with converting the MTLTexture into a CVPixelBuffer after encoding the filter into the destination texture. Reference (https://github.com/oklyc/MetalCameraSample-master-2). Here is my code. if let pixelBuffer = pixelBuffer { CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags.init(rawValue: 0)) let region = MTLRegionMake2D(0, 0, Int(currentDrawable.layer.drawableSize …
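A sketch of the copy step many pipelines use, assuming a .bgra8Unorm texture and a kCVPixelFormatType_32BGRA pixel buffer (the function and parameter names are illustrative):

import Metal
import CoreVideo

func copy(texture: MTLTexture, into pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let region = MTLRegionMake2D(0, 0, texture.width, texture.height)
    // Read the rendered pixels back from the GPU texture into the CPU-side buffer.
    texture.getBytes(CVPixelBufferGetBaseAddress(pixelBuffer)!,
                     bytesPerRow: bytesPerRow,
                     from: region,
                     mipmapLevel: 0)
}

Note that the command buffer that rendered into the texture has to be finished (waitUntilCompleted or a completion handler) before getBytes is called, otherwise the pixel buffer picks up stale data.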

How to move a rotated SCNNode in SceneKit?

♀尐吖头ヾ submitted on 2019-12-19 06:19:17
Question: The image in the original question shows a rotated box that should be moved horizontally on the X and Z axes; Y should stay unaffected to simplify the scenario. The box could also be the SCNNode of the camera, so I guess a projection does not make sense at this point. So let's say we want to move the box in the direction of the red arrow. How can this be achieved using SceneKit? The red arrow indicates the -Z direction of the box. It also shows us it is not parallel to the camera's projection or to the global axes that …
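A sketch of one way to do this with the node's world-space front vector (node and distance are illustrative placeholders):

import SceneKit
import simd

// Move the node along its own -Z axis while keeping Y fixed.
func slide(_ node: SCNNode, by distance: Float) {
    var direction = node.simdWorldFront          // the node's local -Z in world space
    direction.y = 0                              // keep the movement horizontal
    let length = simd_length(direction)
    guard length > 0 else { return }             // node is pointing straight up or down
    node.simdWorldPosition += (direction / length) * distance
}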

Creating a custom SCNGeometry polygon plane with SCNGeometryPrimitiveType polygon crash/error

ε祈祈猫儿з submitted on 2019-12-18 13:39:10
Question: I'm trying to create a custom SCNGeometry in the form of a plane with a custom shape, which could be placed in an ARKit session. I'm using the option SCNGeometryPrimitiveTypePolygon in the following method, which seems to work fine: extension SCNGeometry { static func polygonPlane(vertices: [SCNVector3]) -> SCNGeometry { var indices: [Int32] = [Int32(vertices.count)] var index: Int32 = 0 for _ in vertices { indices.append(index) index += 1 } let vertexSource = SCNGeometrySource(vertices: …
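For context, a sketch of how the rest of such a helper typically looks; the index layout for .polygon is the polygon's vertex count first, followed by one index per vertex, and everything beyond the question's own lines is an assumption:

import SceneKit

extension SCNGeometry {
    static func polygonPlane(vertices: [SCNVector3]) -> SCNGeometry {
        // First element: number of vertices in the (single) polygon,
        // followed by the index of each vertex in outline order.
        var indices: [Int32] = [Int32(vertices.count)]
        indices.append(contentsOf: 0..<Int32(vertices.count))

        let vertexSource = SCNGeometrySource(vertices: vertices)
        let indexData = Data(bytes: indices, count: indices.count * MemoryLayout<Int32>.size)
        let element = SCNGeometryElement(data: indexData,
                                         primitiveType: .polygon,
                                         primitiveCount: 1,
                                         bytesPerIndex: MemoryLayout<Int32>.size)
        return SCNGeometry(sources: [vertexSource], elements: [element])
    }
}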

Metal Shader with SceneKit SCNProgram

生来就可爱ヽ(ⅴ<●) submitted on 2019-12-18 12:00:52
Question: I'm looking for just a working Metal shader that works in SceneKit with SCNProgram. Can someone show me the correct method declarations and how to hook this up? let program = SCNProgram() program.vertexFunctionName = "myVertex" program.fragmentFunctionName = "myFragment" material.program = program and then the shader: //MyShader.metal vertex something myVertex(something) { return something; } fragment float4 myFragment(something) { return something } I'm just looking for the most basic example …
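For reference, a minimal Metal counterpart that pairs with that Swift setup (a sketch; the struct and field names beyond myVertex/myFragment follow SceneKit's conventions but are not from the question). It would live in MyShader.metal so the functions end up in the app's default library, where SCNProgram looks them up by name:

// MyShader.metal
#include <metal_stdlib>
using namespace metal;
#include <SceneKit/scn_metal>

// SceneKit fills this by field name for the parameter named scn_node.
struct NodeBuffer {
    float4x4 modelViewProjectionTransform;
};

struct VertexIn {
    float3 position [[attribute(SCNVertexSemanticPosition)]];
};

struct VertexOut {
    float4 position [[position]];
};

vertex VertexOut myVertex(VertexIn in [[stage_in]],
                          constant SCNSceneBuffer& scn_frame [[buffer(0)]],
                          constant NodeBuffer& scn_node [[buffer(1)]])
{
    VertexOut out;
    out.position = scn_node.modelViewProjectionTransform * float4(in.position, 1.0);
    return out;
}

fragment float4 myFragment(VertexOut in [[stage_in]])
{
    return float4(1.0, 0.0, 0.0, 1.0);   // solid red
}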
