Metal

Convert OpenGL shader to Metal (Swift) to be used in CIFilter

Submitted by 偶尔善良 on 2019-12-11 07:46:11
Question: I'm quite new to OpenGL / Metal and I'm trying to understand some fundamental concepts. Within our app, we use CIFilter to filter videos. I saw a WWDC video from 2017 explaining that you can wrap a Metal kernel in a CIFilter and use it as a regular filter. I'm trying to understand how to convert this OpenGL video effect to Metal so I can use it as a reference point for future effects. void mainImage(out vec4 fragColor, in vec2 fragCoord) { float amount = sin(iTime) * 0.1; // uv coords vec2 uv
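A Shadertoy-style GLSL effect like the one above can usually be ported to a Core Image Metal kernel. The following is only a sketch: the kernel name `waveDistort`, the `time` parameter (standing in for `iTime`), and the wave shape past the truncated snippet are all assumptions, not the asker's original effect.

```metal
// Hypothetical Core Image Metal kernel; names and the distortion past
// "vec2 uv" are illustrative. Compile with the -fcikernel / -cikernel flags.
#include <CoreImage/CoreImage.h>
using namespace metal;

extern "C" float4 waveDistort(coreimage::sampler src, float time,
                              coreimage::destination dest)
{
    float amount = sin(time) * 0.1;            // same time-based amplitude as the GLSL
    float2 uv = dest.coord() / src.size();     // normalized coords, like fragCoord / iResolution
    uv.x += sin(uv.y * 10.0 + time) * amount;  // assumed horizontal wobble
    return src.sample(src.transform(uv * src.size()));
}
```

On the Swift side the compiled function can then be loaded with `CIKernel(functionName:fromMetalLibraryData:)` and applied per frame via `kernel.apply(extent:roiCallback:arguments:)` inside a `CIFilter` subclass.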

How to pass non-texture data to SCNTechnique Metal shaders

Submitted by 社会主义新天地 on 2019-12-11 07:32:56
Question: I can pass a custom parameter of type sampler2D to the Metal fragment function of an SCNTechnique, and I have a working second pass: PList: <key>inputs</key> <dict> <key>imageFromPass1</key> <string>COLOR</string> <key>myCustomImage</key> <string>myCustomImage_sym</string> </dict> ... <key>symbols</key> <dict> <key>myCustomImage_sym</key> <dict> <key>type</key> <string>sampler2D</string> </dict> </dict> Relevant Obj-C code: [technique setValue: UIImagePNGRepresentation(myCustomTexture) forKey:@
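For non-texture data, SCNTechnique symbols also accept scalar and vector types, which are then bound by keyed subscripting on the technique. A sketch in Swift, where the symbol name `myScalar_sym` and the value are made up for illustration:

```swift
// Plist side (assumed): declare a plain float symbol instead of sampler2D.
// <key>symbols</key><dict>
//   <key>myScalar_sym</key><dict><key>type</key><string>float</string></dict>
// </dict>
import SceneKit

// Bind the value at runtime; SCNTechnique is key-value coding compliant
// for its declared symbols.
technique.setObject(NSNumber(value: 0.75),
                    forKeyedSubscript: "myScalar_sym" as NSCopying)
```

The same pattern works for `vec2`/`vec3`/`vec4` symbols using `NSValue`-wrapped vectors; the symbol's `type` string in the plist must match what the Metal function declares.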

Metal draw one object with texture one object without texture

Submitted by 隐身守侯 on 2019-12-11 06:44:19
Question: I want to render two different objects with Metal: one with a texture, the other without. I have two different shaders and two different vertex descriptors; does that mean I should use two different render pipelines? Only one object (the model without a texture) draws correctly on screen; the other one is wrong, and I don't know where I went wrong. Here is the code: override func buildPipeline() { //Model let library = device!.newDefaultLibrary()! let pipelineDescriptor =
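The usual answer is yes: one `MTLRenderPipelineState` per shader/vertex-descriptor combination, switched on the same encoder between draws. A minimal sketch, assuming function names like `vertex_textured` and properties such as `texturedVertexDescriptor` that are not in the original code:

```swift
import Metal

// Build one pipeline state per material variant (names are illustrative).
func makePipeline(device: MTLDevice, library: MTLLibrary,
                  vertex: String, fragment: String,
                  vertexDescriptor: MTLVertexDescriptor) throws -> MTLRenderPipelineState {
    let desc = MTLRenderPipelineDescriptor()
    desc.vertexFunction = library.makeFunction(name: vertex)
    desc.fragmentFunction = library.makeFunction(name: fragment)
    desc.vertexDescriptor = vertexDescriptor
    desc.colorAttachments[0].pixelFormat = .bgra8Unorm
    return try device.makeRenderPipelineState(descriptor: desc)
}

// At draw time, switch state on the same command encoder:
// encoder.setRenderPipelineState(texturedPipeline)
// ... draw the textured model ...
// encoder.setRenderPipelineState(untexturedPipeline)
// ... draw the untextured model ...
```

A common source of "one object renders wrong" bugs here is reusing buffer/texture indices or the wrong vertex descriptor across the two draws, so it is worth double-checking each `set...` call after the pipeline switch.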

OpenGLES blending code to Metal translation

Submitted by 给你一囗甜甜゛ on 2019-12-11 06:36:08
Question: I want to translate this simple OpenGL ES blending code to Metal: glBlendEquation(GL_FUNC_ADD); glBlendFunc(GL_ONE, GL_ONE); glEnable(GL_BLEND); I wrote the code in Metal but am not sure whether it does exactly the same job. Specifically, do I need to specify alpha blending factors or not? I ask because this code performs worse in Metal than in OpenGL ES, which is strange. Please let me know if anything is missing in this code. let renderPipelineDescriptorGreen = MTLRenderPipelineDescriptor()
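In Metal, blending is baked into the pipeline state rather than being dynamic state, and the RGB and alpha channels are configured separately. A direct equivalent of `GL_FUNC_ADD` with `glBlendFunc(GL_ONE, GL_ONE)` would look like this (the descriptor name is illustrative):

```swift
import Metal

let desc = MTLRenderPipelineDescriptor()
let attachment = desc.colorAttachments[0]

// glEnable(GL_BLEND)
attachment.isBlendingEnabled = true
// glBlendEquation(GL_FUNC_ADD)
attachment.rgbBlendOperation = .add
attachment.alphaBlendOperation = .add
// glBlendFunc(GL_ONE, GL_ONE) applies to both RGB and alpha in GL,
// so set both pairs explicitly in Metal:
attachment.sourceRGBBlendFactor = .one
attachment.destinationRGBBlendFactor = .one
attachment.sourceAlphaBlendFactor = .one
attachment.destinationAlphaBlendFactor = .one
```

So yes, the alpha factors should be set too; `glBlendFunc` covers both channels, and leaving Metal's alpha factors at their defaults would not match the GL behavior.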

Setting up Metal in Swift 3 on an iPhone 6s

Submitted by 北城以北 on 2019-12-11 05:41:16
Question: I've been trying to convert Apple's MetalBasicTessellation project to Swift 3 on an iPhone 6s running iOS 10.3.1. Everything compiles with no errors, but when running on my iPhone I get the following error when I define renderCommandEncoder: validateAttachmentOnDevice:347: failed assertion `MTLRenderPassDescriptor texture must be MTLTextureType2DMultisample when using a resolveTexture.' I have properly set the renderPassDescriptor's texture property to inherit MTKView's currentDrawable
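The assertion means that when a `resolveTexture` is set, the color attachment's `texture` must itself be a multisample texture; the drawable's texture is single-sample and belongs in the resolve slot. A sketch of the expected arrangement, where `msaaTexture` and the sizes are assumptions:

```swift
import Metal
import MetalKit

// The attachment texture must be .type2DMultisample when resolving.
let msaaDesc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm,
                                                        width: 1024, height: 768,
                                                        mipmapped: false)
msaaDesc.textureType = .type2DMultisample
msaaDesc.sampleCount = 4            // must match the pipeline's sampleCount
msaaDesc.usage = .renderTarget
let msaaTexture = device.makeTexture(descriptor: msaaDesc)

let rpd = MTLRenderPassDescriptor()
rpd.colorAttachments[0].texture = msaaTexture                         // multisample
rpd.colorAttachments[0].resolveTexture = view.currentDrawable?.texture // single-sample
rpd.colorAttachments[0].storeAction = .multisampleResolve
```

With an `MTKView`, setting the view's `sampleCount` to 4 and using its own `currentRenderPassDescriptor` achieves the same thing without allocating the multisample texture manually.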

Scenekit camera orbit around object

Submitted by 北城余情 on 2019-12-11 05:33:39
Question: Hello everyone, I'm coming back to you about my current problem. I already asked a question about it, but no one managed to help me, so I will explain the full problem and how I tried to fix it (I tried several things). I need to write a library that adds functions for managing cameras and objects in a 3D world. For that we chose the SceneKit framework, which uses Metal. I will post very simplified code, but everything necessary is here. To illustrate my point, here is a GIF
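A common SceneKit pattern for orbiting a camera around an object is to parent the camera to an empty "orbit" node placed at the target and rotate that parent; the camera keeps a fixed offset and therefore circles the target. A minimal sketch (node names and the gesture math are assumptions, not the asker's code):

```swift
import SceneKit

// Pivot node sits at the object we want to orbit.
let orbitNode = SCNNode()
orbitNode.position = targetNode.position

// Camera is offset along +Z, so rotating the pivot swings it around the target.
let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()
cameraNode.position = SCNVector3(0, 0, 5)   // orbit radius
orbitNode.addChildNode(cameraNode)
scene.rootNode.addChildNode(orbitNode)

// In a pan-gesture handler, translate finger movement into pivot rotation:
// orbitNode.eulerAngles.y -= Float(translation.x) * 0.01
// orbitNode.eulerAngles.x -= Float(translation.y) * 0.01
```

Clamping `eulerAngles.x` to just under ±π/2 avoids flipping over the poles, and an `SCNLookAtConstraint` on the camera node is an alternative when the target moves.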

How to compile two versions of metal files

Submitted by 六月ゝ 毕业季﹏ on 2019-12-11 05:32:14
Question: I want to support both 10.13 and 10.14, but I want fast math on 10.14. I can only compile the project if I force #define __CIKERNEL_METAL_VERSION__ 200, but that means it will crash on 10.13. How do I configure the project so it creates two Metal libraries? So far the result file is default.metallib (compiling using Xcode). BOOL supportsMetal; #if TARGET_OS_IOS supportsMetal = MTLCreateSystemDefaultDevice() != nil; //this forces the GPU on a MacBook to switch immediately #else
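One way to get two libraries is to bypass Xcode's single `default.metallib` and invoke the Metal toolchain twice in a run-script build phase, once per language version, then pick the right file at runtime. The flags and output names below are an illustrative sketch, not a verified recipe; the exact `-std` values and any Core Image kernel flags should be checked against the toolchain in use:

```shell
# Hypothetical build phase: compile the same source twice into two libraries.
xcrun metal -std=macos-metal1.2 -c Kernels.metal -o Kernels12.air
xcrun metallib Kernels12.air -o default_10_13.metallib

xcrun metal -std=macos-metal2.0 -D__CIKERNEL_METAL_VERSION__=200 \
    -c Kernels.metal -o Kernels20.air
xcrun metallib Kernels20.air -o default_10_14.metallib
```

At runtime, an availability check (`if (@available(macOS 10.14, *))`) chooses which `.metallib` to load via `-[MTLDevice newLibraryWithFile:error:]`, so the 10.13 code path never touches the Metal-2.0 library.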

Crop CMSampleBuffer and process it without converting to CGImage

Submitted by 痞子三分冷 on 2019-12-11 05:13:27
Question: I have been following Apple's live-stream camera editor sample code to get a handle on live video editing. So far so good, but I need a way to crop a sample buffer into four pieces and then process all four with different CIFilters. For instance, if the size of the image is 1000x1000, I want to crop the CMSampleBuffer into four images of size 250x250, apply a unique filter to each, convert the result back to a CMSampleBuffer, and display it on a Metal view. Here is the code up to which I could crop the
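Cropping can stay entirely in Core Image, with no CGImage round trip: wrap the buffer's pixel data in a `CIImage`, crop to each quadrant rect, filter, and render straight back into a pixel buffer. A sketch, where the filter choice, the 500x500 geometry, and `ciContext`/`outputPixelBuffer` are assumptions:

```swift
import CoreImage
import CoreMedia

guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
let full = CIImage(cvPixelBuffer: pixelBuffer)

// One quadrant of an assumed 500x500 source; repeat for the other three rects.
let topLeft = full.cropped(to: CGRect(x: 0, y: 250, width: 250, height: 250))

// Apply a per-quadrant filter (CISepiaTone is just a placeholder).
let filtered = topLeft.applyingFilter("CISepiaTone")

// Render directly into a CVPixelBuffer backing the output sample buffer.
ciContext.render(filtered, to: outputPixelBuffer)
```

Note that `cropped(to:)` preserves the image's extent origin, so when compositing the four filtered quadrants back together each piece already sits at its original position.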

How to add a CIFilter to MTLTexture Using ARMatteGenerator?

Submitted by ∥☆過路亽.° on 2019-12-11 04:46:43
Question: I am working off of Apple's sample project that uses the ARMatteGenerator to generate an MTLTexture that can be used as an occlusion matte in the people-occlusion technology. I would like to determine how I could run the generated matte through a CIFilter. In my code, I am "filtering" the matte like so: func updateMatteTextures(commandBuffer: MTLCommandBuffer) { guard let currentFrame = session.currentFrame else { return } var targetImage: CIImage? alphaTexture = matteGenerator
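The Core Image / Metal interop path here is to wrap the matte in a `CIImage`, apply the filter, and render the result back into a writable `MTLTexture` on the same command buffer so it stays in sync with the frame. A sketch, assuming a preallocated `filteredMatteTexture` and a `ciContext` created with the same `MTLDevice` (both names are hypothetical):

```swift
import CoreImage
import Metal

guard let matte = alphaTexture,
      let input = CIImage(mtlTexture: matte, options: nil) else { return }

// Any CIFilter works here; a blur is just an example.
let filtered = input.applyingFilter("CIGaussianBlur",
                                    parameters: [kCIInputRadiusKey: 4.0])

// Encode the Core Image work onto the existing command buffer so it runs
// before the draws that sample the matte.
ciContext.render(filtered,
                 to: filteredMatteTexture,     // writable MTLTexture you allocate
                 commandBuffer: commandBuffer,
                 bounds: input.extent,
                 colorSpace: CGColorSpaceCreateDeviceRGB())
```

The destination texture needs `.shaderWrite` in its usage flags, and the render pass that consumes the matte should then bind `filteredMatteTexture` instead of the raw generator output.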

metal shading language - change buffer size

Submitted by 99封情书 on 2019-12-11 04:19:36
Question: Is it possible to change the buffer size at runtime? We allocate the buffer when we set up our device: device = MTLCreateSystemDefaultDevice() queue = device!.makeCommandQueue() do { let library = device!.newDefaultLibrary()! let kernel = library.makeFunction(name: "compute")! cps = try device!.makeComputePipelineState(function: kernel) } catch let e { Swift.print("\(e)") } paramBuffer = device!.makeBuffer(length: MemoryLayout<Float>.size*2, options: []) then we update it
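An `MTLBuffer`'s length is fixed at creation, so "resizing" means allocating a new buffer when the needed size changes (and keeping the old one alive until any in-flight command buffers that reference it complete). A small sketch along those lines; the helper name and the element count are illustrative:

```swift
import Metal

// Allocate a fresh buffer sized for the current parameter count.
func resizedParamBuffer(_ device: MTLDevice, elementCount: Int) -> MTLBuffer? {
    // .stride accounts for alignment padding, unlike .size.
    return device.makeBuffer(length: MemoryLayout<Float>.stride * elementCount,
                             options: [])
}

// e.g. grow from the original 2 floats when parameters change:
paramBuffer = resizedParamBuffer(device!, elementCount: 1024)
```

A common alternative is to over-allocate once (the maximum size the kernel might need) and pass the active element count as a separate small uniform, which avoids reallocating every frame.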