metalkit

Blend Mode in Metal

别等时光非礼了梦想. Submitted on 2021-02-20 06:30:24
Question: These are the two blend modes I used in OpenGL. What is the equivalent in Metal on iOS? glEnable(GL_BLEND); glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA); glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE, GL_SRC_ALPHA, GL_ONE); Answer 1: You configure blending on your render pipeline descriptor. I believe the equivalent configurations for your GL code are: // glEnable(GL_BLEND) renderPipelineDescriptor.colorAttachments[0].isBlendingEnabled = true //
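The answer's excerpt is cut off; the full mapping of the first `glBlendFuncSeparate` call onto `MTLRenderPipelineDescriptor` looks roughly like the sketch below. This is a configuration fragment, assuming `renderPipelineDescriptor` is the `MTLRenderPipelineDescriptor` being built for color attachment 0:

```swift
import Metal

let renderPipelineDescriptor = MTLRenderPipelineDescriptor()

// glEnable(GL_BLEND)
renderPipelineDescriptor.colorAttachments[0].isBlendingEnabled = true

// glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
//                     GL_ONE,       GL_ONE_MINUS_SRC_ALPHA)
renderPipelineDescriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
renderPipelineDescriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
renderPipelineDescriptor.colorAttachments[0].sourceAlphaBlendFactor = .one
renderPipelineDescriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha

// The second mode, glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE, GL_SRC_ALPHA, GL_ONE),
// would instead set:
//   sourceRGBBlendFactor = .sourceAlpha,   destinationRGBBlendFactor = .one
//   sourceAlphaBlendFactor = .sourceAlpha, destinationAlphaBlendFactor = .one
```

Note that in Metal the blend function is baked into the pipeline state, so switching between the two modes means building two pipeline states rather than calling a state-setting function per draw.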

Very slow framerate with AVFoundation and Metal in macOS

送分小仙女 Submitted on 2021-02-17 05:56:47
Question: I'm trying to adapt Apple's AVCamFilter sample to macOS. The filtering appears to work, but rendering the processed image through Metal gives me a framerate of several seconds per frame. I've tried different approaches, but have been stuck for a long time. This is the project AVCamFilterMacOS - can anyone with better knowledge of AVFoundation with Metal tell me what's wrong? I've been reading the documentation and practicing getting the unprocessed image to display, as well as rendering other

Is drawing to an MTKView or CAMetalLayer required to take place on the main thread?

笑着哭i Submitted on 2021-02-07 04:17:01
Question: It's well known that updating the user interface in AppKit or UIKit must take place on the main thread. Does Metal have the same requirement when it comes to presenting a drawable? In a layer-hosted NSView that I've been playing around with, I've noticed that I can call [CAMetalLayer nextDrawable] from a dispatch_queue that is not the main_queue. I can then update that drawable's texture as usual and present it. This appears to work properly, but I find that rather suspicious.
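For reference, the pattern being described looks roughly like the sketch below (the queue name and the encoding step are placeholders, not from the original post). Apple's documentation permits acquiring and presenting drawables off the main thread, provided access to the layer is serialized; only AppKit/UIKit-level work on the hosting view must stay on the main thread:

```swift
import Metal
import QuartzCore

// Hypothetical serial queue dedicated to rendering, so drawable
// access is never concurrent.
let renderQueue = DispatchQueue(label: "com.example.render")

func draw(layer: CAMetalLayer, commandQueue: MTLCommandQueue) {
    renderQueue.async {
        // nextDrawable() may be called off the main queue; it blocks
        // until one of the layer's drawables becomes available.
        guard let drawable = layer.nextDrawable(),
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }
        // ... encode rendering into drawable.texture here ...
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}
```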

Why is the triangle being rendered with rough edges and not smooth edges? Metal, Swift, Xcode

爱⌒轻易说出口 Submitted on 2021-01-29 12:19:53
Question: I am using this code to render the "Hello Triangle" triangle. On my iPhone, though, the triangle has very rough edges, not smooth edges like in the example. import UIKit import Metal import MetalKit import simd class MBEMetalView: UIView { // // // // // MAIN // // // // // var metalDevice: MTLDevice! = nil var metalLayer: CAMetalLayer! = nil var commandQueue: MTLCommandQueue! = nil var vertexBuffer: MTLBuffer! = nil var pipelineState: MTLRenderPipelineState! = nil var displayLink:
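A common cause of jagged edges in exactly this kind of CAMetalLayer-backed view (not confirmable from the truncated excerpt) is that the layer's drawable is left at point resolution, so the triangle is rasterized at 1x and upscaled on a Retina display. A hedged sketch of the usual fix, assuming it lives inside the `MBEMetalView` class above and `metalLayer` is its CAMetalLayer:

```swift
import UIKit

// Inside MBEMetalView:
override func didMoveToWindow() {
    super.didMoveToWindow()
    // Match the layer's backing store to the screen's native scale;
    // otherwise rendering happens at 1x and edges look rough.
    let scale = window?.screen.nativeScale ?? UIScreen.main.scale
    metalLayer.contentsScale = scale
    metalLayer.drawableSize = CGSize(width: bounds.width * scale,
                                     height: bounds.height * scale)
}
```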

Resulting MTLTexture lighter than CGImage

若如初见. Submitted on 2021-01-27 11:41:52
Question: I have a kernel function which must convert the Y and CbCr textures created from a pixel buffer (ARFrame.capturedImage) to an RGB texture, as in the Apple guide https://developer.apple.com/documentation/arkit/displaying_an_ar_experience_with_metal But I get an overly bright texture. kernel void renderTexture(texture2d<float, access::sample> capturedImageTextureY [[ texture(0) ]], texture2d<float, access::sample> capturedImageTextureCbCr [[ texture(1) ]], texture2d<float, access::read_write> outTextue [[texture(2)]],
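For reference, the linked Apple ARKit sample uses the full-range BT.601 conversion below; when the result looks too light, a frequent culprit is a pixel-format mismatch (for example writing linear values into, or sampling from, an sRGB texture) rather than the matrix itself. The matrix as it appears in the sample's shader:

```metal
// Full-range BT.601 YCbCr -> RGB, as in Apple's ARKit Metal sample.
// Arguments are columns (float4x4 is column-major).
const float4x4 ycbcrToRGBTransform = float4x4(
    float4(+1.0000f, +1.0000f, +1.0000f, +0.0000f),
    float4(+0.0000f, -0.3441f, +1.7720f, +0.0000f),
    float4(+1.4020f, -0.7141f, +0.0000f, +0.0000f),
    float4(-0.7010f, +0.5291f, -0.8860f, +1.0000f));

// rgb = ycbcrToRGBTransform * float4(y, cb, cr, 1.0)
// Sanity check: y = 1, cb = cr = 0.5 maps to (1, 1, 1), pure white,
// and y = 0, cb = cr = 0.5 maps to (0, 0, 0), pure black.
```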

Antialiasing a SceneKit rendering with Metal

﹥>﹥吖頭↗ Submitted on 2021-01-07 03:52:26
Question: I'm new to Metal. I'm rendering a SceneKit scene with Metal using this Apple sample code. TL;DR: it calls the SCNRenderer's render function and passes in a command buffer. I'm compiling for Big Sur. It works, but it is not antialiased. I've tried a few ways to achieve it, as you can see in the updates below. Without Metal, I'd just set isJitteringEnabled to true on the SCNRenderer and get beautiful (and slow) 96-ish-pass renderings. If I try to do this with Metal, I get weird pixel format
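With an offscreen SCNRenderer pass, the usual route to antialiasing is hardware MSAA: render the scene into a multisample texture and resolve it into the final texture at the end of the pass. A sketch under assumed names (`makeMSAAPassDescriptor`, the 4x sample count, and the caller-supplied `resolveTexture` are illustrative, not from the sample code); the returned descriptor would be handed to SCNRenderer's render(withViewport:commandBuffer:passDescriptor:):

```swift
import Metal

func makeMSAAPassDescriptor(device: MTLDevice, width: Int, height: Int,
                            pixelFormat: MTLPixelFormat,
                            resolveTexture: MTLTexture) -> MTLRenderPassDescriptor {
    // Multisampled (4x) color target the scene is rendered into.
    let desc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: pixelFormat, width: width, height: height, mipmapped: false)
    desc.textureType = .type2DMultisample
    desc.sampleCount = 4
    desc.usage = .renderTarget
    desc.storageMode = .private
    let msaaTexture = device.makeTexture(descriptor: desc)!

    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = msaaTexture
    pass.colorAttachments[0].resolveTexture = resolveTexture
    pass.colorAttachments[0].loadAction = .clear
    // Average the 4 samples per pixel into resolveTexture when the pass ends.
    pass.colorAttachments[0].storeAction = .multisampleResolve
    return pass
}
```

The render pipeline (or SCNRenderer configuration) must use the same sample count as the attachment, and the resolve texture's pixel format must match the multisample target's.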

What is the purpose of declaring a variable with "const constant"?

蓝咒 Submitted on 2020-06-27 18:41:26
Question: In a Metal shader, what is the purpose of declaring a variable like const constant Vertex *vertexArray [[buffer(0)]] (I mean the const constant part)? Why isn't constant alone enough? Also, what is the difference between constant and const? And likewise, what is the difference between const device and constant? Answer 1: const is a type qualifier. constant and device are address spaces. const prevents you from modifying the thing to which it applies: int a = 15; a = 16; // Fine; reassigning
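The distinction the answer draws can be illustrated with a short Metal Shading Language sketch (the `Vertex` struct and function names here are placeholders):

```metal
#include <metal_stdlib>
using namespace metal;

struct Vertex { float4 position; };

// `constant` is an address space: read-only device memory, shared by
// all threads and optimized for data every thread reads.
// `device` is its read-write counterpart, and there `const` matters:
//   device Vertex *v        -> v[i].position may be written
//   const device Vertex *v  -> read-only through this pointer
// Data in the `constant` address space is read-only regardless, so
// `const constant` mainly documents intent rather than adding a new rule.
vertex float4 vertexShader(const constant Vertex *vertexArray [[buffer(0)]],
                           uint vid [[vertex_id]])
{
    return vertexArray[vid].position;
}
```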