metalkit

How to render each pixel of a bitmap texture to each native physical pixel of the screen on macOS?

不羁岁月 submitted on 2019-12-02 13:51:11
As modern macOS devices use a scaled HiDPI resolution by default, bitmap images get blurred on screen. Is there a way to render a bitmap pixel by pixel to the true native physical pixels of the display? Is there any CoreGraphics, OpenGL, or Metal API that would allow this without changing the display mode of the screen? If you are thinking of convertXXXXToBacking and friends, stop. Here is the explanation: a typical 13-inch MacBook Pro now has a native resolution of 2560x1600 pixels, yet the default recommended screen resolution after a fresh macOS install is 1440x900. The user can …
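As a hedged, minimal sketch (Swift) of the mismatch the question describes, one can enumerate the display's modes, including the low-resolution duplicates, and compare each mode's UI size with its pixel size. This only inspects the modes, it does not switch them:

    import CoreGraphics

    let displayID = CGMainDisplayID()
    let options = [kCGDisplayShowDuplicateLowResolutionModes: kCFBooleanTrue] as CFDictionary

    if let modes = CGDisplayCopyAllDisplayModes(displayID, options) as? [CGDisplayMode] {
        for mode in modes {
            // In a 1:1 (non-HiDPI) mode the UI size equals the pixel size; the
            // largest such mode corresponds to the panel's physical pixel grid.
            let isOneToOne = (mode.pixelWidth == mode.width)
            print("\(mode.width)x\(mode.height) points -> " +
                  "\(mode.pixelWidth)x\(mode.pixelHeight) pixels, 1:1: \(isOneToOne)")
        }
    }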

Metal RGB to YUV conversion compute shader

混江龙づ霸主 submitted on 2019-12-02 03:19:33
I am trying to write a Metal compute shader for converting from RGB to YUV, but am getting build errors.

    typedef struct {
        float3x3 matrix;
        float3 offset;
    } ColorConversion;

    // Compute kernel
    kernel void kernelRGBtoYUV(texture2d<half, access::sample> inputTexture [[ texture(0) ]],
                               texture2d<half, access::write> textureY [[ texture(1) ]],
                               texture2d<half, access::write> textureCbCr [[ texture(2) ]],
                               constant ColorConversion &colorConv [[ buffer(0) ]],
                               uint2 gid [[thread_position_in_grid]])
    {
        // Make sure we don't read or write outside of the texture
        if ((gid.x >= inputTexture.get_width()) || (gid …
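The excerpt cuts off mid-kernel. As a complement, here is a hedged host-side sketch in Swift of how a kernel like this is typically dispatched. Only kernelRGBtoYUV and the ColorConversion layout come from the excerpt; the BT.601 coefficients, function names, and everything else are assumptions, not the asker's code:

    import Metal
    import simd

    // Swift mirror of the shader struct; with float3x3 + float3 the Swift and
    // MSL layouts happen to line up (48 + 16 bytes), but verify on your target.
    struct ColorConversion {
        var matrix: simd_float3x3
        var offset: SIMD3<Float>
    }

    func encodeRGBtoYUV(device: MTLDevice, queue: MTLCommandQueue,
                        input: MTLTexture, yPlane: MTLTexture, cbcrPlane: MTLTexture) throws {
        let library = try device.makeDefaultLibrary(bundle: .main)
        guard let function = library.makeFunction(name: "kernelRGBtoYUV") else { return }
        let pipeline = try device.makeComputePipelineState(function: function)

        // BT.601 full-range matrix: one plausible choice for the conversion.
        var conversion = ColorConversion(
            matrix: simd_float3x3(rows: [
                SIMD3<Float>( 0.299,  0.587,  0.114),
                SIMD3<Float>(-0.169, -0.331,  0.500),
                SIMD3<Float>( 0.500, -0.419, -0.081)]),
            offset: SIMD3<Float>(0.0, 0.5, 0.5))

        guard let commandBuffer = queue.makeCommandBuffer(),
              let encoder = commandBuffer.makeComputeCommandEncoder() else { return }
        encoder.setComputePipelineState(pipeline)
        encoder.setTexture(input, index: 0)
        encoder.setTexture(yPlane, index: 1)
        encoder.setTexture(cbcrPlane, index: 2)
        encoder.setBytes(&conversion, length: MemoryLayout<ColorConversion>.stride, index: 0)

        // Cover the whole texture, rounding the grid up to threadgroup multiples.
        let w = pipeline.threadExecutionWidth
        let h = pipeline.maxTotalThreadsPerThreadgroup / w
        let groups = MTLSize(width: (input.width + w - 1) / w,
                             height: (input.height + h - 1) / h, depth: 1)
        encoder.dispatchThreadgroups(groups,
                                     threadsPerThreadgroup: MTLSize(width: w, height: h, depth: 1))
        encoder.endEncoding()
        commandBuffer.commit()
    }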

Texture Brush (Drawing Application ) Using Metal

跟風遠走 submitted on 2019-12-01 01:54:34
I am trying to implement a Metal-backed drawing application where brushstrokes are drawn on an MTKView by stamping a textured square repeatedly along the finger position. I am drawing each stamp with alpha 0.2, but where the squares overlap the color adds up. How can I draw with a uniform alpha of 0.2? I think you need to draw the brush squares to a separate texture, initially cleared to transparent, without blending. Then draw that whole texture to your view with blending. If you draw the brush squares directly to the view, then they will accumulate. After you draw square 1, it's part of the image. Metal can no longer …
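A minimal sketch (Swift) of the two-pass approach the answer describes, assuming a hypothetical intermediate "canvas" texture; the pixel format, names, and pass structure are placeholders, not the asker's code:

    import Metal

    // An intermediate canvas the brush quads are stamped into without blending;
    // the view then samples this texture once, with normal alpha blending.
    func makeCanvasTexture(device: MTLDevice, width: Int, height: Int) -> MTLTexture? {
        let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm,
                                                            width: width, height: height,
                                                            mipmapped: false)
        desc.usage = [.renderTarget, .shaderRead]   // render into it, then sample it
        return device.makeTexture(descriptor: desc)
    }

    // Render pass for stamping brush quads: clear to transparent on the first
    // pass of a stroke, then keep loading the accumulated result.
    func brushPassDescriptor(canvas: MTLTexture, isFirstPass: Bool) -> MTLRenderPassDescriptor {
        let pass = MTLRenderPassDescriptor()
        pass.colorAttachments[0].texture = canvas
        pass.colorAttachments[0].loadAction = isFirstPass ? .clear : .load
        pass.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)
        pass.colorAttachments[0].storeAction = .store
        return pass
    }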

Off Screen Rendering Metal

只愿长相守 submitted on 2019-11-29 15:48:34
Question:

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
        print("current drawable size: \(view.drawableSize)")
    }

    func draw(in view: MTKView) {
        guard let drawable = view.currentDrawable else { return }
        let textureDescriptor = MTLTextureDescriptor()
        textureDescriptor.textureType = MTLTextureType.type2D
        textureDescriptor.width = drawable.texture.width
        textureDescriptor.height = drawable.texture.height
        textureDescriptor.pixelFormat = .bgra8Unorm
        textureDescriptor.storageMode = .shared …
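The excerpt stops at the descriptor setup. A hedged sketch of where that pattern usually goes: finish the descriptor, render off-screen, then copy the result into the drawable. Everything past the excerpt is an assumption; note that the whole-texture copy(from:to:) overload needs iOS 13 / macOS 10.15, and the view must have framebufferOnly = false for its drawable to be a blit target:

    import MetalKit

    func drawOffscreen(in view: MTKView, queue: MTLCommandQueue) {
        guard let device = view.device,
              let drawable = view.currentDrawable,
              let commandBuffer = queue.makeCommandBuffer() else { return }

        let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm,
                                                            width: drawable.texture.width,
                                                            height: drawable.texture.height,
                                                            mipmapped: false)
        desc.usage = [.renderTarget, .shaderRead]   // the excerpt's descriptor needs this too
        guard let offscreen = device.makeTexture(descriptor: desc) else { return }

        // ... encode a render pass that targets `offscreen` here ...

        // Copy the off-screen result into the drawable.
        if let blit = commandBuffer.makeBlitCommandEncoder() {
            blit.copy(from: offscreen, to: drawable.texture)
            blit.endEncoding()
        }
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }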

CIImage display MTKView vs GLKView performance

喜你入骨 submitted on 2019-11-27 07:03:41
Question: I have a series of UIImages (made from incoming JPEG data from a server) that I wish to render using MTKView. The problem is that it is too slow compared to GLKView: there is a lot of buffering and delay when I have a series of images to display in an MTKView, but no delay in a GLKView. Here is the MTKView display code:

    private lazy var context: CIContext = {
        return CIContext(mtlDevice: self.device!, options: [CIContextOption.workingColorSpace: NSNull()])
    }()

    var ciImg: CIImage? {
        didSet {
            syncQueue.sync { …
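One commonly suggested fast path, sketched as a method on the same class under assumptions about the rest of it (`context` and `ciImg` come from the excerpt; `commandQueue` and everything else are illustrative): render the CIImage straight into the drawable's texture instead of going through intermediate images, with the view's framebufferOnly set to false:

    import MetalKit
    import CoreImage

    func draw(in view: MTKView) {
        guard let image = ciImg,
              let drawable = view.currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }

        // CIContext writes directly into the drawable's texture on the GPU.
        context.render(image,
                       to: drawable.texture,
                       commandBuffer: commandBuffer,
                       bounds: CGRect(origin: .zero, size: view.drawableSize),
                       colorSpace: CGColorSpaceCreateDeviceRGB())
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }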

How to apply a Vignette CIFilter to a live camera feed in iOS?

喜你入骨 submitted on 2019-11-26 23:09:44
While trying to apply a simple vignette filter to the raw camera feed of an iPhone 6, with the help of Metal and Core Image, I see a lot of lag between the frames being processed and rendered in an MTKView. The approach I have followed is (MetalViewController.swift):
1. Get raw camera output using AVCaptureVideoDataOutputSampleBufferDelegate.
2. Convert CMSampleBuffer > CVPixelBuffer > CGImage.
3. Create an MTLTexture with this CGImage. (Points 2 and 3 are inside the method named fillMTLTextureToStoreTheImageData.)
4. Apply a CIFilter to the CIImage fetched from the MTLTexture in the MTKViewDelegate …
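A hedged sketch of a lower-latency variant of steps 1-4: build the CIImage directly from the CVPixelBuffer, skipping the CGImage and MTLTexture detour, and apply the vignette in the delegate callback. `filteredImage` is a hypothetical property for handing the result to the next draw(in:) pass, not the asker's code:

    import AVFoundation
    import CoreImage

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Straight from the capture buffer to Core Image, no CGImage copy.
        let source = CIImage(cvPixelBuffer: pixelBuffer)

        let vignette = CIFilter(name: "CIVignette")!
        vignette.setValue(source, forKey: kCIInputImageKey)
        vignette.setValue(1.0, forKey: kCIInputIntensityKey)
        vignette.setValue(2.0, forKey: kCIInputRadiusKey)
        filteredImage = vignette.outputImage   // rendered later in draw(in:)
    }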
