metal

How to write a SceneKit shader modifier for a dissolve-in effect

ε祈祈猫儿з posted on 2020-07-28 06:17:07
Question: I'd like to build a dissolve-in effect for a SceneKit game. I've been looking into shader modifiers, since they seem to be the most lightweight approach, but I haven't had any luck replicating this effect. Is it possible to use shader modifiers to create this effect? How would you go about implementing one? Answer 1: You can get pretty close to the intended effect with a fragment shader modifier. The basic approach is as follows: sample from a noise texture; if the noise sample is below a certain threshold
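The excerpt is cut off, but the approach it names (noise sample plus threshold test) can be sketched as a SceneKit fragment shader modifier. This is a minimal sketch, not the original answer's code; the modifier source, the `revealage` uniform, and the noise asset are assumptions for illustration.

```swift
import SceneKit

// Fragment shader modifier: discard fragments whose noise value exceeds the
// current reveal threshold, so the surface "dissolves in" as revealage
// animates from 0 to 1.
let dissolveFragmentModifier = """
#pragma arguments
texture2d<float> noiseTexture;
float revealage;

#pragma body
constexpr sampler noiseSampler(filter::linear, address::repeat);
float noise = noiseTexture.sample(noiseSampler, _surface.diffuseTexcoord).r;
if (noise > revealage) {
    discard_fragment();
}
"""

func applyDissolve(to material: SCNMaterial, noise: UIImage) {
    material.shaderModifiers = [.fragment: dissolveFragmentModifier]
    // Custom #pragma arguments are bound by key-value coding on the material.
    material.setValue(SCNMaterialProperty(contents: noise), forKey: "noiseTexture")
    material.setValue(0.0, forKey: "revealage") // animate toward 1.0 to reveal
}
```

Animating `revealage` (for example with `SCNAction.customAction`) then drives the dissolve over time.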

SwiftUI updates reduce FPS of Metal window

喜欢而已 posted on 2020-07-18 19:53:47
Question: I'm experimenting with SwiftUI and Metal. I have two windows: one with various lists and controls, and the other a Metal window. I had the slider data updating the Metal window, but when I moved the slider the FPS dropped from 60 to around 25. I removed all links between the views, and moving the sliders still drops the FPS in the Metal window. It seems that the list views slow down the FPS as well. I create the Metal window on startup using: metalWindow = NSWindow(contentRect: NSRect(x: 0, y: 0
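The question's window-creation code is truncated. As a point of reference, a standalone NSWindow hosting an MTKView is typically set up along these lines; the sizes, style mask, and function name below are illustrative, not recovered from the post.

```swift
import Cocoa
import MetalKit

// Sketch: create a separate window whose content view is an MTKView driven
// by its own delegate, independent of the SwiftUI control window.
func makeMetalWindow(device: MTLDevice, delegate: MTKViewDelegate) -> NSWindow {
    let window = NSWindow(contentRect: NSRect(x: 0, y: 0, width: 800, height: 600),
                          styleMask: [.titled, .closable, .resizable],
                          backing: .buffered,
                          defer: false)
    let mtkView = MTKView(frame: window.contentLayoutRect, device: device)
    mtkView.preferredFramesPerSecond = 60
    mtkView.delegate = delegate
    window.contentView = mtkView
    window.makeKeyAndOrderFront(nil)
    return window
}
```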

Resizing MTKView scales old content before redraw

淺唱寂寞╮ posted on 2020-07-18 06:38:07
Question: I'm using an MTKView to draw Metal content. It's configured as follows: mtkView = MTKView(frame: self.view.frame, device: device) mtkView.colorPixelFormat = .bgra8Unorm mtkView.delegate = self mtkView.sampleCount = 4 mtkView.isPaused = true mtkView.enableSetNeedsDisplay = true. setFrameSize is overridden to trigger a redisplay. Whenever the view resizes, it scales its old content before it redraws everything, which gives a jittery feeling. I tried setting the contentGravity property of the MTKView's
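Restated as compilable code, the configuration from the question plus a setFrameSize override that requests an immediate redraw looks like this. The subclass name is illustrative; the property values are taken from the question.

```swift
import Cocoa
import MetalKit

// MTKView subclass that forces a redraw as soon as its frame changes, so the
// stale drawable is redrawn rather than scaled.
final class ResizeAwareMTKView: MTKView {
    override func setFrameSize(_ newSize: NSSize) {
        super.setFrameSize(newSize)
        needsDisplay = true
    }
}

func configureMetalView(in container: NSView, device: MTLDevice,
                        delegate: MTKViewDelegate) -> MTKView {
    let mtkView = ResizeAwareMTKView(frame: container.frame, device: device)
    mtkView.colorPixelFormat = .bgra8Unorm
    mtkView.delegate = delegate
    mtkView.sampleCount = 4
    mtkView.isPaused = true                 // draw only on demand...
    mtkView.enableSetNeedsDisplay = true    // ...when needsDisplay is set
    return mtkView
}
```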

MTLTexture.getBytes() returning EXC_BAD_ACCESS - Possible reasons?

安稳与你 posted on 2020-07-10 06:48:07
Question: I am using ARKit 3's body-tracking capabilities to overlay a 3D model on top of the camera feed and allow the user to capture the feed as an MP4 with audio. I have been able to draw the camera preview to the MTKView successfully. However, I am receiving an EXC_BAD_ACCESS error, which I cannot seem to debug, when trying to write to my asset writer. My app has a render class which is responsible for rendering the Metal textures from the feed, and a MetalVideoRecorder class which is used to write the
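The excerpt cuts off before the crash site, but two common causes of EXC_BAD_ACCESS in getBytes are a destination buffer smaller than bytesPerRow * height and a texture in .private storage that the CPU cannot read. A hedged sketch guarding both (the function name and the .bgra8Unorm assumption are illustrative):

```swift
import Metal

// Copy a CPU-readable texture's pixels into a byte array, sized so that
// getBytes cannot write past the end of the destination.
func copyPixels(from texture: MTLTexture) -> [UInt8]? {
    // .private textures are not CPU-accessible; blit to a shared texture
    // first, or getBytes will fault.
    guard texture.storageMode != .private else { return nil }

    let bytesPerPixel = 4 // assumes a 4-byte format such as .bgra8Unorm
    let bytesPerRow = texture.width * bytesPerPixel
    var bytes = [UInt8](repeating: 0, count: bytesPerRow * texture.height)
    let region = MTLRegionMake2D(0, 0, texture.width, texture.height)
    bytes.withUnsafeMutableBytes { ptr in
        texture.getBytes(ptr.baseAddress!, bytesPerRow: bytesPerRow,
                         from: region, mipmapLevel: 0)
    }
    return bytes
}
```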

Has video chat for GoogleWebRTC broken with iOS 13.4?

安稳与你 posted on 2020-07-09 11:52:06
Question: I have video chat working via GoogleWebRTC on my old iPhone 6, but no matter what I do I cannot get either the incoming or the outgoing video to render on my iPhone XS running iOS 13.4. This is after 10 straight days of trying to make it work. I have seen one or two other posts from people who can't get it to work, and I was just wondering if anyone out there CAN get video chat to work via GoogleWebRTC on an iOS 13.4 device. With the release of iOS 13.4, perhaps

MPSImageHistogramEqualization throws assertion that offset must be < [buffer length]

怎甘沉沦 posted on 2020-06-29 03:50:42
Question: I'm trying to do histogram equalization using MPSImageHistogramEqualization on iOS, but it ends up throwing an assertion I do not understand. Here is my code: // Calculate Histogram var histogramInfo = MPSImageHistogramInfo( numberOfHistogramEntries: 256, histogramForAlpha: false, minPixelValue: vector_float4(0,0,0,0), maxPixelValue: vector_float4(1,1,1,1)) let calculation = MPSImageHistogram(device: self.mtlDevice, histogramInfo: &histogramInfo) let bufferLength = calculation.histogramSize
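The code excerpt is truncated, but the assertion it describes usually means the histogram buffer is shorter than MPS requires at the encoded offset. A hedged sketch of sizing the buffer with the length MPS itself reports (function names beyond the MPS API are illustrative):

```swift
import MetalPerformanceShaders

// Allocate the histogram result buffer with exactly the length the kernel
// reports for the source pixel format; a shorter buffer triggers the
// "offset must be < [buffer length]" assertion when the kernel is encoded.
func makeHistogramBuffer(device: MTLDevice,
                         calculation: MPSImageHistogram,
                         sourceFormat: MTLPixelFormat) -> MTLBuffer? {
    let bufferLength = calculation.histogramSize(forSourceFormat: sourceFormat)
    return device.makeBuffer(length: bufferLength, options: .storageModePrivate)
}
```

The same length (and the offset passed to `encode(to:sourceTexture:histogram:histogramOffset:)`) must be consistent when the histogram is later fed to MPSImageHistogramEqualization.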

How do I change the pixelFormat in Metal?

旧时模样 posted on 2020-06-28 03:53:30
Question: If I try anything other than bgra8Unorm, it crashes with: -[MTLDebugRenderCommandEncoder validateFramebufferWithRenderPipelineState:]:1192: failed assertion `For color attachment 0, the render pipeline's pixelFormat (MTLPixelFormatBGRA8Unorm_sRGB) does not match the framebuffer's pixelFormat (MTLPixelFormatBGRA8Unorm).' How do I change the framebuffer's pixelFormat, then? I want to be able to do this: pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm_srgb Instead of this
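The assertion says the pipeline and the framebuffer disagree; with an MTKView the framebuffer format comes from the view's colorPixelFormat, so the usual fix is to set that property and reuse it in the pipeline descriptor. A minimal sketch, assuming shader functions named vertexMain/fragmentMain:

```swift
import MetalKit

// Set the drawable's format on the view, then derive the pipeline's format
// from it so the two can never disagree.
func makePipeline(view: MTKView, library: MTLLibrary) throws -> MTLRenderPipelineState {
    view.colorPixelFormat = .bgra8Unorm_srgb // the framebuffer's format

    let descriptor = MTLRenderPipelineDescriptor()
    descriptor.vertexFunction = library.makeFunction(name: "vertexMain")
    descriptor.fragmentFunction = library.makeFunction(name: "fragmentMain")
    descriptor.colorAttachments[0].pixelFormat = view.colorPixelFormat
    return try view.device!.makeRenderPipelineState(descriptor: descriptor)
}
```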

What is the purpose of declaring a variable with "const constant"?

蓝咒 posted on 2020-06-27 18:41:26
Question: In a Metal shader, what is the purpose of declaring a variable like const constant Vertex *vertexArray [[buffer(0)]] (with const constant, I mean)? Why is constant alone not enough? Also, what is the difference between constant and const? And, in the same vein, what is the difference between const device and constant? Answer 1: const is a type qualifier. constant and device are address spaces. const prevents you from modifying the thing to which it applies: int a = 15; a = 16; // Fine; reassigning
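The distinction the answer draws (const as a type qualifier, constant/device as address spaces) can be illustrated with a short Metal snippet; the Vertex struct and function names here are placeholders, not from the original post.

```metal
#include <metal_stdlib>
using namespace metal;

struct Vertex {
    float4 position;
};

// `constant` says where the data lives (the read-only constant address space);
// `const` says the pointee may not be modified through this pointer.
vertex float4 vertexMain(const constant Vertex *vertexArray [[buffer(0)]],
                         uint vid [[vertex_id]]) {
    // vertexArray[vid].position = float4(0); // error: pointee is const
    return vertexArray[vid].position;
}

// By contrast, `device` memory is read-write unless qualified with const:
kernel void touch(device float *data [[buffer(0)]],
                  uint tid [[thread_position_in_grid]]) {
    data[tid] = 1.0; // legal: non-const device pointer
}
```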