Metal

Metal render with 2 pipelines: the second object overlaps the first object

泪湿孤枕 submitted on 2019-12-07 21:20:01
Question: I have two objects, one with a texture and one without. I use two shaders and two render pipelines to draw them. Both objects draw fine on their own, but when the second object is drawn it overlaps the first, so only the second object is on screen. I don't know where this goes wrong; maybe the pipelines are not used correctly in the last part of the draw() function. The model draws properly first, but when the sky is drawn, the model disappears. Please help.
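No answer is attached to this excerpt, but one object unconditionally painting over another is a classic symptom of a missing depth test rather than a pipeline problem. Below is a minimal Swift sketch, assuming hypothetical `device` and `renderEncoder` names and a render pass that has a depth attachment, of enabling depth testing before encoding the two draws:

```swift
import Metal

// Assumes `device` is your MTLDevice and the render pass has a depth
// attachment (e.g. view.depthStencilPixelFormat = .depth32Float).
func makeDepthState(device: MTLDevice) -> MTLDepthStencilState? {
    let descriptor = MTLDepthStencilDescriptor()
    descriptor.depthCompareFunction = .less  // keep fragments closer to the camera
    descriptor.isDepthWriteEnabled = true    // record depth so later draws are tested
    return device.makeDepthStencilState(descriptor: descriptor)
}

// Inside draw(), set the depth state once, then encode both pipelines:
// renderEncoder.setDepthStencilState(depthState)
// renderEncoder.setRenderPipelineState(modelPipeline)  // ... draw model ...
// renderEncoder.setRenderPipelineState(skyPipeline)    // ... draw sky ...
```

With depth writes enabled, the sky drawn second can no longer cover fragments of the model that are closer to the camera.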

SCN shader modifier in Metal - pass uniform to shader

眉间皱痕 submitted on 2019-12-07 15:48:41
Question: I'm trying to use shader modifiers with Metal. I cannot figure out how to declare uniforms. So far my fragment modifier is:

// color changes
#pragma arguments
float4x4 u_color_transformation;
#pragma body
_output.color.rgb = vec3(1.0) - (u_color_transformation * _output.color).rgb;

This outputs a purple texture, with no log. If I just have _output.color.rgb = (u_color_transformation * _output.color).rgb, things are OK. I think I'm following the doc, but maybe not! The uniform is set with:
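The excerpt cuts off before the uniform-setting code, so what follows is a minimal sketch of the documented key-value-coding route for binding a `#pragma arguments` float4x4 from Swift; the material and matrix value here are hypothetical stand-ins:

```swift
import SceneKit

// Hypothetical material carrying the fragment shader modifier from the question.
let fragmentModifierSource = """
#pragma arguments
float4x4 u_color_transformation;
#pragma body
_output.color.rgb = vec3(1.0) - (u_color_transformation * _output.color).rgb;
"""
let material = SCNMaterial()
material.shaderModifiers = [.fragment: fragmentModifierSource]

// SceneKit exposes #pragma arguments through key-value coding; a float4x4
// argument is passed as an NSValue wrapping an SCNMatrix4.
let transform = SCNMatrix4Identity  // hypothetical value; substitute your matrix
material.setValue(NSValue(scnMatrix4: transform), forKey: "u_color_transformation")
```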

Metal kernels not behaving properly on the new MacBook Pro (late 2016) GPUs

我怕爱的太早我们不能终老 submitted on 2019-12-07 05:47:44
Question: I'm working on a macOS project that uses Swift and Metal for image processing on the GPU. Last week I received my new 15-inch MacBook Pro (late 2016) and noticed something strange with my code: kernels that were supposed to write to a texture did not seem to do so. After a lot of digging, I found that the problem is related to which GPU Metal uses for the computation (AMD Radeon Pro 455 or Intel(R) HD Graphics 530). Initializing the MTLDevice using MTLCopyAllDevices() returns an
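The excerpt ends mid-sentence, but a sketch of making the device choice explicit may help: it enumerates every GPU with MTLCopyAllDevices() and picks the discrete one (preferring the Radeon here is an assumption about the desired behavior):

```swift
import Metal

// List every GPU Metal can see (on a late-2016 MBP: Radeon Pro + Intel HD).
let devices = MTLCopyAllDevices()
for device in devices {
    print(device.name, "low power:", device.isLowPower)
}

// Prefer the discrete GPU; fall back to whatever the system picks by default.
let device = devices.first(where: { !$0.isLowPower })
    ?? MTLCreateSystemDefaultDevice()!
```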

CVMetalTextureCacheCreateTextureFromImage returns -6660 on macOS 10.13

南楼画角 submitted on 2019-12-07 05:15:14
Question: I'm recording the screen of my iPhone to my Mac. As a preview layer, I collect sample buffers directly from an AVCaptureVideoDataOutput, from which I create textures and render them with Metal. The problem is that code that worked on macOS prior to 10.13 stopped working after updating to 10.13. Namely:

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(_currentSampleBuffer);
if (!imageBuffer) return;
CVPixelBufferLockBaseAddress(imageBuffer, 0);
size
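The excerpt is truncated, but a -6660 (kCVReturnError) from CVMetalTextureCacheCreateTextureFromImage on macOS 10.13 is commonly reported when the incoming pixel buffers are not IOSurface-backed. A hedged sketch of requesting Metal-compatible buffers from the capture output (the `videoOutput` name is hypothetical):

```swift
import AVFoundation
import CoreVideo

// Hypothetical capture output feeding the Metal texture cache.
let videoOutput = AVCaptureVideoDataOutput()

// On macOS 10.13+, CVMetalTextureCacheCreateTextureFromImage needs
// IOSurface-backed buffers; asking for Metal compatibility ensures that.
videoOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
    kCVPixelBufferMetalCompatibilityKey as String: true,
]
```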

iOS Metal line width

旧巷老猫 submitted on 2019-12-07 03:35:48
Question: I would like to set the width of a line that I'm drawing in Metal. I can set the size of a point with point_size, as explained here: https://developer.apple.com/library/prerelease/ios/documentation/Metal/Reference/MTLRenderCommandEncoder_Ref/index.html But I'm not sure how it works with lines.

Answer 1: The short answer is that there is no way to control line width in Metal the way you can control point size. Even in the OpenGL graphics API, the function to do this (which used to exist as glLineWidth
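The answer is cut off, but the workaround it is leading into is the usual one: expand each line segment into a quad of two triangles whose width you control yourself. A minimal CPU-side sketch of that expansion, assuming 2D positions in normalized device coordinates:

```swift
import simd

// Expand a 2D segment into two triangles forming a quad of the given width.
// Positions are assumed to already be in normalized device coordinates.
func thickLineVertices(from a: SIMD2<Float>, to b: SIMD2<Float>,
                       width: Float) -> [SIMD2<Float>] {
    let dir = simd_normalize(b - a)
    let normal = SIMD2<Float>(-dir.y, dir.x) * (width / 2)
    let (a0, a1, b0, b1) = (a + normal, a - normal, b + normal, b - normal)
    // Two triangles sharing the diagonal a1-b0: (a0, a1, b0) and (b0, a1, b1).
    return [a0, a1, b0, b0, a1, b1]
}
```

The six vertices can be drawn with a plain `.triangle` primitive, so the thickness survives on every GPU regardless of line-primitive support.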

iOS 10 - can't render SpriteKit scene within SceneKit with OpenGL

拥有回忆 submitted on 2019-12-06 22:32:26
Question: Since updating to iOS 10, I'm no longer able to render a SpriteKit scene onto a SceneKit node when using OpenGL for rendering. Things work fine with Metal. The error log: "Failed to create IOSurface image (texture)". I used to be able to do something like:

class ViewController: UIViewController {
    @IBOutlet weak var scnView: SCNView!
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        scnView.showsStatistics = true
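The excerpt shows no fix for the OpenGL path; since the question says things work with Metal, one pragmatic route is to request the Metal rendering API explicitly when the view is created in code. A sketch, assuming you are not instantiating the SCNView from a storyboard:

```swift
import SceneKit

// Request Metal explicitly instead of OpenGL ES when creating the view in code.
let options = [SCNView.Option.preferredRenderingAPI.rawValue:
               SCNRenderingAPI.metal.rawValue]
let scnView = SCNView(frame: .zero, options: options)
```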

Confusion about CIContext, OpenGL and Metal (Swift). Does CIContext use CPU or GPU by default?

浪子不回头ぞ submitted on 2019-12-06 14:12:27
So I'm making an app where some of the main features revolve around applying CIFilters to images.

let context = CIContext()
let context = CIContext(eaglContext: EAGLContext(api: .openGLES3)!)
let context = CIContext(mtlDevice: MTLCreateSystemDefaultDevice()!)

All of these give me about the same CPU usage (70%) on my CameraViewController, where I apply filters to frames and update the image view. All of these seem to work exactly the same way, which makes me think I am missing some vital piece of information. For example, using AVFoundation I get each frame from the camera, apply the filters, and
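One plausible explanation for the identical CPU usage is the per-frame round trip through UIImage and the image view, rather than the context itself. A sketch of keeping the filtered frame on the GPU by rendering the CIImage straight into a Metal texture (the texture would typically come from an MTKView drawable; names here are hypothetical):

```swift
import CoreImage
import Metal

// A Metal-backed Core Image context; create once and reuse across frames.
let device = MTLCreateSystemDefaultDevice()!
let commandQueue = device.makeCommandQueue()!
let ciContext = CIContext(mtlDevice: device)

// Hypothetical per-frame render: draw `filtered` straight into a drawable's
// texture instead of converting to UIImage (which forces work onto the CPU).
func render(_ filtered: CIImage, into texture: MTLTexture) {
    let commandBuffer = commandQueue.makeCommandBuffer()!
    ciContext.render(filtered,
                     to: texture,
                     commandBuffer: commandBuffer,
                     bounds: filtered.extent,
                     colorSpace: CGColorSpaceCreateDeviceRGB())
    commandBuffer.commit()
}
```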

Manually set a 1D Texture in Metal

元气小坏坏 submitted on 2019-12-06 13:47:54
Question: I'm trying to fill a 1D texture with values manually and pass that texture to a compute shader (these are 2 pixels that I want to set via code; they don't represent any image). Given how few Metal examples currently exist, all the examples I could find deal with 2D textures that are loaded by converting a UIImage to raw bytes, but creating a dummy UIImage felt like a hack to me. This is the "naive" way I started with:

... var manualTextureData: [Float] = [ 1.0, 0.0, 0.0,
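The snippet is cut off, but filling a texture from an array never needs a UIImage. A sketch of creating a two-pixel 1D texture and writing it with replace(region:mipmapLevel:withBytes:bytesPerRow:); the rgba32Float format is an assumption chosen to match the [Float] source:

```swift
import Metal

let device = MTLCreateSystemDefaultDevice()!

// Describe a 2-pixel 1D texture; rgba32Float matches a [Float] source array.
let descriptor = MTLTextureDescriptor()
descriptor.textureType = .type1D
descriptor.pixelFormat = .rgba32Float
descriptor.width = 2
descriptor.usage = .shaderRead

let texture = device.makeTexture(descriptor: descriptor)!

// Two RGBA pixels, written straight from CPU memory; no UIImage involved.
let pixels: [Float] = [1, 0, 0, 1,   0, 1, 0, 1]
pixels.withUnsafeBytes { bytes in
    texture.replace(region: MTLRegionMake1D(0, 2),
                    mipmapLevel: 0,
                    withBytes: bytes.baseAddress!,
                    bytesPerRow: pixels.count * MemoryLayout<Float>.stride)
}
```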

Convert uintptr_t to id<MTLTexture>

一曲冷凌霜 submitted on 2019-12-06 12:40:04
Question: I wanted to create a simple iOS plugin that can draw a texture into a Unity Texture2D. I've done it with CreateExternalTexture() and UpdateExternalTexture(), and it works fine, but I'm curious whether I can fill the Unity texture straight from the iOS side. Here's the code of my iOS plugin:

//
// testTexturePlugin.m
// Unity-iPhone
//
// Created by user on 18/01/16.
//
//
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>
#import <UIKit/UIKit.h>
#include "UnityMetalSupport.h"
#include <stdlib
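The plugin source is truncated, but for the conversion named in the title, here is a Swift sketch of turning the integer pointer Unity passes (e.g. from Texture2D.GetNativeTexturePtr()) back into an MTLTexture without taking ownership of it:

```swift
import Metal

// Unity hands the native texture pointer across the bridge as an integer;
// turn it back into a Metal texture. Unity still owns the object, so we use
// takeUnretainedValue() and do not touch its retain count.
func metalTexture(from pointer: UInt) -> MTLTexture? {
    guard let raw = UnsafeRawPointer(bitPattern: pointer) else { return nil }
    return Unmanaged<AnyObject>.fromOpaque(raw).takeUnretainedValue() as? MTLTexture
}
```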
