directx-11

How to set RenderState in DirectX11?

笑着哭i submitted on 2019-12-07 21:50:52
Question: How do I set a render state in DirectX 11? device->SetRenderState() doesn't seem to exist anymore.

Answer 1: There are multiple render states in DirectX 11:

Blend State - http://msdn.microsoft.com/en-us/library/ff476349.aspx
Depth Stencil State - http://msdn.microsoft.com/en-us/library/ff476375.aspx
Rasterizer State - http://msdn.microsoft.com/en-us/library/ff476580.aspx
Sampler State - http://msdn.microsoft.com/en-us/library/ff476588.aspx

Which one do you need?

Answer 2: These enumerations may be what
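To illustrate, a minimal sketch of creating and binding one of these state objects, a rasterizer state (the device and context variables are assumptions, not from the question):

    // Replaces fixed-function calls like SetRenderState(D3DRS_CULLMODE, ...)
    D3D11_RASTERIZER_DESC rsDesc = {};
    rsDesc.FillMode = D3D11_FILL_SOLID;
    rsDesc.CullMode = D3D11_CULL_BACK;
    rsDesc.DepthClipEnable = TRUE;

    ID3D11RasterizerState* rsState = nullptr;
    HRESULT hr = device->CreateRasterizerState(&rsDesc, &rsState);
    if (SUCCEEDED(hr))
        context->RSSetState(rsState); // bind on the immediate context

The other three state objects follow the same describe/create/bind pattern (CreateBlendState/OMSetBlendState, CreateDepthStencilState/OMSetDepthStencilState, CreateSamplerState/PSSetSamplers).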

Specifying the target layer of a 3D rendertarget in vertex shader? [HLSL]

亡梦爱人 submitted on 2019-12-07 14:47:52
Question: When working in HLSL/DirectX 11 I see there are two methods for binding a 3D render target: either you bind the entire target, or you bind it while specifying a layer. If you bind the entire target, how does one specify the layer in HLSL code to which the output color is applied? I have a suspicion this requires a geometry shader... is that correct? Is there any other approach which would allow this to be done in the vertex shader or elsewhere?

Answer 1: If you bind your whole volume texture (or
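As a sketch of the geometry-shader route the question suspects: when the whole 3D target is bound, SV_RenderTargetArrayIndex selects the destination slice (HLSL; the structs and the hard-coded slice are illustrative):

    struct GSInput  { float4 pos : SV_Position; };
    struct GSOutput { float4 pos : SV_Position; uint slice : SV_RenderTargetArrayIndex; };

    [maxvertexcount(3)]
    void GS(triangle GSInput tri[3], inout TriangleStream<GSOutput> stream)
    {
        GSOutput o;
        o.slice = 2; // write this triangle into layer 2 of the 3D render target
        for (int i = 0; i < 3; ++i)
        {
            o.pos = tri[i].pos;
            stream.Append(o);
        }
    }

On hardware exposing the optional VPAndRTArrayIndexFromAnyShaderFeedingRasterizer feature, SV_RenderTargetArrayIndex may also be written directly from the vertex shader, skipping the geometry shader.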

Directx 11, send multiple textures to shader

吃可爱长大的小学妹 submitted on 2019-12-07 02:01:15
Question: Using this code I can send one texture to the shader:

devcon->PSSetShaderResources(0, 1, &pTexture);

Of course, I made the pTexture with D3DX11CreateShaderResourceViewFromFile. Shader:

Texture2D Texture;
return color * Texture.Sample(ss, texcoord);

I'm currently only sending one texture to the shader, but I would like to send multiple textures; how is this possible? Thank you.

Answer 1: By using Texture Arrays. When you fill out your D3D11_TEXTURE2D_DESC, look at the ArraySize member. This desc struct
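Besides the texture-array approach in the answer, separate textures can simply be bound to consecutive slots; a minimal sketch (pNormalMap is an assumed second shader resource view):

    // Bind two shader resource views to slots t0 and t1 in one call
    ID3D11ShaderResourceView* views[2] = { pTexture, pNormalMap };
    devcon->PSSetShaderResources(0, 2, views);

On the HLSL side, declare Texture2D Texture : register(t0); and Texture2D NormalMap : register(t1); to match the slots.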

How can I migrate between versions?

岁酱吖の submitted on 2019-12-06 16:01:44
What are the changes from DirectX 10 to 11? I've written some code in DirectX 10 and I want to change it to DirectX 11. Is this just about quality, so that I can do it just by changing headers and DLL files, or have functions and the way of coding changed?

First, I need to say that nothing will change if you just change your D3D10DoSomething() functions to D3D11DoSomething(). They will do the same things; there is no passive gain. You must use the new features explicitly to make your app better: features that D3D10 doesn't have, such as hardware tessellation, compute shaders, and much more. So,
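As a concrete example of how close the rename-only migration is: device creation keeps the same shape, the main structural change being that D3D11 splits the device from a new immediate context (a minimal sketch):

    #include <d3d11.h>

    // D3D10: D3D10CreateDevice(...) returned a single combined device.
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr; // new in D3D11
    D3D_FEATURE_LEVEL fl;
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0,               // default feature levels
        D3D11_SDK_VERSION, &device, &fl, &context);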

DirectX11 CreateWICTextureFromMemory Using PNG

北城余情 submitted on 2019-12-06 09:20:51
I've currently got textures loading using CreateWICTextureFromFile, however I'd like a little more control over it, and I'd like to store images in their byte form in a resource loader. Below are just two sets of test code that return two separate results, and I'm looking for any insight into a possible solution.

ID3D11ShaderResourceView* srv;
std::ifstream file("image.png", std::ios::binary); // basic_ifstream<unsigned char> has no standard char_traits
file.seekg(0, std::ios::end);
int length = (int)file.tellg();
file.seekg(0, std::ios::beg);
unsigned char* buffer = new unsigned char[length];
file.read(reinterpret_cast<char*>(buffer), length);
file.close();
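For the CreateWICTextureFromMemory path the title asks about, a sketch of handing that buffer to the DirectXTK loader (device is an assumed ID3D11Device*):

    #include <WICTextureLoader.h> // DirectXTK

    HRESULT hr = DirectX::CreateWICTextureFromMemory(
        device,
        buffer,                      // bytes of the PNG file read above
        static_cast<size_t>(length), // size in bytes
        nullptr,                     // optional ID3D11Resource**
        &srv);
    delete[] buffer; // the loader copies the data, so the buffer can be freed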

Copy ffmpeg d3dva texture resource to shared rendering texture

隐身守侯 submitted on 2019-12-06 08:55:54
Question: I'm using ffmpeg to decode video via d3dva based on this example: https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/hw_decode.c. I'm able to successfully decode video. What I need to do next is to render the decoded NV12 frame. I have created a Direct3D rendering texture based on this example, https://github.com/balapradeepswork/D3D11NV12Rendering, and set it as shared.

D3D11_TEXTURE2D_DESC texDesc;
texDesc.Format = DXGI_FORMAT_NV12; // Pixel format
texDesc.Width = width;             // Width of the video
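A sketch of the copy step that usually follows, assuming the d3d11va hwaccel (frame, context, and sharedTex are assumptions): with d3d11va, AVFrame::data[0] carries the decoder's ID3D11Texture2D* and data[1] the array slice, which CopySubresourceRegion can copy into the shared NV12 texture:

    ID3D11Texture2D* decoderTex = reinterpret_cast<ID3D11Texture2D*>(frame->data[0]);
    UINT arraySlice = static_cast<UINT>(reinterpret_cast<intptr_t>(frame->data[1]));

    context->CopySubresourceRegion(
        sharedTex, 0,           // dest resource, dest subresource
        0, 0, 0,                // dest x, y, z
        decoderTex, arraySlice, // source resource, source subresource (slice)
        nullptr);               // copy the whole slice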

Load texture in directX 11.1

泄露秘密 submitted on 2019-12-06 07:59:45
Question: I'm reading the http://www.braynzarsoft.net/ tutorials for DX11, but I'm mainly learning programming on DX11.1 with Metro-style apps. As I continue to learn, I find that some features in DX11 are no longer in DX11.1, like D3DX11CreateShaderResourceViewFromFile, which that tutorial used to load a texture, but in DX11.1 we don't have this! My question is: how can I load a DDS texture in DX11.1? I want to replace that function in this code so that I can load a DDS texture: hr =
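A sketch of the usual replacement, DDSTextureLoader from DirectXTK, which works in Metro-style DX11.1 apps (the file name and device variable are assumptions):

    #include <DDSTextureLoader.h> // DirectXTK

    ID3D11ShaderResourceView* textureView = nullptr;
    HRESULT hr = DirectX::CreateDDSTextureFromFile(
        device,          // ID3D11Device*
        L"texture.dds",  // path to the DDS file
        nullptr,         // optional ID3D11Resource**
        &textureView);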

DX11 convert pixel format BGRA to RGBA

女生的网名这么多〃 submitted on 2019-12-06 07:19:58
Question: I currently have the problem that a library creates a DX11 texture with the BGRA pixel format, but the displaying library can only display RGBA correctly. (This means the colors are swapped in the rendered image.) After looking around I found a simple for-loop to solve the problem, but the performance is not very good and scales badly with higher resolutions. I'm new to DirectX and maybe I just missed a simple function to do the conversion.

// Get the image data
unsigned char* pDest = view->image-
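One way to skip the CPU loop entirely is to swizzle when the texture is sampled; an HLSL sketch (resource and entry-point names are assumptions, not from the question):

    Texture2D    Image : register(t0);
    SamplerState Smp   : register(s0);

    float4 PS(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
    {
        // Read the BGRA texel and reorder the channels on the fly
        return Image.Sample(Smp, uv).bgra;
    }

The swizzle costs essentially nothing on the GPU, so it scales with resolution far better than a per-pixel loop on the CPU.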

DirectX::XMMATRIX __declspec(align('16')) won't be aligned

折月煮酒 submitted on 2019-12-06 06:48:27
Question: I am using DirectXMath in building my 3D simulation project.

void SetConstantBuffer(ID3D11DeviceContext* _device_context,
                       DirectX::XMMATRIX _world,
                       DirectX::XMMATRIX _view,
                       DirectX::XMMATRIX _projection)
{
    ConstantBuffer const_buffer;
    const_buffer.World = DirectX::XMMatrixTranspose(_world);
    const_buffer.View = DirectX::XMMatrixTranspose(_view);
    const_buffer.Projection = DirectX::XMMatrixTranspose(_projection);
    _device_context->UpdateSubresource(m_const_buffer, 0, NULL, &const_buffer, 0, 0);
}

I
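The common workaround for the align(16) complaint is to keep unaligned XMFLOAT4X4 storage in structs and class members, hold XMMATRIX only in locals, and pass matrices as FXMMATRIX; a minimal sketch (the ConstantBuffer layout is assumed):

    #include <DirectXMath.h>

    struct ConstantBuffer
    {
        // Storage types with no alignment requirement: safe as members
        DirectX::XMFLOAT4X4 World;
        DirectX::XMFLOAT4X4 View;
        DirectX::XMFLOAT4X4 Projection;
    };

    void FillWorld(ConstantBuffer& cb, DirectX::FXMMATRIX world)
    {
        // Compute in aligned SIMD registers, store to unaligned memory
        DirectX::XMStoreFloat4x4(&cb.World, DirectX::XMMatrixTranspose(world));
    }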

DirectX11 Desktop duplication not working with NVIDIA

跟風遠走 submitted on 2019-12-06 05:59:13
Question: I'm trying to use the DirectX desktop duplication API. I tried running examples from http://www.codeproject.com/Tips/1116253/Desktop-Screen-Capture-on-Windows-via-Windows-Desk and from https://code.msdn.microsoft.com/windowsdesktop/Desktop-Duplication-Sample-da4c696a. Both of these are examples of screen capture using DXGI. I have an NVIDIA GeForce GTX 1060 with Windows 10 Pro on the machine. It has an Intel Core i7-6700HQ processor. These examples work perfectly fine when NVIDIA Control Panel > 3D
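For context, a sketch of the duplication setup those samples perform; on hybrid NVIDIA/Intel machines, DuplicateOutput typically fails with DXGI_ERROR_UNSUPPORTED unless the D3D11 device was created on the adapter that currently owns the output (device is an assumed ID3D11Device*):

    IDXGIDevice* dxgiDevice = nullptr;
    device->QueryInterface(__uuidof(IDXGIDevice), (void**)&dxgiDevice);
    IDXGIAdapter* adapter = nullptr;
    dxgiDevice->GetAdapter(&adapter);
    IDXGIOutput* output = nullptr;
    adapter->EnumOutputs(0, &output); // first output on this adapter

    IDXGIOutput1* output1 = nullptr;
    output->QueryInterface(__uuidof(IDXGIOutput1), (void**)&output1);

    IDXGIOutputDuplication* duplication = nullptr;
    HRESULT hr = output1->DuplicateOutput(device, &duplication);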