directx-11

Media Foundation can't decode video on certain NVIDIA cards

时光总嘲笑我的痴心妄想 submitted on 2021-02-20 04:36:26
Question: We are using the SharpDX .NET wrapper to make Media Foundation decode an MP4 video into a DirectX9 texture. This works fine, except that it crashes on certain NVIDIA cards, for example the 940MX. The same machine with an Intel HD graphics card decodes the video stream fine. Unfortunately, we can't get any details beyond an E_FAIL/Unspecified error from SharpDX, even with the debug layer enabled. Any ideas how to find out why it crashes? VideoMediaType outputVideoFormat;

Simplest way to draw a line in DirectX 11 (C++)?

假如想象 submitted on 2021-02-18 18:15:15
Question: I want to draw a line in my DirectX 11 application. I want it to have a constant width (not depending on the distance from the camera), but it has to be a line in 3D space, something like the lines of objects in wireframe mode. I will render my line in a scene full of other objects with some shaders. What would be the best and simplest way to achieve this in DirectX 11 with C++ (not C#)? A code sample would be appreciated ;) Answer 1: The most common solution would be to use D3D11_PRIMITIVE_TOPOLOGY_LINELIST

CreateTexture2D and CreateDepthStencilView failing

↘锁芯ラ submitted on 2021-02-11 09:50:59
Question: I am trying to learn DirectX 11.0. I have encountered an error. This is the code. // include the basic windows header files and the Direct3D header files #include <windows.h> #include <windowsx.h> #include <d3d11.h> #include <d3dx11.h> #include <d3dx10.h> // include the Direct3D Library file #pragma comment (lib, "d3d11.lib") #pragma comment (lib, "d3dx11.lib") #pragma comment (lib, "d3dx10.lib") // define the screen resolution #define SCREEN_WIDTH 800 #define SCREEN_HEIGHT 600 // global

Failing to properly initialize a 2D texture from memory in Direct3D 11

白昼怎懂夜的黑 submitted on 2021-02-10 15:28:13
Question: I am trying to produce a simple array in system memory that represents an R8G8B8A8 texture and then transfer that texture to GPU memory. First, I allocate an array and fill it with the desired green color data: frame.width = 3; frame.height = 1; auto components = 4; auto length = components * frame.width * frame.height; frame.data = new uint8_t[length]; frame.data[0 + 0 * frame.width] = 0; frame.data[1 + 0 * frame.width] = 255; frame.data[2 + 0 * frame.width] = 0; frame.data[3 + 0 * frame

Can't set the output type on an HEVC decoder IMFTransform

微笑、不失礼 submitted on 2021-02-10 05:30:50
Question: I've written this program to set up an HEVC decoder based on https://docs.microsoft.com/en-us/windows/win32/medfound/supporting-direct3d-11-video-decoding-in-media-foundation. Everything works fine until the end, when I call result = decoder->SetOutputType(0, media_type, 0); this returns the error MF_E_ATTRIBUTENOTFOUND. I'm not sure what's wrong; this error isn't described in the SetOutputType documentation, and I've only found a couple of examples of HEVC decoding with MF, and none of them

d3dx11.lib not found?

穿精又带淫゛_ submitted on 2021-02-07 06:18:05
Question: I'm using Windows 8 / Visual Studio 2012, C++11 and Direct3D 11 for development. I include the Direct3D libraries like this: #pragma comment(lib, "dxgi.lib") #pragma comment(lib, "d3d11.lib") #pragma comment(lib, "d3dx11.lib") // <-- error LNK1104: cannot open file 'd3dx11.lib' #pragma comment(lib, "d3dx10.lib") However, the linker can't seem to find d3dx11.lib. After adding the path where the library is located to the 'Library Directories' of the project, the linker still can't find

DirectX 9 to 11 OpenSharedResource is leaking memory like crazy. Am I doing something wrong?

北城以北 submitted on 2021-01-29 06:17:53
Question: I've been fighting a memory leak in my software, where the virtual address space of my application is slowly used up by shared memory. Based on the amount of memory leaked, it was very clearly in the form of texture objects. I've isolated the bug to the following code sample: I create a shareable DX9 texture object, open it from a D3D11 device, and then release it. In this sample, running on my NVIDIA GeForce 780 Ti on Windows 8.1, my 32-bit process very quickly runs out of VAS as these

CreateBuffer throwing an “Access violation reading location”

二次信任 submitted on 2021-01-28 07:40:07
Question: I have a function, inside a class called ModelClass, that does the following: bool ModelClass::SetVertices(ID3D11Device* device, VertexType* vertices) { // Error catching variable HRESULT result; // Setup the vertex buffer description D3D11_BUFFER_DESC vertexBufferDesc; ZeroMemory(&vertexBufferDesc, sizeof(vertexBufferDesc)); vertexBufferDesc.Usage = D3D11_USAGE_DEFAULT; vertexBufferDesc.ByteWidth = sizeof(vertices)*24; vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER; vertexBufferDesc

(DirectX 11) Dynamic Vertex/Index Buffers implementation with constant scene content changes

回眸只為那壹抹淺笑 submitted on 2021-01-27 23:39:43
Question: I've been delving into unmanaged DirectX 11 for the first time (bear with me), and there's an issue that, although asked several times on the forums, still leaves me with questions. I am developing an app in which objects are added to the scene over time. On each render loop I want to collect all vertices in the scene and render them, reusing a single vertex and index buffer for performance and best practice. My question is about the usage of dynamic vertex and index buffers. I haven't been