direct3d

DirectX texture dimensions

余生颓废 submitted on 2019-12-22 09:37:41
Question: I've discovered that my graphics card automatically resizes textures to powers of 2. That usually isn't a problem, but I need to render only a portion of my texture, and to do so I must know the dimensions it has been resized to. For example, I load a 370x300-pixel picture into my texture and try to draw it with a specific source rectangle:

    RECT test;
    test.left   = 0;
    test.top    = 0;
    test.right  = 370;
    test.bottom = 300;
    lpSpriteHandler->Draw(
        lpTexture,
        &test,   // srcRect
        NULL,    // center
        NULL …
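One way to recover the post-resize dimensions (a sketch, assuming D3D9/D3DX and the lpTexture naming above; the file name is a placeholder) is to ask the texture itself via GetLevelDesc, compare against the original image size from D3DXGetImageInfoFromFile, and scale the source rectangle accordingly:

    // Actual (possibly rounded-up) dimensions of the loaded texture.
    D3DSURFACE_DESC desc;
    lpTexture->GetLevelDesc(0, &desc);

    // Original on-disk dimensions (370x300 in the example).
    D3DXIMAGE_INFO info;
    D3DXGetImageInfoFromFile("picture.png", &info);

    // Scale the source rect from image space into the stretched texture.
    RECT test;
    test.left   = 0;
    test.top    = 0;
    test.right  = MulDiv(370, desc.Width,  info.Width);
    test.bottom = MulDiv(300, desc.Height, info.Height);

Alternatively, loading with D3DXCreateTextureFromFileEx and passing D3DX_DEFAULT_NONPOW2 for the width and height avoids the resize entirely on hardware that supports non-power-of-2 textures.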

How can I improve performance of Direct3D when I'm writing to a single vertex buffer thousands of times per frame?

早过忘川 submitted on 2019-12-22 06:09:46
Question: I am trying to write an OpenGL wrapper that will allow me to use all of my existing graphics code (written for OpenGL) and will route the OpenGL calls to Direct3D equivalents. This has worked surprisingly well so far, except that performance is turning out to be quite a problem. Now, I admit I am most likely using D3D in a way it was never designed for. I am updating a single vertex buffer thousands of times per render loop; every time I draw a "sprite" I send 4 vertices to the GPU with texture …
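The usual remedy for per-sprite locking (a sketch, assuming D3D9 and a vertex buffer created with D3DUSAGE_DYNAMIC; vb, Vertex, and quad are placeholder names) is to append each sprite into one large dynamic buffer with D3DLOCK_NOOVERWRITE and discard only when it wraps, so the CPU never stalls waiting for the GPU:

    // Append 4 sprite vertices; flush (DISCARD) only when the buffer wraps.
    const UINT kCapacity = 4096;         // total vertices in the shared buffer
    static UINT offset = 0;

    DWORD flags = D3DLOCK_NOOVERWRITE;   // promise not to touch in-flight data
    if (offset + 4 > kCapacity) {
        offset = 0;
        flags = D3DLOCK_DISCARD;         // get a fresh buffer, no stall
    }

    void* p = nullptr;
    vb->Lock(offset * sizeof(Vertex), 4 * sizeof(Vertex), &p, flags);
    memcpy(p, quad, 4 * sizeof(Vertex)); // 'quad' holds the sprite's vertices
    vb->Unlock();
    offset += 4;

Batching the draw calls matters just as much: group sprites by texture and issue one DrawPrimitive per run rather than one per sprite.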

How do I use a Direct3D 11 pointer wrapped in ComPtr to get an 11.1 interface?

大城市里の小女人 submitted on 2019-12-22 05:01:16
Question: I'm following tutorials and I've converted the usual initialisation to using ComPtrs, up to this line:

    ID3D11Device*  g_pd3dDevice  = nullptr;
    ID3D11Device1* g_pd3dDevice1 = nullptr;

    // Obtain the Direct3D 11.1 versions if available
    hr = g_pd3dDevice->QueryInterface( __uuidof( ID3D11Device1 ),
                                       reinterpret_cast<void**>( &g_pd3dDevice1 ) );

Here's what I expected to be the straight analog:

    Microsoft::WRL::ComPtr<ID3D11Device>  device  = nullptr;
    Microsoft::WRL::ComPtr<ID3D11Device1> device1 = …
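For reference, the idiomatic ComPtr route goes through the As helper, which wraps QueryInterface and manages the reference count (a minimal sketch; it assumes device was already created by D3D11CreateDevice):

    #include <d3d11_1.h>
    #include <wrl/client.h>

    using Microsoft::WRL::ComPtr;

    ComPtr<ID3D11Device>  device;    // filled in by D3D11CreateDevice
    ComPtr<ID3D11Device1> device1;

    // As() performs the QueryInterface for __uuidof(ID3D11Device1).
    HRESULT hr = device.As(&device1);
    if (SUCCEEDED(hr))
    {
        // device1 now owns a reference to the 11.1 interface.
    }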

Looking for a faster-than-GDI solution for rendering dynamic data plots

流过昼夜 submitted on 2019-12-21 22:34:23
Question: I've written a simple GDI-based data plotter using C++/CLI, but it's not particularly fast (some basic profiling indicates the rendering to screen is the problem). Is there any way to enable hardware acceleration for a UserControl, or is there a .NET interface for Direct3D? Or are there other options I could consider? We're using managed code, so the solution really needs to be CLI-compatible if at all possible. [Edit] In case it helps, I'm rendering strips (128 data points) of …

How do I improve Direct3D streaming texture performance?

心不动则不痛 submitted on 2019-12-21 19:40:54
Question: I'm trying to accelerate the drawing of a full-screen texture which changes every frame. On my system I can get around 1000 FPS using GDI and BitBlt(), but I thought I could improve on that by using Direct3D and dynamic textures. Instead I'm only getting around 250 FPS. I'm running on a Mac Pro with an ATI HD 4870 with current drivers. I've tried using dynamic textures, which gives me a small gain (~15 FPS), and I've tried using a texture chain to avoid pipeline stalls, which has no …
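For context, the dynamic-texture path being described usually looks like the sketch below (D3D9 assumed; device, frameData, width, and height are placeholders). The texture is created with D3DUSAGE_DYNAMIC in D3DPOOL_DEFAULT and locked with D3DLOCK_DISCARD each frame, so the driver can hand back fresh memory instead of stalling on the GPU:

    IDirect3DTexture9* tex = nullptr;
    device->CreateTexture(width, height, 1,
                          D3DUSAGE_DYNAMIC, D3DFMT_X8R8G8B8,
                          D3DPOOL_DEFAULT, &tex, nullptr);

    D3DLOCKED_RECT lr;
    if (SUCCEEDED(tex->LockRect(0, &lr, nullptr, D3DLOCK_DISCARD)))
    {
        // Copy row by row: lr.Pitch may be wider than width * 4 bytes.
        for (UINT y = 0; y < height; ++y)
            memcpy(static_cast<BYTE*>(lr.pBits) + y * lr.Pitch,
                   frameData + y * width * 4, width * 4);
        tex->UnlockRect(0);
    }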

How to get current display mode (resolution, refresh rate) of a monitor/output in DXGI?

[亡魂溺海] submitted on 2019-12-21 03:42:22
Question: I am creating a multi-monitor full-screen DXGI/D3D application. I am enumerating through the available outputs and adapters in preparation for creating their swap chains. When creating a swap chain with DXGI's IDXGIFactory::CreateSwapChain method, I need to provide a swap chain description, which includes a buffer description of type DXGI_MODE_DESC detailing the width, height, refresh rate, etc. How can I find out what the output is currently set to (or how can I find out what the display …
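One approach (a sketch, not necessarily the asker's final solution; output is assumed to come from IDXGIAdapter::EnumOutputs) is to read the output's desktop rectangle from GetDesc and then let FindClosestMatchingMode fill in the refresh rate and the remaining fields:

    DXGI_OUTPUT_DESC outDesc;
    output->GetDesc(&outDesc);  // DesktopCoordinates holds the current resolution

    DXGI_MODE_DESC wanted = {};
    wanted.Width  = outDesc.DesktopCoordinates.right  - outDesc.DesktopCoordinates.left;
    wanted.Height = outDesc.DesktopCoordinates.bottom - outDesc.DesktopCoordinates.top;
    wanted.Format = DXGI_FORMAT_R8G8B8A8_UNORM;  // format is required when no device is passed

    // RefreshRate left zeroed means "don't care"; the match fills it in.
    DXGI_MODE_DESC current = {};
    HRESULT hr = output->FindClosestMatchingMode(&wanted, &current, nullptr);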

Is it possible to use OpenGL ES code with a WPF application via a D3DImage and ANGLE?

大憨熊 submitted on 2019-12-20 10:05:28
Question: Summary (TL;DR version): Ultimately our goal is to be able to use OpenGL ES code in a WPF application natively (i.e. not SharpGL, etc.) and without airspace or driver issues, possibly using Google's ANGLE project. Background: One of the things I like about OpenGL over DirectX is its cross-platform capability. It has excellent support on both OS X and Linux, and also on Android and iOS via ES. However, on Windows, using it is marred with driver issues; or worse, a lot of cards simply don't …

Why does a Direct3D application perform better in full-screen mode?

醉酒当歌 submitted on 2019-12-20 09:15:20
Question: The performance of a Direct3D application seems to be significantly better in full-screen mode than in windowed mode. What are the technical reasons behind this? I guess it has something to do with the fact that a full-screen application can gain exclusive control of the display. But why can't the application gain exclusive control of part of the screen (i.e. a window) and get the same performance benefits?

Answer 1: Here are the cliff notes on how things work underneath. Monitor screen …
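To make the "exclusive control" point concrete, here is a minimal sketch (D3D9 assumed; the mode values are illustrative and hWnd is a placeholder) of the present parameters that request exclusive full-screen mode, where the driver can flip buffers at vsync instead of blitting through the shared desktop surface:

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed                   = FALSE;               // exclusive full-screen
    pp.BackBufferWidth            = 1920;                // must match a supported display mode
    pp.BackBufferHeight           = 1080;
    pp.BackBufferFormat           = D3DFMT_X8R8G8B8;
    pp.SwapEffect                 = D3DSWAPEFFECT_FLIP;  // flip chain, no desktop blit
    pp.FullScreen_RefreshRateInHz = 60;
    pp.PresentationInterval       = D3DPRESENT_INTERVAL_ONE;
    pp.hDeviceWindow              = hWnd;

In windowed mode the same Present call generally has to copy the back buffer into the window's region of the desktop rather than flipping, which is part of why windowed rendering is slower.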

Use D3D11 debug layer with VS2013 on Windows 10

别等时光非礼了梦想. submitted on 2019-12-20 08:42:22
Question: In my D3D11 projects, I always add

    #if (defined(DEBUG) || defined(_DEBUG))
        deviceFlags |= D3D11_CREATE_DEVICE_DEBUG;
    #endif /* (defined(DEBUG) || defined(_DEBUG)) */

to the device creation flags to enable debug output. Since I upgraded to Windows 10, this no longer works. Device creation fails with the following output:

    D3D11CreateDevice: Flags (0x2) were specified which require the D3D11 SDK
    Layers for Windows 10, but they are not present on the system. These flags
    must be removed, or the Windows 10 SDK must be installed.
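On Windows 10 the debug layers ship as the optional "Graphics Tools" feature rather than with the SDK alone. As I understand Microsoft's documentation (worth verifying against current docs), installing that feature, either through Settings > Optional features or with the DISM command below, makes D3D11_CREATE_DEVICE_DEBUG work again:

    Dism /online /add-capability /capabilityname:Tools.Graphics.DirectX~~~~0.0.1.0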