directx

Creating a texture from an image in DX 12 VC++

Submitted by 给你一囗甜甜゛ on 2019-12-13 08:52:44
Question: I just want to know the DirectX 12 API for creating a texture from an image. For DX11 it is D3DX11CreateShaderResourceViewFromFile, and for DX9 it is D3DXCreateTextureFromFileEx; what is it for DX12? Answer 1: Things are a little better by now. Microsoft rewrote their DDSTextureLoader for DX12 and released it as part of their MiniEngine on GitHub: https://github.com/Microsoft/DirectX-Graphics-Samples/blob/master/MiniEngine/Core/DDSTextureLoader.cpp You may also want to take a look at my derivative work that is
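The truncated answer does not name the derivative work, but one related option (an assumption here, not taken from the excerpt) is the DDS loader that ships with the standalone DirectX Tool Kit for DX12, which follows the same pattern as the MiniEngine version. A minimal sketch; the device, command queue and file name are illustrative:

```cpp
// A minimal sketch using the DDS loader from the DirectX Tool Kit for DX12
// (https://github.com/microsoft/DirectXTK12). The device, command queue and
// file name are assumptions for illustration, not from the question.
#include "DDSTextureLoader.h"
#include "ResourceUploadBatch.h"
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12Resource> LoadTexture(ID3D12Device* device,
                                   ID3D12CommandQueue* commandQueue)
{
    ComPtr<ID3D12Resource> texture;

    DirectX::ResourceUploadBatch upload(device);
    upload.Begin();

    // Creates the resource and records the subresource upload into the batch.
    DirectX::CreateDDSTextureFromFile(
        device, upload, L"mytexture.dds", texture.ReleaseAndGetAddressOf());

    // Submit the copy commands and wait for the upload to complete.
    auto finished = upload.End(commandQueue);
    finished.wait();

    return texture;
}
```

For non-DDS formats (PNG, JPEG and so on), the toolkit's WICTextureLoader is used the same way, with CreateWICTextureFromFile in place of the DDS call.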

laptop dual video cards - how to programmatically detect and/or choose which one is used

Submitted by 折月煮酒 on 2019-12-13 06:40:08
Question: We're developing software which uses DirectX for 3D rendering on Windows 7 and later machines, in 64-bit C#/.NET code. We've observed that a number of the newer Dell laptops we're testing on have dual video cards: they have Intel HD 4600 integrated graphics and they also have a faster NVIDIA Quadro card (for example). By default, out of the box, the Intel graphics are used by the DirectX application, presumably to preserve battery life. But the performance is noticeably worse than
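The excerpt cuts off before any answer, but two approaches are commonly combined for this situation (sketched below as assumptions, not as the asker's solution): export the vendor hint symbols so the driver prefers the discrete GPU, and enumerate the adapters through DXGI so the application can create its device on a specific one.

```cpp
#include <dxgi.h>
#include <wrl/client.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

// Exported symbols that hint the NVIDIA/AMD drivers to use the discrete GPU
// for this executable (Optimus / PowerXpress switchable graphics).
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

// Enumerate the adapters so the application can pick one explicitly,
// for example the adapter with the most dedicated video memory.
void ListAdapters()
{
    Microsoft::WRL::ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return;

    Microsoft::WRL::ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, adapter.ReleaseAndGetAddressOf()) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        wprintf(L"Adapter %u: %ls (%llu MB dedicated VRAM)\n", i, desc.Description,
                static_cast<unsigned long long>(desc.DedicatedVideoMemory / (1024 * 1024)));
    }
}
```

Passing the chosen IDXGIAdapter to D3D11CreateDevice (which then requires D3D_DRIVER_TYPE_UNKNOWN) forces the device onto that GPU regardless of the driver's default.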

Precise Texture Overlay

Submitted by 穿精又带淫゛_ on 2019-12-13 06:03:44
Question: I'm trying to set up a two-stage render of objects in a 3D engine I'm working on, written in C++ with DirectX 9, to facilitate transparency (and other things). I thought it was all working nicely until I noticed some dodginess on the edges of objects rendered before objects using this two-stage method. The two-stage method is simple: (1) draw the model to an off-screen ("side") texture of the same size, using the same z-buffer (no MSAA is used anywhere); (2) draw the off-screen ("side") texture over the top of the main render
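The excerpt ends before any answer, but a frequent culprit for edge artefacts with exactly this setup in Direct3D 9 is the texel-to-pixel mapping rule: a screen-sized quad has to be shifted by half a pixel, otherwise the second pass samples the off-screen texture slightly off-centre and object edges smear. A hedged sketch of such a quad; the structure and names are illustrative, not the asker's code:

```cpp
// A pre-transformed (XYZRHW) fullscreen quad with the classic D3D9
// half-pixel offset applied, so each texel of the "side" texture lands
// exactly on one screen pixel when drawn as a triangle strip.
struct ScreenVertex
{
    float x, y, z, rhw;
    float u, v;
};
// FVF: D3DFVF_XYZRHW | D3DFVF_TEX1

void BuildFullscreenQuad(ScreenVertex quad[4], float width, float height)
{
    const float off = -0.5f;  // half-pixel offset for texel/pixel alignment
    quad[0] = { 0.0f  + off, 0.0f   + off, 0.0f, 1.0f, 0.0f, 0.0f };
    quad[1] = { width + off, 0.0f   + off, 0.0f, 1.0f, 1.0f, 0.0f };
    quad[2] = { 0.0f  + off, height + off, 0.0f, 1.0f, 0.0f, 1.0f };
    quad[3] = { width + off, height + off, 0.0f, 1.0f, 1.0f, 1.0f };
}
```

Using point sampling for that overlay pass also helps rule out texture filtering as the source of the artefacts.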

Converting DirectX11 ID3D11Texture2D from Shader into OpenCV IplImage

Submitted by 北慕城南 on 2019-12-13 05:31:57
Question: Short introduction: I have written an augmented-reality application for the Oculus Rift in C++ (DirectX). One of my fragment shaders computes the undistortion for an omnidirectional camera model. The only problem I have now is reading back the rendered undistortion texture2D and converting it for 3D pose tracking/mapping of the real world using OpenCV. The funny thing is, I have already done it the other way around, meaning I have already created shader resource views with my distorted
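Going from GPU to CPU generally comes down to copying the render target into a staging texture and wrapping the mapped bytes. A minimal sketch, assuming a non-multisampled 8-bit BGRA/RGBA texture and using cv::Mat (the modern replacement for IplImage); the function name and parameters are illustrative:

```cpp
#include <d3d11.h>
#include <opencv2/core.hpp>
#include <cstring>

// Copy a GPU texture into a CPU-readable staging texture, map it, and copy
// the pixels into a cv::Mat row by row (RowPitch can be wider than Width*4).
cv::Mat ReadbackTexture(ID3D11Device* device, ID3D11DeviceContext* ctx,
                        ID3D11Texture2D* src)
{
    D3D11_TEXTURE2D_DESC desc;
    src->GetDesc(&desc);
    desc.Usage = D3D11_USAGE_STAGING;
    desc.BindFlags = 0;
    desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
    desc.MiscFlags = 0;

    ID3D11Texture2D* staging = nullptr;
    if (FAILED(device->CreateTexture2D(&desc, nullptr, &staging)))
        return {};
    ctx->CopyResource(staging, src);

    D3D11_MAPPED_SUBRESOURCE mapped;
    cv::Mat image(desc.Height, desc.Width, CV_8UC4);
    if (SUCCEEDED(ctx->Map(staging, 0, D3D11_MAP_READ, 0, &mapped)))
    {
        for (UINT y = 0; y < desc.Height; ++y)
            std::memcpy(image.ptr(y),
                        static_cast<const unsigned char*>(mapped.pData) + y * mapped.RowPitch,
                        desc.Width * 4);
        ctx->Unmap(staging, 0);
    }
    staging->Release();
    return image;
}
```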

Screen Capture of DirectX programs with Java

Submitted by 时间秒杀一切 on 2019-12-13 05:24:33
Question: I wrote a Java application that uses java.awt.Robot to take captures of the screen, and I was wondering if capturing a program using DirectX/OpenGL would be possible? Each time I try to do this, I get a black screen. Answer 1: Yes, it is possible, but maybe only in limited circumstances. I've successfully captured the contents of OpenGL (JOGL) windows on Linux and Windows using Robot's createScreenCapture. Some specific information about the implementation that may be different for you:

Resizing and positioning a SharpDX sprite

Submitted by 两盒软妹~` on 2019-12-13 05:19:40
Question: I'm trying to resize a DirectX texture and place it in the top-right corner of the window. I am drawing the texture using a sprite; here is my code (I am using SharpDX): albumArtSprite.Begin(); NativeMethods.RECT rect; NativeMethods.GetClientRect(device.CreationParameters.HFocusWindow, out rect); float targetDimensions = 150f; var matrix = Matrix.Scaling(targetDimensions / albumArtInformation.Width, targetDimensions / albumArtInformation.Height, 0f) * Matrix.Translation(rect.Width -
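For reference, the same transform written against the native D3DX9 sprite API, which SharpDX's Matrix and Sprite types wrap; everything here is a sketch with illustrative names rather than a drop-in fix:

```cpp
#include <d3dx9.h>

// Scale the texture down to 150x150 and translate it into the top-right
// corner of the client area. Note the z scale of 1.0f: a z scale of 0
// makes the matrix singular, so 1.0f is the usual choice for 2D sprites.
void DrawScaledSpriteTopRight(ID3DXSprite* sprite, IDirect3DTexture9* texture,
                              float texWidth, float texHeight, float clientWidth)
{
    const float target = 150.0f;

    D3DXMATRIX scale, translate, transform;
    D3DXMatrixScaling(&scale, target / texWidth, target / texHeight, 1.0f);
    D3DXMatrixTranslation(&translate, clientWidth - target, 0.0f, 0.0f);
    transform = scale * translate;   // scale first, then move to the corner

    sprite->Begin(D3DXSPRITE_ALPHABLEND);
    sprite->SetTransform(&transform);
    sprite->Draw(texture, nullptr, nullptr, nullptr, D3DCOLOR_XRGB(255, 255, 255));
    sprite->End();
}
```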

IDirect3DDevice9::GetFrontBufferData fails with segmentation fault

Submitted by 只谈情不闲聊 on 2019-12-13 04:24:29
Question: I created a really simple DirectX program to capture the screen. It works well on my machine, but on another machine it fails with a segmentation fault (SIGSEGV) on the following line: g_pd3dDevice->GetFrontBufferData(0, g_pSurface); The following function is used to initialize DirectX: HRESULT InitD3D(HWND hWnd) { D3DDISPLAYMODE ddm; D3DPRESENT_PARAMETERS d3dpp; if((g_pD3D=Direct3DCreate9(D3D_SDK_VERSION))==NULL) { ErrorMessage("Unable to Create Direct3D "); return E_FAIL; } if(FAILED(g_pD3D-
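The excerpt stops before any answer, but a crash inside GetFrontBufferData is very often a destination-surface problem rather than a driver bug: on the failing machine one of the earlier calls may have failed silently, leaving g_pSurface null or mis-created. A hedged sketch of the usual capture pattern with every HRESULT checked, reusing the question's global names:

```cpp
// Capture the front buffer into a system-memory surface, checking every
// step. A null or wrongly created g_pSurface would explain a segmentation
// fault inside GetFrontBufferData on the second machine.
HRESULT CaptureFrontBuffer()
{
    D3DDISPLAYMODE mode;
    if (FAILED(g_pd3dDevice->GetDisplayMode(0, &mode)))
        return E_FAIL;

    // GetFrontBufferData requires an offscreen plain surface in
    // D3DFMT_A8R8G8B8 that matches the display mode's dimensions.
    if (FAILED(g_pd3dDevice->CreateOffscreenPlainSurface(
            mode.Width, mode.Height, D3DFMT_A8R8G8B8,
            D3DPOOL_SYSTEMMEM, &g_pSurface, nullptr)))
        return E_FAIL;

    return g_pd3dDevice->GetFrontBufferData(0, g_pSurface);
}
```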

How do I render a fullscreen frame with a different resolution than my display?

Submitted by 蓝咒 on 2019-12-13 04:22:12
Question: I am in the process of teaching myself DirectX development while going through all the tutorials on http://directxtutorial.com/. I have run into a problem that I believe is related to having multiple monitors. My display is 1920 x 1080. When my code runs in its current state, my primary monitor becomes black with the cursor in the middle. When I set my SCREEN defines at the top of my code to 1920 x 1080, my program runs fine and displays my sprite with the navy background. When I leave the
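The excerpt ends before any answer, but one thing worth checking in a multi-monitor DXGI setup is whether the resolution in the SCREEN defines is a mode the output actually supports in fullscreen. A sketch that lists the supported modes, assuming an existing IDXGIOutput* (not shown in the question):

```cpp
#include <dxgi.h>
#include <vector>
#include <cstdio>

// List the display modes the output supports for a given format; a swap
// chain asked to go fullscreen at an unsupported resolution can end up as
// a black screen instead of the expected image.
void ListDisplayModes(IDXGIOutput* output)
{
    UINT count = 0;
    output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, nullptr);

    std::vector<DXGI_MODE_DESC> modes(count);
    output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, modes.data());

    for (const DXGI_MODE_DESC& m : modes)
        std::printf("%u x %u @ %u/%u Hz\n", m.Width, m.Height,
                    m.RefreshRate.Numerator, m.RefreshRate.Denominator);
}
```

IDXGIOutput::FindClosestMatchingMode can then be used to snap an arbitrary requested resolution to the nearest supported mode before the swap chain is created.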

DirectX 11 depth test not working

Submitted by 断了今生、忘了曾经 on 2019-12-13 03:22:02
Question: I cannot get my program to correctly choose which models to place in front. I have followed the MSDN code exactly. My code appears to draw all polygons correctly within a single DrawIndexed call, but across subsequent calls the models seem to appear in the order they are drawn, not based on whether they are closer to the screen. Here is my code for initializing Direct3D: DXGI_SWAP_CHAIN_DESC sd; ZeroMemory( &sd, sizeof( sd ) ); sd.BufferCount = 1; sd.BufferDesc.Width = width; sd
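The excerpt stops at the swap-chain description, but when depth testing works within a single DrawIndexed call and not across calls, the pieces below are the ones most often missing. A minimal sketch with illustrative names, not the asker's code:

```cpp
#include <d3d11.h>

// The depth-testing pieces that are easy to miss: an enabled depth-stencil
// state, the depth-stencil view bound together with the render target, and
// a per-frame clear of the depth buffer.
void SetUpDepthTesting(ID3D11Device* device, ID3D11DeviceContext* context,
                       ID3D11RenderTargetView* rtv, ID3D11DepthStencilView* dsv)
{
    // 1. A depth-stencil state with depth testing actually enabled.
    D3D11_DEPTH_STENCIL_DESC dsDesc = {};
    dsDesc.DepthEnable    = TRUE;
    dsDesc.DepthWriteMask = D3D11_DEPTH_WRITE_MASK_ALL;
    dsDesc.DepthFunc      = D3D11_COMPARISON_LESS;

    ID3D11DepthStencilState* dsState = nullptr;
    device->CreateDepthStencilState(&dsDesc, &dsState);
    context->OMSetDepthStencilState(dsState, 1);

    // 2. The depth-stencil view must be bound alongside the render target.
    context->OMSetRenderTargets(1, &rtv, dsv);

    // 3. The depth buffer must be cleared at the start of every frame,
    //    otherwise later draw calls test against stale depth values.
    context->ClearDepthStencilView(dsv, D3D11_CLEAR_DEPTH | D3D11_CLEAR_STENCIL,
                                   1.0f, 0);
}
```

A projection matrix whose near plane is 0 can also flatten the depth range, so a small positive near value is worth checking as well.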

DirectX compute shader: how to write a function with a variable-size array argument?

Submitted by 纵然是瞬间 on 2019-12-13 03:07:58
Question: I'm trying to write a function within a compute shader (HLSL) that accepts an argument which is an array of varying size. The compiler always rejects it. Example (not working!): void TestFunc(in uint SA[]) { int K; for (K = 0; SA[K] != 0; K++) { // Some code using SA array } } [numthreads(1, 1, 1)] void CSMain( uint S1[] = {1, 2, 3, 4 }; // Compiler is happy and discovers the array size uint S2[] = {10, 20}; // Compiler is happy and discovers the array size TestFunc(S1); TestFunc(S2); } If I give an