DirectX 11: Model not rendering from Model class


Question


Ok! Here goes. I've updated my code. However, after hours of debugging seemingly perfect code, I can't spot the problem. I've set up multiple breakpoints around the vertex and index buffer creation and the class draw call.

I've created a temporary vtest struct for testing purposes; it carries the following definition:

struct vtest{
    XMFLOAT3 Vertex;
    XMFLOAT4 Color;
};

It is paired with a matching input-layout description (the COLOR offset of 12 bytes corresponds to sizeof(XMFLOAT3)):

D3D11_INPUT_ELEMENT_DESC InputElementDesc[] = {
        { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
        { "COLOR", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0 },  
};

(All of which return (HRESULT) S_OK.)
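For reference, the creation and binding calls behind that S_OK would look roughly like this (VSBlob is a hypothetical ID3DBlob* holding the compiled vertex shader bytecode; it is not part of the original code):

ID3D11InputLayout* g_InputLayout = nullptr;
hr = Device->CreateInputLayout(InputElementDesc, ARRAYSIZE(InputElementDesc),
    VSBlob->GetBufferPointer(), VSBlob->GetBufferSize(), &g_InputLayout);
DeviceContext->IASetInputLayout(g_InputLayout); // without this, the IA stage has no layout bound

The vertex buffer is then described and created: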

D3D11_BUFFER_DESC BufferDescription;
ZeroMemory(&BufferDescription, sizeof(BufferDescription));

BufferDescription.Usage = D3D11_USAGE_DYNAMIC;
BufferDescription.ByteWidth = sizeof(vtest) * sz_vBuffer;
BufferDescription.BindFlags = D3D11_BIND_VERTEX_BUFFER;
BufferDescription.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
BufferDescription.MiscFlags = 0;

D3D11_SUBRESOURCE_DATA SRData;
ZeroMemory(&SRData, sizeof(SRData));
SRData.pSysMem = test;
SRData.SysMemPitch = 0;
SRData.SysMemSlicePitch = 0;

hr = Device->CreateBuffer(&BufferDescription, &SRData, &g_vBuffer);
D3D11_MAPPED_SUBRESOURCE MappedResource;
ZeroMemory(&MappedResource, sizeof(MappedResource));

The vtest struct fills properly, and:

DeviceContext->Map(g_vBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &MappedResource);

Succeeds, also with (HRESULT) S_OK.
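For completeness, the usual continuation of a Map call on a dynamic buffer is a copy followed by Unmap; a minimal sketch, assuming test points at sz_vBuffer elements of vtest (the question's names):

memcpy(MappedResource.pData, test, sizeof(vtest) * sz_vBuffer); // copy the CPU-side vertices in
DeviceContext->Unmap(g_vBuffer, 0); // the buffer must be unmapped before it is drawn from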

The indices are initialized as follows (a one-dimensional DWORD array of indices):
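For illustration only, such an array might look like this (the values are hypothetical; Indices and sz_iBuffer are the question's names):

DWORD Indices[] = {
    0, 1, 2,    // first triangle
    2, 1, 3,    // second triangle
};
UINT sz_iBuffer = ARRAYSIZE(Indices);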

D3D11_BUFFER_DESC iBufferDescription;
ZeroMemory(&iBufferDescription, sizeof(iBufferDescription));

iBufferDescription.Usage = D3D11_USAGE_DEFAULT;
iBufferDescription.ByteWidth = sizeof(DWORD)*sz_iBuffer;
iBufferDescription.BindFlags = D3D11_BIND_INDEX_BUFFER;
iBufferDescription.CPUAccessFlags = 0;
iBufferDescription.MiscFlags = 0;

D3D11_SUBRESOURCE_DATA iSRData;
ZeroMemory(&iSRData, sizeof(iSRData)); // zero the pitch fields rather than leaving them uninitialized
iSRData.pSysMem = Indices;

hr = direct3D.Device->CreateBuffer(&iBufferDescription, &iSRData, &g_iBuffer);

The IASet* calls are made in the draw() call:

DeviceContext->IASetVertexBuffers(0, 1, &g_vBuffer, &stride, &Offset);
DeviceContext->IASetIndexBuffer(g_iBuffer, DXGI_FORMAT_R32_UINT, 0);

Other settings: (Edit: Corrected values to show configuration.)

D3D11_RASTERIZER_DESC DrawStyleState;
DrawStyleState.AntialiasedLineEnable = false;
DrawStyleState.CullMode = D3D11_CULL_NONE;
DrawStyleState.DepthBias = 0;
DrawStyleState.DepthBiasClamp = 0.0f;        // without ZeroMemory, every field of
DrawStyleState.SlopeScaledDepthBias = 0.0f;  // the desc must be set explicitly
DrawStyleState.FillMode = D3D11_FILL_SOLID;
DrawStyleState.DepthClipEnable = false;
DrawStyleState.MultisampleEnable = true;
DrawStyleState.FrontCounterClockwise = false;
DrawStyleState.ScissorEnable = false;
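Note that filling the description alone has no effect; the state object must also be created and bound. A sketch, with g_RasterState as a hypothetical name:

ID3D11RasterizerState* g_RasterState = nullptr;
hr = Device->CreateRasterizerState(&DrawStyleState, &g_RasterState);
DeviceContext->RSSetState(g_RasterState); // otherwise the default rasterizer state is used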

My depth stencil code:

D3D11_TEXTURE2D_DESC DepthStenDescription;
ZeroMemory(&DepthStenDescription, sizeof(D3D11_TEXTURE2D_DESC));

DepthStenDescription.Width = cWidth;
DepthStenDescription.Height = cHeight;
DepthStenDescription.MipLevels = 0;
DepthStenDescription.ArraySize = 1;
DepthStenDescription.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
DepthStenDescription.SampleDesc.Count = 1;
DepthStenDescription.SampleDesc.Quality = 0;
DepthStenDescription.Usage = D3D11_USAGE_DEFAULT;
DepthStenDescription.BindFlags = D3D11_BIND_DEPTH_STENCIL;
DepthStenDescription.CPUAccessFlags = 0;
DepthStenDescription.MiscFlags = 0;

D3D11_DEPTH_STENCIL_VIEW_DESC DSVDesc;
ZeroMemory(&DSVDesc, sizeof(D3D11_DEPTH_STENCIL_VIEW_DESC));
DSVDesc.Format = DepthStenDescription.Format; // must match the depth texture's format
DSVDesc.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2D;
DSVDesc.Texture2D.MipSlice = 0;
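For completeness, a sketch of creating and binding the depth buffer itself, assuming hypothetical g_DepthStencilTexture and g_DepthStencilView plus an existing g_RenderTargetView:

ID3D11Texture2D* g_DepthStencilTexture = nullptr;
ID3D11DepthStencilView* g_DepthStencilView = nullptr;
hr = Device->CreateTexture2D(&DepthStenDescription, NULL, &g_DepthStencilTexture);
hr = Device->CreateDepthStencilView(g_DepthStencilTexture, &DSVDesc, &g_DepthStencilView);
DeviceContext->OMSetRenderTargets(1, &g_RenderTargetView, g_DepthStencilView);

Note that MipLevels = 0 above requests a full mip chain; 1 is the more typical value for a depth buffer.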

And finally, my entity class draw() method:

void Entity::Draw(){
    UINT stride = sizeof(vtest);
    UINT Offset = 0;

    ObjectSpace = XMMatrixIdentity();
    m_Scale = Scale();
    m_Rotation = Rotate();
    m_Translate = Translate();

    ObjectSpace = m_Scale*m_Rotation*m_Translate;
    mWVP = ObjectSpace*direct3D.mView*direct3D.mProjection;

    LocalWorld.mWorldVP = XMMatrixTranspose(mWVP);

    DeviceContext->UpdateSubresource(direct3D.MatrixBuffer, 0, NULL, &LocalWorld, 0, 0);
    DeviceContext->VSSetConstantBuffers(0, 1, &direct3D.MatrixBuffer);

    DeviceContext->IASetVertexBuffers(0, 1, &g_vBuffer, &stride, &Offset);
    DeviceContext->IASetIndexBuffer(g_iBuffer, DXGI_FORMAT_R32_UINT, 0);
    DeviceContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

    DeviceContext->DrawIndexed(e_Asset.sz_Index, 0, 0);
}
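A quick sanity check worth adding here (hypothetical; not in the original code) is to confirm that a valid index count actually reaches DrawIndexed:

assert(e_Asset.sz_Index > 0 && e_Asset.sz_Index % 3 == 0); // a triangle list needs a multiple of 3 indices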

The code compiles, and the backbuffer presents correctly, but no model appears.

Initialization of the DirectX functions seems to be fine too...

Update: Following Banex's suggestion, the Visual Studio DirectX debugging tools indicate that I may have gone wrong in my .hlsl file.

I think I may also have gone wrong at shader initialization, although my shader is really simple and works as a vertex/pixel pass-through file.


Answer 1:


After examining the .hlsl file and doing further debugging, I found that setting output.position = position; (rather than multiplying the position by the world matrix) made the model draw on screen. This implies a bad matrix calculation, causing an extreme warp, or null values stored in the constant buffer.

cbuffer ConstantBuffer:register(b0)
{
    float4x4 WVP;
}

struct VOut
{
    float4 position : SV_POSITION;
    float4 color : COLOR;
};

VOut VShader(float4 position : POSITION, float4 color : COLOR)
{
    VOut output;
    output.position = position; // mul(position, WVP);
    output.color = color;

    return output;
}


float4 PShader(float4 position : SV_POSITION, float4 color : COLOR) : SV_TARGET
{
    return color;
}
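If the constant buffer holds null or garbage values, its creation on the C++ side is worth double-checking. A minimal sketch, assuming MatrixBuffer holds a single XMMATRIX:

D3D11_BUFFER_DESC cbDesc;
ZeroMemory(&cbDesc, sizeof(cbDesc));
cbDesc.Usage = D3D11_USAGE_DEFAULT;
cbDesc.ByteWidth = sizeof(XMMATRIX); // constant buffer sizes must be multiples of 16 bytes
cbDesc.BindFlags = D3D11_BIND_CONSTANT_BUFFER;
hr = Device->CreateBuffer(&cbDesc, NULL, &direct3D.MatrixBuffer);

Once the buffer contents are verified, output.position = mul(position, WVP); can be restored. The XMMatrixTranspose in Draw() is needed because HLSL constant buffers default to column-major matrix packing while XMMATRIX is row-major.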


Source: https://stackoverflow.com/questions/26565752/directx-11-model-not-rendering-from-model-class
