Does CG 3.0 leak?

Submitted by 谁说我不能喝 on 2019-12-10 15:45:02

Question


I'm finding that Cg appears to have a memory leak. I submitted a report via nvidia.com, but you can try it yourself: if you remove the line that says

cgD3D11SetTextureParameter( g.theTexture, g.sharedTex ) ;

the leak stops.

Does Cg 3.0 really leak?

I'm using an ATI Radeon 5850 GPU on Windows 7 64-bit.


Answer 1:


Yes, it leaks. Internally it creates a ShaderResourceView on every call and never releases it. I think the API is ill-designed; they should have taken a ShaderResourceView* as a parameter to this function instead of just a Resource*.
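To make that design point concrete, here is roughly what a caller-owned version would look like. CreateShaderResourceView() is the real D3D11 API; the point is that the application, not the Cg runtime, would create and release the view (a sketch only, since Cg exposes no function that accepts an ID3D11ShaderResourceView*):

// The application creates the view once with the D3D11 API...
ID3D11ShaderResourceView* srv = NULL;
g.d3d->CreateShaderResourceView( g.sharedTex, NULL, &srv );

// ...hands it to the shader, and releases it itself at shutdown:
SAFE_RELEASE( srv );

// With that design nothing has to be allocated inside the runtime on
// every call, so there would be nothing for the set-texture call to leak.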

I posted about this on the NVIDIA forums about six months ago and never got a response.

Is your report posted publicly, or is it some kind of private support ticket?




Answer 2:


Yes, Cg 3.0 leaks every time you call cgD3D11SetTextureParameter(), causing your application's memory usage to climb. Unfortunately this makes Cg 3.0 with D3D11 completely unusable. One symptom is that, after your application has been running for a while, it will stop rendering and the screen will simply go black. I wasted a lot of time trying to find the cause before discovering the Cg bug.
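If you want to see the leak directly, the D3D11 debug layer will list the orphaned views. A minimal sketch, assuming your ID3D11Device (g.d3d in the question's code) was created with the D3D11_CREATE_DEVICE_DEBUG flag:

#include <d3d11sdklayers.h>

void reportLiveObjects()
{
    ID3D11Debug* dbg = NULL;
    if ( SUCCEEDED( g.d3d->QueryInterface( __uuidof(ID3D11Debug), (void**)&dbg ) ) )
    {
        // Prints every live D3D object to the debug output; after running
        // the app for a while you'll see one ID3D11ShaderResourceView per
        // cgD3D11SetTextureParameter() call.
        dbg->ReportLiveDeviceObjects( D3D11_RLDO_DETAIL );
        dbg->Release();
    }
}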

If anybody is wondering why this isn't apparent in the Cg D3D11 demos, it's because the few that actually use textures are so simple that they can get away with calling cgD3D11SetTextureParameter() only once, at startup.
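Until it's fixed, one mitigation in that same spirit (a sketch, reusing the question's g.theTexture / g.sharedTex names; lastBound is a hypothetical cache) is to call cgD3D11SetTextureParameter() only when the texture actually changes, so a view is leaked once per distinct texture rather than once per frame:

static ID3D11Resource* lastBound = NULL;  // hypothetical: last texture handed to Cg

void setTextureOnce()
{
    if ( g.sharedTex != lastBound )
    {
        cgD3D11SetTextureParameter( g.theTexture, g.sharedTex );  // still leaks one view here
        lastBound = g.sharedTex;
    }
}

If your textures genuinely change every frame, this only slows the growth; it doesn't remove the leak.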

This same bug remains in Cg Toolkit 3.1 (April 2012).




Answer 3:


jmp [UPDATE] ;; skip obsolete text segment

Could it be that Cg is being destroyed after D3D, so it doesn't release its reference in time? Or vice versa, with the function acquiring the texture but not releasing it before D3D shuts down? When you set a texture on a shader, the texture is held until the shader's resources are released somehow. You destroy the D3D context here:

SAFE_RELEASE( g.d3d );
SAFE_RELEASE( g.gpu );

Later on, you free the shaders in CleanupCg():

cgDestroyProgram( g.v_vncShader );
checkForCgError( "destroying vertex program" );
cgDestroyProgram( g.px_vncShader );
checkForCgError( "destroying fragment program" );

Try changing the order of the calls so that you first release all resources from both Cg and D3D. This:

cgD3D11SetDevice( g.cgContext, NULL );

should also be called before releasing the D3D context, just in case.

UPDATE:

This should be different inside WinMain():

initD3D11() ;  // << FIRST you init D3D
initCg() ;     // << SECOND you init CG with the D3D pointers

initD2D1() ;
initVBs() ;

// Main message loop    
while( WM_QUIT != msg.message ){ /* loop code */ }

CleanupDevice();   //// << FIRST you release all of D3D, while Cg is still referencing it (why?).
CleanupCg();       //// << SECOND, if anything in the Cg runtime depends on the d3d context you just destroyed, it will crash, leak, or do whatever it wants.

So you should swap them to ensure Cg frees any D3D pointers:

CleanupCg();       //// << FIRST release Cg to ensure it's no longer referencing D3D.
CleanupDevice();   //// << SECOND D3D is now neither referencing nor referenced by Cg, so just release it all.
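In other words, a CleanupCg() along these lines (a sketch, reusing the question's names) would make sure Cg has dropped every D3D reference before CleanupDevice() runs:

void CleanupCg()
{
    cgD3D11SetDevice( g.cgContext, NULL );  // detach Cg from the D3D11 device first
    cgDestroyProgram( g.v_vncShader );
    checkForCgError( "destroying vertex program" );
    cgDestroyProgram( g.px_vncShader );
    checkForCgError( "destroying fragment program" );
    cgDestroyContext( g.cgContext );        // destroy the Cg context last
}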

You could also provide the debugger output and the other information I asked for below. You're basically saying "Cg seems to be broken, here is the whole code, look at line ###, is it broken?", but there are more than a thousand lines (1012) of C, C++, and shader code in your file. You provide almost no information yet readily point to a Cg bug (based on... what?), and if you're so sure the code is fine, why would anyone look at it?

The code isn't fine, by the way. Not that I don't like it, but it has little things such as the call ordering: silly mistakes that can make debugging a real hell, and a clear bug. And if I found a bug just by looking at WinMain(), there is still a long way up to the render call and the Cg implementation, isn't there? I can't run the app on WinXP, but these errors are in the most predictable places :)

So... once your code is clean of any bugs... oh, look what I've just found:

~VertexBuffer()
{
  SAFE_RELEASE( vb );
  SAFE_RELEASE( layout );
}

It turns out that in the VertexBuffer constructor you call iD3D->GetImmediateContext( &gpu ) and store the pointer in a private member. GetImmediateContext() AddRef()s the context it returns, so shouldn't you add:

SAFE_RELEASE( gpu ); // ? There are 3 VertexBuffer instances, so that's another memory leak.
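Putting the two fragments together, the corrected destructor would be:

~VertexBuffer()
{
  SAFE_RELEASE( vb );
  SAFE_RELEASE( layout );
  SAFE_RELEASE( gpu );  // balances the AddRef done by iD3D->GetImmediateContext()
}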

OK, so there are some things in your code that cause memory leaks and that you should fix, and I found them with just a quick look, so you didn't really try. On the other hand, your code is clear and full of explanations, and I need to learn some DX11, so I should actually thank you for it. The downvote was somewhat rude, though :P, especially because I'm probably right, and other people will avoid reading your code as soon as the page displays.



Source: https://stackoverflow.com/questions/7459040/does-cg-3-0-leak
