Forcing hardware-accelerated rendering


Question


I have an OpenGL library written in c++ that is used from a C# application using C++/CLI adapters. My problem is that if the application is used on laptops with Nvidia Optimus technology the application will not use the hardware acceleration and fail.

I have tried to use the info found in Nvidias document http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf about linking libs to my C++-dll and exporting NvOptimusEnablement from my OpenGL-library but that fails. I guess I have to do something with the .exe not with the .dlls linked to the .exe
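
For reference, the export that document describes is just a plain exported DWORD, i.e. something like:

extern "C" {
    // Exporting this symbol from the executable module tells the Optimus
    // driver to prefer the high-performance NVIDIA GPU.
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}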

For us it is not a good option to use profiles since we need to ensure that the nvidia hardware is used.

Is there some way a C# application can force Optimus to use the Nvidia chipset instead of the integrated Intel chipset?


Answer 1:


I tried both options from the swine's answer, but neither worked by itself. I found I needed to actually attempt to call the imported function.

using System.Runtime.InteropServices;

class OptimusEnabler
{
    [DllImport("nvapi.dll")]
    public static extern int NvAPI_Initialize();
};

Then, in my app startup:

try
{
    // Ignore any System.EntryPointNotFoundException
    // or System.DllNotFoundException exceptions here
    OptimusEnabler.NvAPI_Initialize();
}
catch
{ }

On an nVidia Optimus system I get a System.EntryPointNotFoundException, but the call still makes the application use the nVidia hardware. Tested on a system with an ATI card, I got a System.DllNotFoundException instead. Either way, attempting the call and ignoring any exception seems to work fine.




Answer 2:


A working solution. It is essentially what has already been mentioned, but it took me a while to understand how to get it to work...

[System.Runtime.InteropServices.DllImport("nvapi64.dll", EntryPoint = "fake")]
static extern int LoadNvApi64();

[System.Runtime.InteropServices.DllImport("nvapi.dll", EntryPoint = "fake")]
static extern int LoadNvApi32();

private void InitializeDedicatedGraphics()
{
    try
    {
        if (Environment.Is64BitProcess)
            LoadNvApi64();
        else
            LoadNvApi32();
    }
    catch { } // will always fail, since the 'fake' entry point doesn't exist
}

Important: call InitializeDedicatedGraphics() before any window is created.




Answer 3:


If your software fails on Intel, then you won't be able to run it on 50% of the laptops out there, so I'd suggest fixing that instead.

That being said, you can perfectly well create profiles from code; just use NvAPI. The code below does exactly that, but beware: you probably shouldn't mess with the global Base profile, and should create your own instead:

NvAPI_Status status;
// (0) Initialize NVAPI. This must be done first of all
status = NvAPI_Initialize();
if (status != NVAPI_OK) 
    PrintError(status, __LINE__);
// (1) Create the session handle to access driver settings
NvDRSSessionHandle hSession = 0;
status = NvAPI_DRS_CreateSession(&hSession);
if (status != NVAPI_OK) 
    PrintError(status, __LINE__);
// (2) load all the system settings into the session
status = NvAPI_DRS_LoadSettings(hSession);
if (status != NVAPI_OK) 
    PrintError(status, __LINE__);
// (3) Obtain the Base profile. Any setting needs to be inside
// a profile, putting a setting on the Base Profile enforces it
// for all the processes on the system
NvDRSProfileHandle hProfile = 0;
status = NvAPI_DRS_GetBaseProfile(hSession, &hProfile);
if (status != NVAPI_OK) 
    PrintError(status, __LINE__);


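// (4) Fill in the three Optimus shim settings that control which GPU is used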
NVDRS_SETTING drsSetting1 = {0};
drsSetting1.version = NVDRS_SETTING_VER;
drsSetting1.settingId = SHIM_MCCOMPAT_ID;
drsSetting1.settingType = NVDRS_DWORD_TYPE;

NVDRS_SETTING drsSetting2 = {0};
drsSetting2.version = NVDRS_SETTING_VER;
drsSetting2.settingId = SHIM_RENDERING_MODE_ID;
drsSetting2.settingType = NVDRS_DWORD_TYPE;

NVDRS_SETTING drsSetting3 = {0};
drsSetting3.version = NVDRS_SETTING_VER;
drsSetting3.settingId = SHIM_RENDERING_OPTIONS_ID;
drsSetting3.settingType = NVDRS_DWORD_TYPE;

if( ForceIntegrated ){
    drsSetting1.u32CurrentValue = SHIM_MCCOMPAT_INTEGRATED;
    drsSetting2.u32CurrentValue = SHIM_RENDERING_MODE_INTEGRATED;
    drsSetting3.u32CurrentValue = SHIM_RENDERING_OPTIONS_DEFAULT_RENDERING_MODE | SHIM_RENDERING_OPTIONS_IGPU_TRANSCODING;
}else{
    drsSetting1.u32CurrentValue = SHIM_MCCOMPAT_ENABLE;
    drsSetting2.u32CurrentValue = SHIM_RENDERING_MODE_ENABLE;
    drsSetting3.u32CurrentValue = SHIM_RENDERING_OPTIONS_DEFAULT_RENDERING_MODE;
}



status = NvAPI_DRS_SetSetting(hSession, hProfile, &drsSetting1);
if (status != NVAPI_OK) 
    PrintError(status, __LINE__);

status = NvAPI_DRS_SetSetting(hSession, hProfile, &drsSetting2);
if (status != NVAPI_OK) 
    PrintError(status, __LINE__);

status = NvAPI_DRS_SetSetting(hSession, hProfile, &drsSetting3);
if (status != NVAPI_OK) 
    PrintError(status, __LINE__);

// (5) Now we apply (or save) our changes to the system
status = NvAPI_DRS_SaveSettings(hSession);
if (status != NVAPI_OK) 
    PrintError(status, __LINE__);
// (6) We clean up. This is analogous to doing a free()
NvAPI_DRS_DestroySession(hSession);
hSession = 0;

At startup, test whether your profile exists; if not, create it, roughly as sketched below (and you'll probably have to restart your application too). NvAPI is a static lib and will gracefully return an error code on non-NVIDIA hardware, so you can ship with it safely.
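
A rough sketch of that check/create step, using the same DRS API (untested; the profile and executable names are placeholders, and it assumes the NvApiDriverSettings headers plus the hSession and status variables from above):

NvAPI_UnicodeString profileName = {0};
const wchar_t* name = L"MyApplicationProfile";   // placeholder profile name
for (int i = 0; name[i] != 0; ++i)
    profileName[i] = (NvU16)name[i];

NvDRSProfileHandle hMyProfile = 0;
status = NvAPI_DRS_FindProfileByName(hSession, profileName, &hMyProfile);
if (status == NVAPI_PROFILE_NOT_FOUND)
{
    // Create the profile...
    NVDRS_PROFILE profileInfo = {0};
    profileInfo.version = NVDRS_PROFILE_VER;
    memcpy(profileInfo.profileName, profileName, sizeof(profileName));
    status = NvAPI_DRS_CreateProfile(hSession, &profileInfo, &hMyProfile);

    // ...and associate our executable with it.
    NVDRS_APPLICATION app = {0};
    app.version = NVDRS_APPLICATION_VER;
    const wchar_t* exeName = L"MyApplication.exe";   // placeholder executable name
    for (int i = 0; exeName[i] != 0; ++i)
        app.appName[i] = (NvU16)exeName[i];
    status = NvAPI_DRS_CreateApplication(hSession, hMyProfile, &app);
}
// Apply the SHIM_* settings above to hMyProfile instead of the Base profile,
// then call NvAPI_DRS_SaveSettings() as before.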

EDIT: Looks like there's an easier way. From the GLFW 3 source code:

// Applications exporting this symbol with this value will be automatically
// directed to the high-performance GPU on nVidia Optimus systems
//
GLFWAPI DWORD NvOptimusEnablement = 0x00000001;
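
Whichever route you take, once an OpenGL context is current it is easy to verify which GPU was actually picked, using plain OpenGL calls (fragment; assumes the usual GL headers and <cstdio>):

const char* vendor   = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
const char* renderer = reinterpret_cast<const char*>(glGetString(GL_RENDERER));
printf("OpenGL vendor: %s, renderer: %s\n", vendor, renderer);
// On an Optimus laptop this should report the NVIDIA GPU,
// not the integrated Intel one.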



Answer 4:


From the document it seems to be rather simple: you are given multiple options for how to do it. Unfortunately, the .exe needs to do it, not the .dll. According to this tutorial, it might be possible to do something like:

class OptimusEnabler {
    [DllExport("NvOptimusEnablement")]
    public static int NvOptimusEnablement = 1;
};

This then needs to be included in your C++ library interface so that any C# application that uses it is forced to export it. Alternatively, you can try linking against nvapi.dll:

class OptimusEnabler {
    [DllImport("nvapi.dll")]
    public static extern int NvAPI_Initialize();
};

According to the document, this should also be enough for your application to be recognized as NV-enabled. The imported function should not even need to be called.



Source: https://stackoverflow.com/questions/17270429/forcing-hardware-accelerated-rendering
