optimus

How to start a debug version of a project in Nsight with the optirun command?

ε祈祈猫儿з submitted on 2019-12-18 09:28:30
Question: I've been writing a simple CUDA program (I'm a student, so I need to practice). I can compile it with nvcc from the terminal (using Kubuntu 12.04 LTS) and then execute it with optirun ./a.out (the hardware is a GeForce GT 525M on a Dell Inspiron), and everything works fine. The major problem is that I can't do anything from Nsight. When I try to start the debug version of the code, the message is "Launch failed! Binaries not found!". I think it has to do with running the command through optirun, but I'm not sure.
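
A quick way to separate an Nsight configuration problem from an optirun problem is a tiny host-side check that only asks the CUDA runtime whether a device is visible. The sketch below is not from the question (the file name and build line are just assumptions); run the resulting binary both with and without optirun. On an Optimus laptop it will typically only find the GeForce when launched through optirun, which is also what Nsight's launcher would need to do.

    // check_device.cu (hypothetical file name) -- build with: nvcc check_device.cu
    // Minimal sketch: ask the CUDA runtime whether any GPU is visible.
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess || count == 0) {
            std::printf("No CUDA device visible: %s\n", cudaGetErrorString(err));
            return 1;  // typical result when launched without optirun
        }
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, 0);
        std::printf("Found %d device(s), device 0: %s\n", count, prop.name);
        return 0;
    }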

“The launch timed out and was terminated” error with Bumblebee on Linux

落爺英雄遲暮 submitted on 2019-12-08 11:48:18
Question: When running a long kernel (especially in debug mode with some memory checking) on a CUDA-enabled GeForce GPU with Bumblebee, I get the following error: CUDA error 6: the launch timed out and was terminated. This seems to be caused by the NVIDIA driver's watchdog timer. A solution is available here. However, why does this happen while using Bumblebee and optirun to run a simple CUDA kernel (i.e. I do not use my NVIDIA GPU for display)? The command I used to launch the program is: optirun [cuda…
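
Whether the watchdog applies to a given device can be checked from the program itself: the CUDA runtime exposes a kernelExecTimeoutEnabled flag in the device properties. The sketch below is only an illustration, not the asker's code; under Bumblebee the NVIDIA GPU may still report the timeout as enabled even though it is not driving the display.

    // Minimal sketch: query whether the driver watchdog applies to device 0.
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        cudaDeviceProp prop;
        if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
            std::printf("Could not query device 0\n");
            return 1;
        }
        // 1 means long-running kernels can be killed by the watchdog timer.
        std::printf("%s: kernelExecTimeoutEnabled = %d\n",
                    prop.name, prop.kernelExecTimeoutEnabled);
        return 0;
    }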

NVIDIA Optimus card not switching under OpenGL

爱⌒轻易说出口 submitted on 2019-12-05 18:45:34
When I use glGetString(GL_VERSION) and glGetString(GL_SHADING_LANGUAGE_VERSION) to check the OpenGL version on my computer, I get the following information: 3.1.0 - Build 8.15.10.2538 for GL_VERSION and 1.40 - Intel Build 8.15.10.2538 for GL_SHADING_LANGUAGE_VERSION. When I run "Geeks3D GPU Caps Viewer", it shows the OpenGL version of my graphics card (NVS 4200M) as GL_VERSION: 4.3.0 and GLSL version: 4.30 NVIDIA via Cg compiler. Does that mean my graphics card only supports some OpenGL 4.3.0 functions, and I cannot create a 4.3 context? Your graphics card is an NVIDIA Optimus card. This means…
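
For reference, a query like the one described looks roughly like the sketch below. It assumes a current OpenGL context has already been created (for example with GLUT or GLFW); which version strings come back depends entirely on which adapter created that context, which is the crux of the Optimus behaviour.

    // Minimal sketch: print the version strings for whichever context is current.
    // GL_SHADING_LANGUAGE_VERSION may need <GL/glext.h> or a loader such as GLEW
    // on some platforms.
    #include <cstdio>
    #include <GL/gl.h>

    void print_gl_versions() {
        const GLubyte* version = glGetString(GL_VERSION);
        const GLubyte* glsl = glGetString(GL_SHADING_LANGUAGE_VERSION);
        std::printf("GL_VERSION: %s\n", version ? (const char*)version : "(no context)");
        std::printf("GL_SHADING_LANGUAGE_VERSION: %s\n",
                    glsl ? (const char*)glsl : "(no context)");
    }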

Forcing hardware accelerated rendering

时光毁灭记忆、已成空白 submitted on 2019-12-05 00:11:30
I have an OpenGL library written in C++ that is used from a C# application through C++/CLI adapters. My problem is that if the application is used on laptops with Nvidia Optimus technology, the application will not use hardware acceleration and fails. I have tried to use the info found in Nvidia's document http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf about linking libs to my C++ DLL and exporting NvOptimusEnablement from my OpenGL library, but that fails. I guess I have to do something with the .exe, not with the .dlls linked to the .exe.
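
For context, the export described in the Optimus rendering-policies PDF looks roughly like the sketch below. The key detail, and a plausible reason exporting it from the OpenGL DLL does not help, is that the driver looks for the symbol in the process executable itself, so it has to end up in the .exe; for a C# host that usually means a small native launcher rather than the managed assembly. This is a hedged sketch, not a confirmed fix for the asker's setup.

    // Minimal sketch: the NvOptimusEnablement hint from NVIDIA's document.
    // It must be exported from the .exe, not from a DLL the .exe loads.
    extern "C" {
        __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    }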

Resigning system.img on a device

∥☆過路亽.° submitted on 2019-12-02 23:02:10
I am working on an automatic app-updating solution for devices (LG p509 - Optimus 1) which we deploy to our customers. We have control of these devices and currently install a custom kernel on them (but not a full custom ROM). Since we are trying to do auto-updating of our app on the device, we need the system to be signed by a key which we control, so we can sign our apps with the same key (to get the INSTALL_PACKAGES permission). I have been having a few issues running AOSP builds on the device (using the LG-released source for the device), and am trying to take a step back and evaluate our…

Enable/disable Optimus/Enduro in a cross-platform manner

北城余情 submitted on 2019-12-02 03:44:42
In order to save power, it is common in recent graphics architectures to dynamically switch between a discrete high-performance GPU and an integrated lower-performance GPU, where the high-performance GPU is only enabled when the need for extra performance is present. This technology is branded as NVIDIA Optimus and AMD Enduro for the two main GPU vendors. However, due to the non-standardized way in which these technologies work, managing them from a developer's perspective can be a nightmare. For example, in this PDF from NVIDIA on the subject, they explain the many intricacies, limitations and…
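
On Windows the per-vendor opt-in exports are roughly as in the sketch below (hedged: the symbol names are the documented ones, but the behaviour depends on driver versions). There is no equivalent single switch on Linux, where the selection is usually made outside the program, e.g. with optirun/primusrun or the DRI_PRIME environment variable.

    // Minimal sketch: "prefer the high-performance GPU" exports on Windows.
    #if defined(_WIN32)
    extern "C" {
        // NVIDIA Optimus
        __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
        // AMD Enduro / PowerXpress
        __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
    }
    #endif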

Forcing NVIDIA GPU programmatically in Optimus laptops

↘锁芯ラ submitted on 2019-11-27 11:49:34
I'm programming a DirectX game, and when I run it on an Optimus laptop the Intel GPU is used, resulting in horrible performance. If I force the NVIDIA GPU using the context menu, or by renaming my executable to bf3.exe or some other famous game executable name, performance is as expected. Obviously neither is an acceptable solution for when I have to redistribute my game, so is there a way to programmatically force the laptop to use the NVIDIA GPU? I've already tried using DirectX to enumerate adapters (IDirect3D9::GetAdapterCount, IDirect3D9::GetAdapterIdentifier) and it doesn't work: only 1…
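
The enumeration the question mentions looks roughly like the sketch below; on an Optimus machine it typically reports only the Intel adapter, because the discrete GPU is hidden behind the integrated one, so this illustrates the problem rather than solving it. The commonly cited programmatic answer is the NvOptimusEnablement export shown in the sketches above.

    // Minimal sketch: enumerate Direct3D 9 adapters and print their names.
    #include <cstdio>
    #include <d3d9.h>
    #pragma comment(lib, "d3d9.lib")

    void list_d3d9_adapters() {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return;
        UINT count = d3d->GetAdapterCount();
        for (UINT i = 0; i < count; ++i) {
            D3DADAPTER_IDENTIFIER9 id = {};
            if (SUCCEEDED(d3d->GetAdapterIdentifier(i, 0, &id)))
                std::printf("Adapter %u: %s\n", i, id.Description);
        }
        d3d->Release();
    }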
