video-card

Off screen rendering when laptop shuts screen down?

人盡茶涼 submitted on 2019-12-06 19:43:40
Question: I have a lengthy number-crunching process which takes advantage of quite a bit of OpenGL off-screen rendering. It all works well, but when I leave it to work on its own while I go make a sandwich, I usually find that it has crashed while I was away. I was able to determine that the crash occurs very close to the moment the laptop I'm using decides to turn off the screen to conserve energy. The crash itself is well inside the NVIDIA DLLs, so there is no hope of knowing what's going on. The obvious…

On iOS, how do CALayer bitmaps (CGImage objects) get displayed onto Graphics Card?

这一生的挚爱 submitted on 2019-12-05 11:52:09
On iOS, I was able to create 3 CGImage objects and use a CADisplayLink at 60fps to do self.view.layer.contents = (__bridge id) imageArray[counter++ % 3]; inside the ViewController, and each time an image is set to the view's CALayer contents, which is a bitmap. This all by itself can alter what the screen shows: the screen simply loops through these 3 images at 60fps. There is no UIView drawRect, no CALayer display or drawInContext, and no CALayer delegate drawLayerInContext. All it does is change the CALayer's contents. I also tried adding a smaller-size sublayer to self…

Off screen rendering when laptop shuts screen down?

一曲冷凌霜 submitted on 2019-12-05 01:14:32
I have a lengthy number-crunching process which takes advantage of quite a bit of OpenGL off-screen rendering. It all works well, but when I leave it to work on its own while I go make a sandwich, I usually find that it has crashed while I was away. I was able to determine that the crash occurs very close to the moment the laptop I'm using decides to turn off the screen to conserve energy. The crash itself is well inside the NVIDIA DLLs, so there is no hope of knowing what's going on. The obvious solution is to turn off the power management feature that turns the screen and video card off, but I'm…
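On Windows, one common way to keep the display (and the power-management path that appears to trigger the crash) from shutting off during a long computation is SetThreadExecutionState. A minimal sketch, assuming a Win32 build; it does not explain the crash inside the NVIDIA DLLs, it only avoids the trigger:

#include <windows.h>

// Tell Windows not to turn off the display while the long GPU job runs.
// ES_CONTINUOUS keeps the request in force until it is cleared again.
void beginLongGpuJob()
{
    SetThreadExecutionState(ES_CONTINUOUS | ES_SYSTEM_REQUIRED | ES_DISPLAY_REQUIRED);
}

// Restore normal power management once the job has finished.
void endLongGpuJob()
{
    SetThreadExecutionState(ES_CONTINUOUS);
}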

How to run build using graphics drivers by using optirun (Bumblebee) from IDE (Netbeans, Eclipse)?

夙愿已清 submitted on 2019-12-01 12:50:24
Does anyone know how to make Eclipse or NetBeans use the graphics card in Optimus laptops by invoking optirun (Bumblebee) inside the IDE, so that one can just use the Run button in the IDE to run the program on the graphics card from within the IDE? In its simplest form, I just want the IDE to do the equivalent of optirun ./javaproject. The way I did this in Eclipse was to first start the Java debugger (jdwp) listening on a port, then start the JVM with optirun java ... and use jdwp to connect to this port. Both tasks can be started at the same time in Eclipse by creating a Launch Group in the debug…
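For reference, a hedged sketch of the launch command this describes, with the port number and jar name as illustrative placeholders: the IDE's remote debug configuration listens on port 8000, and the application is started under optirun with a jdwp agent that connects back to it.

optirun java -agentlib:jdwp=transport=dt_socket,server=n,suspend=y,address=localhost:8000 -jar javaproject.jar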

How to run build using graphics drivers by using optirun (Bumblebee) from IDE (Netbeans, Eclipse)?

南楼画角 submitted on 2019-12-01 11:44:03
Question: Does anyone know how to make Eclipse or NetBeans use the graphics card in Optimus laptops by invoking optirun (Bumblebee) inside the IDE, so that one can just use the Run button in the IDE to run the program on the graphics card from within the IDE? In its simplest form, I just want the IDE to do the equivalent of optirun ./javaproject. Answer 1: The way I did this in Eclipse was to first start the Java debugger (jdwp) listening on a port, then start the JVM with optirun java ... and use jdwp to connect to this…

How to change 3rd monitor programmatically

痴心易碎 submitted on 2019-12-01 07:17:22
When I'm using my laptop, I use 3 displays: the laptop display, a second monitor (connected through VGA), and a TV (connected through HDMI). My video card doesn't support 3 monitors, so I'm constantly switching between 2 and 3: when I'm working on the computer I use the 2nd monitor, and when I want to watch movies I use the 3rd. I currently have to go to Screen Resolution, select the monitor that is not in use, and choose Extend desktop to this display. Is there a way I can automate this? Is there any command-line tool or Windows API that allows doing it? Edit: Display Changer seems to do what I…
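On Windows 7 and later, the topology part of this can be scripted against the Win32 display configuration API. A minimal sketch, assuming the goal is the same effect as "Extend desktop to this display"; choosing which specific monitor joins the desktop needs the longer QueryDisplayConfig/SetDisplayConfig path, which this sketch does not cover:

#include <windows.h>

int main()
{
    // Apply the "Extend" topology, the programmatic equivalent of
    // Win+P -> Extend / "Extend desktop to this display".
    LONG rc = SetDisplayConfig(0, nullptr, 0, nullptr,
                               SDC_APPLY | SDC_TOPOLOGY_EXTEND);
    return rc == ERROR_SUCCESS ? 0 : 1;
}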

How to read GPU (graphic card) temperature?

混江龙づ霸主 submitted on 2019-11-30 05:16:05
I am interested in a way to read the GPU temperature (graphics processing unit, the main chip of the graphics card) by using some video card driver API. Everyone knows that there are two different chip manufacturers (popular ones, at least) - ATI and nVIDIA - so there are two different kinds of drivers to read the temperature from. I'm interested in learning how to do it for each card driver. The language in question is irrelevant - it could be C/C++, the .NET platform, or Java, but let's say that .NET is preferred. Has anyone done this before? For nVidia you would use nvcpl.dll. Here's the documentation:…
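For the nVIDIA side, a hedged alternative to the nvcpl.dll route mentioned in the answer is NVIDIA's NVAPI SDK, which exposes the thermal sensors directly. A minimal C++ sketch, assuming nvapi.h and the NVAPI static library from that SDK are available (struct and enum names are taken from it):

#include <cstdio>
#include "nvapi.h"  // from the NVIDIA NVAPI SDK

int main()
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return 1;

    NvPhysicalGpuHandle gpus[NVAPI_MAX_PHYSICAL_GPUS];
    NvU32 gpuCount = 0;
    if (NvAPI_EnumPhysicalGPUs(gpus, &gpuCount) != NVAPI_OK || gpuCount == 0)
        return 1;

    // Query all thermal sensors of the first physical GPU.
    NV_GPU_THERMAL_SETTINGS thermal = {};
    thermal.version = NV_GPU_THERMAL_SETTINGS_VER;
    if (NvAPI_GPU_GetThermalSettings(gpus[0], NVAPI_THERMAL_TARGET_ALL, &thermal) == NVAPI_OK)
    {
        for (NvU32 i = 0; i < thermal.count; ++i)
            printf("Sensor %u: %d C\n", i, (int)thermal.sensor[i].currentTemp);
    }
    return 0;
}

From .NET, the usual route is to P/Invoke the same library or use a managed wrapper; the ATI side has its own SDK (ADL) and is not covered by this sketch.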

Ctrl Alt F8 disconnects displays?

假装没事ソ submitted on 2019-11-30 04:39:17
I'm learning debugging in PhpStorm and keep accidentally hitting Ctrl + Alt + F8. I use 3 displays, and this disconnects the two external monitors and goes back to just the laptop. I cannot see this documented anywhere (running Windows 10). The worst part is that hitting the combination again does NOT reconnect the displays. Does anyone know either: a combination to reverse the effect (i.e. reconnect the displays), how I can stop it, or a tool that will help me find out where it is firing from (motherboard / Windows / Intel / nVidia / USB monitor driver - goodness knows where)? Grateful for a helping hand. F8 in various combinations are…

How to read GPU (graphic card) temperature?

穿精又带淫゛_ submitted on 2019-11-29 02:47:52
Question: I am interested in a way to read the GPU temperature (graphics processing unit, the main chip of the graphics card) by using some video card driver API. Everyone knows that there are two different chip manufacturers (popular ones, at least) - ATI and nVIDIA - so there are two different kinds of drivers to read the temperature from. I'm interested in learning how to do it for each card driver. The language in question is irrelevant - it could be C/C++, the .NET platform, or Java, but let's say that .NET is…

Get results of GPU calculations back to the CPU program in OpenGL

这一生的挚爱 submitted on 2019-11-27 11:41:53
Question: Is there a way to get results from a shader running on a GPU back to the program running on the CPU? I want to generate a polygon mesh from simple voxel data, based on a computationally costly algorithm on the GPU, but I need the result on the CPU for physics calculations. Answer 1: Define "the results"? In general, if you're doing GPGPU-style computations with OpenGL, you are going to need to structure your shaders around the needs of a rendering system. Rendering systems are designed to be one-way:…
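A minimal readback sketch in C++, assuming an OpenGL 4.3+ context is already current and a compute or fragment shader has written its results into a shader storage buffer (the buffer handle and element count are placeholders):

#include <GL/glew.h>   // any OpenGL function loader works here
#include <vector>

// Read 'count' floats that a shader has written into an SSBO back to the CPU.
std::vector<float> readBackResults(GLuint ssbo, size_t count)
{
    // Make shader writes to the buffer visible before reading.
    glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);

    std::vector<float> results(count);
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, ssbo);
    // This call blocks until the GPU has finished producing the data.
    glGetBufferSubData(GL_SHADER_STORAGE_BUFFER, 0,
                       (GLsizeiptr)(count * sizeof(float)), results.data());
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, 0);
    return results;
}

For purely render-based pipelines, the equivalent one-way path back to the CPU is rendering into an off-screen framebuffer and reading it with glReadPixels.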