vsync

Using SwapBuffers() with multiple OpenGL canvases and vertical sync?

怎甘沉沦 submitted on 2019-12-08 08:01:44
Question: I have a GUI written using wxPython that contains two GLCanvases, a 'display' canvas and a 'preview' canvas, onto which I am drawing some very simple geometry using PyOpenGL. The 'preview' and 'display' canvases display the contents of the same framebuffer: I need both of these canvases to be updated synchronously at a consistent framerate with no tearing. So far I have just been calling self.SetCurrent() # draw stuff... self.SwapBuffers() for both the preview and display canvases within my
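A minimal sketch of the per-canvas pattern the excerpt describes, assuming a shared wx.glcanvas.GLContext; the class and method names other than SetCurrent/SwapBuffers are illustrative, not taken from the question:

```python
import wx
from wx import glcanvas
from OpenGL.GL import glClear, GL_COLOR_BUFFER_BIT

class PreviewCanvas(glcanvas.GLCanvas):
    def __init__(self, parent, shared_context):
        super().__init__(parent)
        self.context = shared_context      # one GLContext shared by both canvases

    def redraw(self):
        self.SetCurrent(self.context)      # bind this canvas to the shared context
        glClear(GL_COLOR_BUFFER_BIT)       # ...draw stuff...
        self.SwapBuffers()                 # each swap may block on vsync independently
```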

Turned off VSync but still getting 60FPS in my DirectX 9 application

心已入冬 submitted on 2019-12-08 05:43:53
Question: I have a DirectX 9 application which only renders a triangle on the screen, but I am getting a frame rate of 60 FPS whether VSync is on or not. Why is this? Here is the code I use to calculate the FPS, but I don't know if this is where the problem lies. GameTimer.h #pragma once #include "Windows.h" class GameTimer { public: GameTimer(); ~GameTimer(){} void Update(); float GetFrameTime(); inline float GetFramePerSec(){return framesPerSec;} inline float GetMillSecPerFrame(){return

Custom vsync Algorithm

十年热恋 submitted on 2019-12-08 03:49:44
Question: When creating a game in any programming language (that can), it is important to have a fixed target frame rate at which the game redraws the screen. However, some languages either lack a sync function or have unreliable timers, so is there any method of keeping the frame rate steady manually, using only math and/or by sleeping the thread? Maybe using a frame delta? So far I have only tried 'sleep(targetframerate - (targetframerate-delta))'. This is supposed to realise that the previous frame
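A minimal sketch of the sleep-based frame limiter the question is reaching for, assuming a measured frame delta and Python's time module; this is a generic pattern, not the asker's code:

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS          # seconds allotted per frame

def run(update_and_draw, keep_running=lambda: True):
    while keep_running():
        frame_start = time.perf_counter()
        update_and_draw()
        delta = time.perf_counter() - frame_start   # how long this frame took
        if delta < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - delta)        # sleep off the remainder of the budget
```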

How to query Vsync phase in Linux

假如想象 submitted on 2019-12-07 14:18:17
Question: I need to create a C++ function that will return the number of seconds until the next Vsync interval as a floating-point value. Why? I am creating programs that display rectangles that follow the mouse cursor. Ostensibly OpenGL provides a vsync mechanism in the glXSwapBuffers function, but I have found this to be unreliable. With some card drivers you get vsync; with others you don't. On some you get vsync but also an extra two frames of latency. But this is not a bug in OpenGL. The
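The question asks for C++ and a real driver query; as a rough illustration only, here is the underlying phase arithmetic in Python, assuming you have already captured a timestamp of one vsync (for example, taken right after a blocking buffer swap) and know the refresh rate:

```python
import time

def seconds_until_next_vsync(last_vsync_timestamp, refresh_hz=60.0):
    period = 1.0 / refresh_hz
    # How far we are into the current refresh interval, then the remainder.
    elapsed_in_interval = (time.perf_counter() - last_vsync_timestamp) % period
    return period - elapsed_in_interval
```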

How to enable VSYNC in D3D windowed app?

淺唱寂寞╮ submitted on 2019-12-06 07:18:18
Question: So, I'm using D3D in a windowed application. I initialized D3D with the following parameters: windowed: true; backbufferformat: D3DFMT_X8R8G8B8; presentinterval: D3DPRESENT_INTERVAL_ONE; swapeffect: DISCARD. Each time OnPaint is called, I render the image to the backbuffer and present it to the front. As far as I know (and so MSDN says), once I set D3DPRESENT_INTERVAL_ONE, vsync should work. But in this case, the image tears when dragging horizontally. (It seems there's a line across the image,

How do you know what you've displayed is completely drawn on screen?

不问归期 submitted on 2019-12-06 03:31:29
Displaying images on a computer monitor involves the use of a graphics API, which dispatches a series of asynchronous calls... and at some given time, puts the wanted content on the computer screen. But what if you are interested in knowing the exact CPU time at the point where the required image is fully drawn (and visible to the user)? I really need to grab a CPU timestamp when everything is displayed, to relate this point in time to other measurements I take. Without taking the asynchronous behavior of the graphics stack into account, many things can cause the duration of the graphics calls to jitter
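One common approximation (an assumption on my part, not this thread's answer): with vsync enabled, perform the buffer swap, then call glFinish() so the CPU blocks until the GPU has processed the swap, and take the timestamp immediately afterwards. How closely this matches actual scanout still depends on the driver. A Python/PyOpenGL sketch:

```python
import time
from OpenGL.GL import glFinish

def timestamp_after_present(swap_buffers):
    swap_buffers()              # the toolkit's SwapBuffers/present call
    glFinish()                  # block until the GPU has completed all queued commands
    return time.perf_counter()  # CPU timestamp taken as close to display time as we can get
```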

How to avoid tearing with pygame on Linux/X11

╄→гoц情女王★ submitted on 2019-12-06 01:39:23
Question: I've been playing with pygame (on Debian/Lenny). It seems to work nicely, except for annoying tearing of blits (in fullscreen or windowed mode). I'm using the default SDL X11 driver. Googling suggests it's a known issue with SDL that X11 provides no vsync facility (even with a display created with the FULLSCREEN|DOUBLEBUF|HWSURFACE flags), and that I should use the "dga" driver instead. However, running SDL_VIDEODRIVER=dga ./mygame.py throws during pygame initialisation with pygame.error: No available
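For what it's worth, and as an assumption well outside the SDL 1.2 setup this question describes, pygame 2 (built on SDL 2) lets you request vsync directly when creating the display:

```python
import pygame

pygame.init()
# vsync=1 is only a request; whether it is honoured depends on the driver.
screen = pygame.display.set_mode((640, 480), pygame.SCALED, vsync=1)
```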

Synchronizing multiple OpenGL windows to vsync

本秂侑毒 submitted on 2019-12-05 20:14:21
Question: In a Windows application, I have multiple OpenGL windows open at the same time. Ideally I would like each of these to draw at 60 fps, synchronized to the screen refresh. For each render context, I'm calling wglSwapIntervalEXT(1) to turn on vsync. Each window has its own display thread, which draws the frame and then calls SwapBuffers to update. It turns out that the windows are 'fighting' each other: it looks like the SwapBuffers calls are synchronized and wait for each other, even though

How to disable vsync on macOS

偶尔善良 submitted on 2019-12-05 12:50:28
Question: With all my SDL/OpenGL programs, the framerate is stuck at 60 fps, so it looks like vsync is enabled, but not by me: it is not in my code or my settings. So I would like to know if there is a way to disable it, maybe in some deep macOS setting? Answer 1: This let me get around ~700 frames per second on my MacBook Pro. Download Graphics Tools for Xcode - Late August 2014, install or just mount Graphics Tools, open Quartz Debug, go to Tools -> Show Beam Sync Tools, and select Disable Beam Synchronization. It

How can I get vsync callback on HTML5 canvas?

筅森魡賤 submitted on 2019-12-05 01:32:09
Question: How can I get a vsync callback on an HTML5 canvas? Answer 1: There's no such thing. The browser should take care of doing the appropriate syncs, and you can help it by using requestAnimationFrame() - see, for example, http://paulirish.com/2011/requestanimationframe-for-smart-animating/ and http://webstuff.nfshost.com/anim-timing/Overview.html#dfn-sample-all-animations Answer 2: * NEW 2017 answer * Although this is an old 2011 answer, and accurate for its time, the old answer of "There is no such thing" is