SDL_GL_SwapBuffers() is intermittently slow

Submitted by Anonymous (unverified) on 2019-12-03 02:25:01

Question:

I have an SDL/OpenGL game I am working on for fun. I get a decent FPS on average, but movement is really choppy because SDL_GL_SwapBuffers() will randomly take a crazy long time to return. With textures loaded and written to the buffer, it will sometimes take over 100 ms! I cut out a lot of my code to try to figure out whether it was something I did wrong, but I haven't had much luck. Even when I run the bare-bones program below, it will still block for up to 70 ms at times.

Main:

// Don't forget to link to opengl32, glu32, SDL_image.lib

// includes
#include <stdio.h>

// SDL
#include <cstdlib>
#include <SDL/SDL.h>

// Video
#include "videoengine.h"

int main(int argc, char *argv[])
{
    // begin SDL
    if ( SDL_Init(SDL_INIT_VIDEO) != 0 )
    {
        printf("Unable to initialize SDL: %s\n", SDL_GetError());
    }

    // begin video class
    VideoEngine videoEngine;

    // BEGIN MAIN LOOP
    bool done = false;
    while (!done)
    {
        int loopStart = SDL_GetTicks();

        printf("STARTING SWAP BUFFER : %d\n", SDL_GetTicks() - loopStart);
        SDL_GL_SwapBuffers();

        int total = SDL_GetTicks() - loopStart;
        if (total > 6)
            printf("END LOOP  : %d ------------------------------------------------------------>\n", total);
        else
            printf("END LOOP  : %d\n", total);
    }
    // END MAIN LOOP

    return 0;
}
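(A side note on the loop above: it never pumps the SDL event queue, and on some platforms an application that does not service window events can itself stall or behave erratically. A minimal sketch of draining the queue once per iteration, placed inside the while loop:)

// Sketch: drain pending window-system events each frame; this also gives
// the loop a clean exit path via the window's close button.
SDL_Event event;
while ( SDL_PollEvent(&event) )
{
    if ( event.type == SDL_QUIT )
        done = true;
}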

My "VideoEngine" constructor:

VideoEngine::VideoEngine()
{
    UNIT = 16;
    SCREEN_X = 320;
    SCREEN_Y = 240;
    SCALE = 1;

    // Begin Initialization

    SDL_Surface *screen;

    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );  // [!] SDL_GL_SetAttribute must be called BEFORE SDL_SetVideoMode

    screen = SDL_SetVideoMode( SCALE*SCREEN_X, SCALE*SCREEN_Y, 16, SDL_OPENGL );  // set screen to an OpenGL window
    if ( !screen )  // make sure the window was created
    {
        printf("Unable to set video mode: %s\n", SDL_GetError());
    }

    // set opengl state
    opengl_init();

    // End Initialization
}

void VideoEngine::opengl_init()
{
    // Set the OpenGL state after creating the context with SDL_SetVideoMode

    //glClearColor( 0, 0, 0, 0 );                           // sets the screen clear color to black
    //glClearDepth(1.0f);                                   // value the depth buffer is reset to when cleared
    glViewport( 0, 0, SCALE*SCREEN_X, SCALE*SCREEN_Y );     // viewport at the default resolution (SCREEN_X x SCREEN_Y) times SCALE (x, y, w, h)
    glMatrixMode( GL_PROJECTION );                          // apply subsequent matrix operations to the projection matrix stack
    glLoadIdentity();                                       // replace the current matrix with the identity matrix
    glOrtho( 0, SCALE*SCREEN_X, SCALE*SCREEN_Y, 0, -1, 1 ); // parallel (orthographic) projection
    glMatrixMode( GL_MODELVIEW );                           // apply subsequent matrix operations to the modelview matrix stack
    glEnable(GL_TEXTURE_2D);                                // needed to display textures
    glLoadIdentity();                                       // replace the current matrix with the identity matrix
    glEnable(GL_BLEND);                                     // enable blending for transparency
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);      // standard alpha blending
    //glDisable( GL_LIGHTING );                             // disable lighting
    //glDisable( GL_DITHER );                               // disable dithering
    //glDisable( GL_DEPTH_TEST );                           // disable depth testing

    // Check for error
    GLenum error = glGetError();
    if ( error != GL_NO_ERROR )
    {
        printf( "Error initializing OpenGL! %s\n", gluErrorString( error ) );
    }

    return;
}
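(As a further sanity check, it can be worth verifying what the driver actually granted, since SDL may silently give you a different framebuffer than the one requested. A minimal sketch using SDL 1.2's SDL_GL_GetAttribute(), which is only meaningful after the video mode has been set:)

// Sketch: query the attribute actually granted (valid only after
// SDL_SetVideoMode has succeeded). SDL_GL_GetAttribute returns 0 on success.
int doubleBuffered = 0;
if ( SDL_GL_GetAttribute(SDL_GL_DOUBLEBUFFER, &doubleBuffered) == 0 )
    printf("Double buffering granted: %d\n", doubleBuffered);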

I'm starting to think I might have a hardware issue, but I've never had this problem with any other game.

Answer 1:

SDL does not use the SwapIntervalEXT extension, so you can't make sure that buffer swaps are as fast as possible (VSYNC disabled). Also, a buffer swap is not a simple operation: OpenGL may need to copy the contents of the back buffer to the front buffer in case you want to glReadPixels() it. This behavior can be controlled using WGL_ARB_pixel_format, requesting WGL_SWAP_EXCHANGE_ARB (you can read about all of this in the specs; I'm not sure whether there is an alternative to that for Linux).
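(One caveat to the above: SDL 1.2.10 and newer do expose a swap-interval attribute, so depending on your SDL version you may be able to at least request that VSYNC be disabled. A sketch, assuming SDL >= 1.2.10, set before the window is created:)

// Sketch, assuming SDL >= 1.2.10: request no VSYNC wait before creating
// the window. The driver or its control panel may still override this.
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, 0 );  // 0 = swap immediately, 1 = sync to vertical retrace
screen = SDL_SetVideoMode( SCALE*SCREEN_X, SCALE*SCREEN_Y, 16, SDL_OPENGL );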

And then on top of all that, there is the windowing system. That can actually cause a lot of trouble. Also, if some errors are generated ...

This behavior is probably ok if you're running on a small mobile GPU.

SDL_GL_SwapBuffers() itself only contains a call to glXSwapBuffers() / wglSwapBuffers(), so virtually no time is spent in SDL.
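(Because the wrapper is that thin, the time measured "inside" the swap is mostly queued rendering work plus any VSYNC wait. A minimal sketch to tell the two apart, using glFinish() to drain the GL command queue before timing the swap; diagnostic use only, since glFinish() hurts throughput:)

// Sketch: glFinish() blocks until every queued GL command has completed,
// so whatever time the swap still takes afterwards is the swap/VSYNC
// itself rather than accumulated rendering work.
glFinish();
int beforeSwap = SDL_GetTicks();
SDL_GL_SwapBuffers();
printf("swap alone: %d ms\n", SDL_GetTicks() - beforeSwap);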


