Question
I am currently working on a very simple game using a pure C approach with SDL (and its official extra libraries such as SDL_image) and OpenGL. Right now, though, I've hit a stumbling block and have no idea what is causing it: the colors are all off when drawn. I am currently running the program on a Mac; if I remember correctly, when I run it on Windows the colors are closer to correct, but there are still some odd things happening (such as a pure white polygon drawing as yellow).
On my Mac, all images loaded from PNG files are drawn with their colors a bit off, and a pure white polygon is drawn as dark green. A few images are also drawn pure white. If I remember correctly, on Windows the images are drawn correctly but the white polygon is yellow, as mentioned earlier. Below is the pertinent code for initialization, loading, and so on.
int main( int argc, char *argv[] ) {
    // Initialize various OpenGL attributes with SDL utilities
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 16 );  // needed for 3D
    SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );     // only needed on systems other than Mac OS X
    SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
    SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
    SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );

    /* Set 640x480 video mode */
    screen = SDL_SetVideoMode( screen_width, screen_height, 8, videoflags );
    if( screen == NULL ) {
        fprintf( stderr, "Couldn't set %dx%dx%d video mode: %s\n",
                 screen_width, screen_height,
                 video_info->vfmt->BitsPerPixel, SDL_GetError() );
        exit( 2 );
    }

    glShadeModel( GL_SMOOTH );
    glClearColor( 0.3f, 0.3f, 0.3f, 0 );
    glViewport( 0, 0, (GLsizei)screen->w, (GLsizei)screen->h );
    glMatrixMode( GL_PROJECTION );
    glLoadIdentity();
    gluOrtho2D( 0.0f, screen->w, 0.0f, screen->h );

    /*glEnable( GL_BLEND );
    glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );*/
    glEnable( GL_ALPHA_TEST );
    glAlphaFunc( GL_GREATER, 0.1f );

    // Some basic initialization stuff goes here
    // Assume infinite while loop goes here, except when esc is pressed

    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
    glMatrixMode( GL_MODELVIEW );
    glLoadIdentity();

    glEnable( GL_TEXTURE_2D );
    draw_gui();
    // Things get drawn here
    glDisable( GL_TEXTURE_2D );

    SDL_GL_SwapBuffers();
    SDL_Delay( 10 );

    // End of while loop, clean up, etc.
}
Now I'll show the code that actually loads an image into memory:
Sprite *gen_sprite( char *file ) {
    SDL_Surface *buffer = IMG_Load( file );
    if( buffer == NULL ) {
        fprintf( stderr, "Could not load image '%s'\n for reason: %s\n",
                 file, IMG_GetError() );
        exit( 3 );
    }
    return gen_sprite_from( buffer );
}
Sprite *gen_sprite_from( SDL_Surface *buffer ) {
    Sprite *sprite;
    GLuint texture;

    if( buffer == NULL ) {
        fprintf( stderr, "NULL surface passed to gen_sprite_from." );
        exit( 3 );
    }
    texture = gen_Gl_texture_from( buffer );

    if( ( sprite = malloc( sizeof( Sprite ) ) ) == NULL ) {
        fprintf( stderr, "Malloc failed to allocate space for a Sprite.\n" );
        exit( 1 );
    }
    if( ( sprite->tex = malloc( sizeof( GLuint ) ) ) == NULL ) {
        fprintf( stderr, "Malloc failed to allocate space for a GLuint.\n" );
        exit( 1 );
    }
    sprite->tex[ 0 ] = texture;
    sprite->original = buffer;
    sprite->is_animation = 0;
    sprite->cur_frame = 0;
    sprite->cur_time = 0;
    sprite->num_frames = 1;
    sprite->frame_time = NULL;
    return sprite;
}
Uint32 gen_Gl_texture_from( SDL_Surface *buffer ) {
    GLuint texture;
    SDL_Surface *temp;

    glPixelStorei( GL_UNPACK_ALIGNMENT, 4 );
    glGenTextures( 1, &texture );

    // Blit the loaded surface onto a staging surface with a known 32-bit RGBA layout
    temp = SDL_CreateRGBSurface( SDL_SWSURFACE, buffer->w, buffer->h, 32,
                                 0x000000FF, 0x0000FF00, 0x00FF0000, 0xFF000000 );
    SDL_SetAlpha( buffer, 0, SDL_ALPHA_OPAQUE );  // copy the alpha channel as-is instead of blending
    SDL_BlitSurface( buffer, NULL, temp, NULL );

    glBindTexture( GL_TEXTURE_2D, texture );
    //glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST );
    //glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE );
    gluBuild2DMipmaps( GL_TEXTURE_2D, 4,
                       temp->w, temp->h,
                       GL_RGBA, GL_UNSIGNED_BYTE,
                       temp->pixels );
    SDL_FreeSurface( temp );

    // This just creates white blocks instead of actually loading textures
    //glPixelStorei( GL_UNPACK_ALIGNMENT, buffer->format->BytesPerPixel );
    //glGenTextures( 1, &texture );
    //glBindTexture( GL_TEXTURE_2D, texture );
    //glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
    //glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    //glTexImage2D( GL_TEXTURE_2D, 0, mask_order, buffer->w, buffer->h, 0,
    //              mask_order, GL_UNSIGNED_BYTE, buffer->pixels );
    return texture;
}
At this point, I believe I've posted all the code pertinent to why the colors would be skewed. The drawing code is very simple: a translation or rotation, binding a texture, and then a plain begin/end block with texcoords and vertices (a rough sketch of such a call follows). If anyone can tell me why the colors are off, and suggest a good way to make sure colors are always correct in a cross-platform way (I plan to build on all platforms, which is part of the reason I'm using SDL), I would really appreciate it.
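For illustration only, this is a hypothetical reconstruction rather than the exact project code; draw_sprite and its x/y/w/h parameters are made-up names, but the Sprite fields match the struct usage above:
void draw_sprite( Sprite *sprite, float x, float y, float w, float h ) {
    glPushMatrix();
    glTranslatef( x, y, 0.0f );  // position the quad within the ortho projection
    glBindTexture( GL_TEXTURE_2D, sprite->tex[ sprite->cur_frame ] );
    glBegin( GL_QUADS );
        glTexCoord2f( 0.0f, 0.0f ); glVertex2f( 0.0f, 0.0f );
        glTexCoord2f( 1.0f, 0.0f ); glVertex2f( w, 0.0f );
        glTexCoord2f( 1.0f, 1.0f ); glVertex2f( w, h );
        glTexCoord2f( 0.0f, 1.0f ); glVertex2f( 0.0f, h );
    glEnd();
    glPopMatrix();
}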
Answer 1:
Your channels are getting mixed up.
The problem is evident in the following lines:
temp = SDL_CreateRGBSurface( SDL_SWSURFACE, buffer->w, buffer->h, 32,
                             0x000000FF, 0x0000FF00, 0x00FF0000, 0xFF000000 );
and:
gluBuild2DMipmaps( GL_TEXTURE_2D, 4,
                   temp->w, temp->h,
                   GL_RGBA, GL_UNSIGNED_BYTE,
                   temp->pixels );
You are creating a 32-bit surface in SDL whose channel masks are defined on native 32-bit integers, and then telling OpenGL to read the pixels as 4 separate 8-bit components in memory byte order. Whether those two layouts agree depends on your architecture's endianness, which is why you get different results on different machines. The solution is to specify the OpenGL pixel type not as GL_UNSIGNED_BYTE but as a packed 32-bit type, GL_UNSIGNED_INT_8_8_8_8, as follows:
gluBuild2DMipmaps( GL_TEXTURE_2D, 4,
                   temp->w, temp->h,
                   GL_RGBA, GL_UNSIGNED_INT_8_8_8_8,
                   temp->pixels );
Or maybe GL_UNSIGNED_INT_8_8_8_8_REV is the correct one; I'm too lazy to work out which of the two matches your SDL surface specification. Alternatively, you can change the SDL surface specification to match. A similar format (GL_BGRA / GL_UNSIGNED_INT_8_8_8_8_REV, I think, though I might be wrong) exactly matches a commonly supported internal format and will be slightly faster to upload to your graphics card, although you won't notice unless you upload a lot of textures; a sketch of that variant follows.
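As a rough sketch of that GL_BGRA variant, extrapolated from the answer rather than taken from tested code, and assuming the staging surface's masks are changed so each 32-bit pixel holds A<<24 | R<<16 | G<<8 | B (which is exactly what GL_BGRA with GL_UNSIGNED_INT_8_8_8_8_REV describes):
// Hypothetical rewrite of the staging-surface setup in gen_Gl_texture_from
temp = SDL_CreateRGBSurface( SDL_SWSURFACE, buffer->w, buffer->h, 32,
                             0x00FF0000,    /* Rmask */
                             0x0000FF00,    /* Gmask */
                             0x000000FF,    /* Bmask */
                             0xFF000000 );  /* Amask */
SDL_SetAlpha( buffer, 0, SDL_ALPHA_OPAQUE );
SDL_BlitSurface( buffer, NULL, temp, NULL );
glBindTexture( GL_TEXTURE_2D, texture );
// With GL_GENERATE_MIPMAP already set to GL_TRUE as in the original code,
// the driver builds the mipmap chain for this upload as well.
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA8, temp->w, temp->h, 0,
              GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, temp->pixels );
Because both the SDL masks and the packed GL type are defined on native integer values, this pairing matches on big- and little-endian machines alike.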
As a stylistic note, it is generally preferred to specify the internal format more exactly, like this:
gluBuild2DMipmaps( GL_TEXTURE_2D, GL_RGBA8, ... );
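Applied to the fixed call above, that would look like this (same arguments, only the internal format made explicit; note that the packed pixel type requires GLU 1.3 or later):
gluBuild2DMipmaps( GL_TEXTURE_2D, GL_RGBA8,
                   temp->w, temp->h,
                   GL_RGBA, GL_UNSIGNED_INT_8_8_8_8,
                   temp->pixels );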
Source: https://stackoverflow.com/questions/6288267/colors-are-off-in-sdl-program