[SDL] strange context switch when changing Window-Size

Benjamin Grauer bensch at orxonox.ethz.ch
Thu Nov 17 02:32:11 PST 2005

Hi there

we are coding a 3D action shooter for Linux/Windows/OSX, and since we
ported to Windows via SDL we see a strange effect when changing the
screen size on Windows:

On loading everything works well; most models are stored in display
lists (at the moment) and all textures are stored in GL_TEXTURE_2D
texture objects...

But when the resolution gets changed, SDL sends a window-resize event,
which we catch and handle with this code:

int GraphicsEngine::setResolution(int width, int height, int bpp)
{
  this->resolutionX = width;
  this->resolutionY = height;
  this->bitsPerPixel = bpp;

  if (this->screen != NULL)
  {
    if ((this->screen = SDL_SetVideoMode(this->resolutionX,
                                         this->resolutionY,
                                         this->bitsPerPixel,
                                         this->videoFlags | this->fullscreenFlag)) == NULL)
    {
      PRINTF(1)("Could not SDL_SetVideoMode(%d, %d, %d, %d): %s\n",
                this->resolutionX, this->resolutionY, this->bitsPerPixel,
                this->videoFlags, SDL_GetError());
      //    SDL_Quit();
      //    return -1;
    }
    glViewport(0, 0, width, height);  // reset the current viewport
  }
  return 0;
}

Then all textures disappear from the rendered objects, and the display
lists won't render anymore, but the objects rendered through immediate-mode
glVertex calls are still drawn correctly.

I suspect that somehow an OpenGL context switch occurs, or GL gets
reinitialized behind our backs.
Here is a precompiled binary and source.

Any help is appreciated.
