[SDL] NVIDIA & SDL_GL_SwapBuffers
wyatbar at iit.edu
Wed Jul 24 17:54:00 PDT 2002
The ldd output does not list libGL (wish I had thought to look at that)
for the apps which fail to work.
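That check is easy to script; a minimal sketch (the binary name ./mygame is a placeholder for your own SDL app):

```shell
#!/bin/sh
# Sketch: check whether a dynamically linked binary pulls in libGL.
# ./mygame is a hypothetical path -- substitute the app that fails.
BIN="${1:-./mygame}"
if ldd "$BIN" | grep -q 'libGL\.so'; then
    echo "libGL is linked"
else
    echo "libGL is NOT linked"
fi
```

On a working setup the failing apps would show a line such as "libGL.so.1 => /usr/lib/libGL.so.1"; its absence matches the symptom described here.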
Furthermore, from what I can tell, normal precompiled OpenGL apps/games
work just fine, as does any app I compile that uses GLUT to handle the
windowing/surface creation, even if it uses SDL for other things. (I
did some deeper research after you tipped me to the build environment.)
From: sdl-admin at libsdl.org [mailto:sdl-admin at libsdl.org] On Behalf Of
Sent: Wednesday, July 24, 2002 1:51 PM
To: sdl at libsdl.org
Subject: Re: [SDL] NVIDIA & SDL_GL_SwapBuffers
On Tue, Jul 23, 2002 at 02:35:39PM -0500, wyatbar at iit.edu wrote:
> If I use the binary compiled before I installed the NVIDIA GLX &
> binary distributions (not sources), the program works fine and is
> or seg faults depending on the program. If I use a binary compiled
> NVIDIA I get garbage in the window.
> I've found in my apps that it seems there is a problem with
> SDL_GL_SwapBuffers(), in that I can't even write a program that simply
> clears the buffer then swaps (it compiles but I get the same garbage
> in the window).
> I have tried putting glFlush() right before the call as per a list
> posting a year ago, but it doesn't help.
> If it helps I get the same results with a GeForce3 ti500 on the same
> Has anyone else experienced this?
> More importantly do you know of a work around?
I've experienced this on Debian, but it was Debian's fault, not NVIDIA's.
Do other OpenGL binaries work, such as Quake 3 or precompiled versions of
games such as armagetron? It very well could be a problem with your
build environment. Additionally, does the ldd output of your binary
contain a line like libGL.so.1.something? If not, I know exactly what
the problem is. More information is required to accurately diagnose
your problem.
Joseph Carter <knghtbrd at bluecherry.net> <-- That boy needs
<markm> c++: the power, elegance and simplicity of a hand grenade