[SDL] [DEMO] Minimal world engine in SDL+OpenGL
phillips at arcor.de
Wed Jan 22 14:15:01 PST 2003
On Wednesday 22 January 2003 22:13, Stacey Keast wrote:
> On Wed, 2003-01-22 at 13:28, Stacey Keast wrote:
> > On Wed, 2003-01-22 at 12:20, Daniel Phillips wrote:
> > > Nice shooting. OK, your card doesn't support a 32 bit depth buffer, or
> > > the driver doesn't. The (pathetic excuse for) specs on NVidia's site
> > > say that your gforce4 supports a 32 bit depth buffer, so we've got a
> > > software problem.
> > This is what I do for my nvidia card.. I can do 32 bit stuff with it..
> > /etc/XF86Config
> > ---------------
> Hmm, nm, I can only get up to 24 with your demo. But I just tested out
> lesson17 from the OpenGL-intro package and changed everything to 32 bits
> there and it worked without complaints. (at fullscreen and not)
All of the depths you mentioned are color depths, not frame buffer
depths. I don't know what madness the various layers are using to determine
which z-buffer depths are available, but I can say it makes very little sense.
There, now that I have that off my chest, it seems that SDL+OpenGL is only
going to start up on a wide range of hardware if you don't specify any
GL_SetAttributes at all. Just comment them all out and I guess it will
start, but you will likely have a sucky 16 bit zbuffer. Please try
uncommenting just the SDL_GL_DEPTH_SIZE line to get a 32 bit zbuffer and see
if it starts.
With Mesa software rendering on one of my machines here, SDL+OpenGL refuses
to create a surface with a 32 bit z-buffer. That makes no sense whatsoever -
somebody is dropping the ball, either XFree, Mesa, or SDL. Besides searching
for the guilty party in the driver chain, I suppose I also need to implement
a fallback strategy in my demo, where I keep trying wimpier configurations
until one of them returns a non-zero surface. Then in the .create method of
the one demo world that cares about the z-buffer depth (shadow), I can fail
out with a message and have that demo not run, or run with an empty world,
rather than run with artifacts. This strategy will hopefully draw some
attention to the configuration disconnects, rather than working around them
and pretending everything's fine.
I'll go take a look at the SDL part of the init path now, to see if anything
obvious jumps out at me.