[SDL] render performance
kos at climaxgroup.com
Fri Nov 17 07:13:42 PST 2006
Thank you for your response... I think that although the driver reports

  direct rendering: Yes
  OpenGL renderer string: GeForce FX 5500/AGP/SSE2

there's something going on with the version of GL that my system is using.
It seems to be looking in /usr/lib/nvidia/ for all the libGL stuff (which
I'm guessing is where my livna rpm nvidia driver stuff goes) rather than /usr/lib.
If I look for glXGetSwapIntervalMESA in /usr/lib, the symbol is there...but
if I look in /usr/lib/nvidia it's missing.
I think my performance is very bad because nothing is hardware accelerated.
Any ideas on how I can tell whether things use hardware acceleration or not?
From: sdl-bounces+kos=climaxgroup.com at libsdl.org
[mailto:sdl-bounces+kos=climaxgroup.com at libsdl.org] On Behalf Of Matthias
Sent: 16 November 2006 16:29
To: A list for developers using the SDL library. (includes SDL-announce)
Subject: Re: [SDL] render performance
Tests like the ones you are doing must be done carefully; FPS is always the
result of many, many factors, and some of them are easily overlooked.
Standard OpenGL vertex arrays are not as fast as they might look.
Every frame, the whole bunch of vertex data is transferred to your
graphics card, since vertex arrays are stored in system memory. Here
are some things that might speed up your program.
- If possible, enabling glTexGen can speed up the app, since the
texture coordinate generation can be done by hardware.
- Maybe your frame rate is limited by fill rate. 100k triangles
with every single triangle drawn over half of the screen will surely
be slower than drawing 100k very small triangles all next to each other.
- Eliminate duplicate vertices
- Render GL_TRIANGLE_STRIP primitives instead of GL_TRIANGLES
- Make use of extensions:
- GL_ARB_vertex_buffer_object to enable hardware vertex buffers.
    - GL_EXT_compiled_vertex_array to let the driver know that the
buffer is static
- GL_NV_vertex_array_range and GL_NV_vertex_array_range2 will use
DMA (but only with nVidia cards)
Anyway, 1200 FPS is a lot.
Hope that helps!
Kostas Kostiadis wrote:
> Hello all,
> I'm having some performance issues and I was wondering if any of you
> guys can figure out why.
> (I know it's a long shot, but you may spot something that I've missed).
> I've got a very simple mesh renderer and I'm using SDL with openGL.
> I'm passing a vertex pointer, normal pointer, and texture pointer and
> then calling glDrawElements from my app.
> I'm batching things up, so I'm only making 2 calls to glDrawElements
> per render loop (tbh I don't think that's where the problem is)
> Anyway, rendering just under 100K triangles, this comes to about 3 FPS ;-(
> (This is with no texturing, and lighting enabled (with LIGHT0 only).)
> gprof says that 75% of the running time goes in my Render() call.
> This is on an old-ish PC running fedora6 with a GeForce FX 5500 and
> the latest nVidia drivers, but still, it should be running loooooads
> faster than this. To give you an idea...glxgears comes up with 1200 FPS.
> One thing that is still a mystery to me, is why SDL_GetError returns
> this at start-up:
> Failed loading glXGetSwapIntervalMESA: /usr/lib/nvidia/libGL.so.1:
> undefined symbol: glXGetSwapIntervalMESA
> Does this mean anything to anyone? Is it related to my performance issues?
> Here's how I'm initialising SDL (in case I'm doing something wrong)...
> Thanx in advance for any help,
> SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
> SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
> SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
> SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );
> SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
> SDL_GL_SetAttribute( SDL_GL_BUFFER_SIZE, 24 );
> SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 24 );
> SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE, 8 );
> SDL_GL_SetAttribute( SDL_GL_ACCUM_RED_SIZE, 16 );
> SDL_GL_SetAttribute( SDL_GL_ACCUM_GREEN_SIZE, 16 );
> SDL_GL_SetAttribute( SDL_GL_ACCUM_BLUE_SIZE, 16 );
> SDL_GL_SetAttribute( SDL_GL_ACCUM_ALPHA_SIZE, 16 );
> SDL_GL_SetAttribute( SDL_GL_STEREO, 0 );
> SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, 0 );
> SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, 0 );
> unsigned int flags = SDL_OPENGL;
> if (mRenderSetupData.IsFullScreen)
> flags |= SDL_FULLSCREEN;
> else if (mRenderSetupData.HasNoFrame)
> flags |= SDL_NOFRAME;
> SDL_Surface *screen = SDL_SetVideoMode( 800, 600, 0, flags );
> if (screen == 0)
> ErrorMsg("Video mode set failed: %s\n", SDL_GetError());
> SDL mailing list
> SDL at libsdl.org