[SDL] SDL_filesystem

Jiri Dluhos dluhos at humusoft.com
Fri Apr 13 05:08:58 PDT 2001

Some programs, such as Tux Racer, dynamically adjust the resolution or
detail level to make up for differences in frame time. It uses gettimeofday
on Linux and SDL_GetTicks() on Windows (if I recall correctly). I would bet
it appears choppier under Windows for this reason.

I'm working with SDL primarily on the Windows platform. My raycasting engine
uses SDL_GetTicks() to measure the length of each time slice and adjusts how
fast to move or rotate accordingly, and I don't think it's perfect (it seems
a bit choppy). For example, say I measure the time it takes to render a
complete frame by subtracting newtick - oldtick, at 1 ms resolution. If the
engine is running at 36 fps, the true frame time is 1000 / 36, or 27.777...
ms, but the subtraction would only measure 27. Over a full second, that's
36 * 27 or 972 ms, a difference of 28 milliseconds, which is a 2.8% accuracy
loss. At 48 fps, it would only return 20 ms, so 20 * 48 = 960, 1000 - 960 =
40, and 40/1000 = a 4% accuracy loss.

Right now my raycasting engine (software rasterizer) is a little slow and
needs to be rewritten, but trying to compensate for different rendering
speeds might prove challenging with an unpredictable accuracy loss after I
do this. I mean, I could sample the framerate every other frame, but I think
that's still gambling on the same problem as before: to get truly smooth
rendering I'd need to compensate based on what the framerate is right now.
Then again it might not matter once I speed this sucker up.

Anyone run into this issue?

Matt Johnson
