[SDL] I Need a Game Interpreter

Tony Leotta tonyl at madscifi.com
Sun Sep 1 21:11:00 PDT 2002


Is this off topic?  Are we allowed to plow into deep interpreter issues
on this list?

If so:
1) I wonder how you would limit each object to just a specific time
slice.  It seems like you would have to create a better scheduling
algorithm than most operating systems have.
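One way to dodge the OS scheduler entirely is cooperative scheduling: each object's script is a coroutine that yields after a small unit of work, and the engine round-robins them with whatever per-tick budget it likes. A minimal sketch, assuming a Python-style embedded language (the `patrol` script and its names are made up for illustration):

```python
# Minimal cooperative scheduler sketch: each agent script is a generator
# that yields control after a small unit of work, so the engine -- not
# the OS -- decides how much "time slice" each object gets per tick.

def patrol(name, steps):
    """A toy agent script: does one unit of work per yield."""
    for i in range(steps):
        yield f"{name} step {i}"

def run_tick(agents, budget_per_agent=1):
    """Advance every live agent by up to `budget_per_agent` steps."""
    events = []
    for agent in list(agents):
        for _ in range(budget_per_agent):
            try:
                events.append(next(agent))
            except StopIteration:
                agents.remove(agent)   # script finished; drop it
                break
    return events

agents = [patrol("tank", 2), patrol("jeep", 3)]
log = []
while agents:
    log.extend(run_tick(agents))
```

The `budget_per_agent` knob is exactly the "time slice" question: raise it for important objects, lower it for distant ones.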

I have also been thinking about ways to optimize how time slices would
be granted... or rather, reduced, based on proximity.

So if one unit detects a close enemy unit... it is pointless to have
ALL the units do a LOS check... just let all the units know an enemy
has been detected.

Then each unit can decide if it is going to shoot... no... just tell
'em all to shoot.
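The "one unit spots, everyone is told" idea is just a broadcast: the first unit that passes its LOS check publishes the detection, and the rest react without running the expensive check themselves. A rough sketch (class and method names are mine, not from any engine):

```python
# Sketch of proximity-driven detection sharing: one LOS check succeeds,
# and the result is broadcast so the other units skip the scan entirely.

class Unit:
    def __init__(self, name):
        self.name = name
        self.alerted = False

    def on_enemy_detected(self, enemy):
        self.alerted = True      # react without doing its own LOS scan

def broadcast_detection(units, enemy):
    for u in units:
        u.on_enemy_detected(enemy)

units = [Unit("a"), Unit("b"), Unit("c")]
# Suppose unit "a" did the one LOS check and found an enemy:
broadcast_detection(units, enemy="raider")
```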

2) Just creating tons of threads and having each "object" run full bore
in a thread that calls an interpreter seems like it would not scale.

I mean, how many threads running Ruby or Python could you have?

3) If we take a step back, however... and don't think about REAL-TIME...
(I know, perish the thought.)  But what if we consider a class of games
that does not require real-time graphics: no 3D, not even iso-hex.

No animation.  No nothing.  Think 1982 to 1987.

The issue then becomes one of... AGENT-based AIs built on a C++ game
engine that uses an embedded interpreted language.

So now... garbage collection is not an issue.

And now each object gets 100% of the CPU until it completes...

A turn could last 1 second or 3 minutes.

Think chess...deep long calculations.
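In that turn-based model the "scheduler" collapses to a plain loop: each agent's script runs to completion on its turn, chess-style, however long it takes. A sketch with trivial stand-in agents (names invented for illustration):

```python
# Turn-based sketch: no time slicing at all -- each agent's script runs
# to completion on its turn, however long that takes.

def play_round(agents, state):
    for agent in agents:
        move = agent(state)          # full CPU until the script returns
        state["moves"].append(move)
    return state

# Two toy agent scripts; a real one might deliberate for minutes.
def cautious(state):
    return "hold"

def aggressive(state):
    return "attack"

state = play_round([cautious, aggressive], {"moves": []})
```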


4) Where does SDL come into play?

Well... that is simple... for smooth map scrolling that is super fast.

Have you ever tried to write a game using the Windows GDI?  It is a
joke.  And DirectX... without the SDL wrapper... is UGLY.

OK... so now, with all the parameters laid out... I pose the question
again... but now from the point of view of C++-to-embedded-language
binding...


What is the BEST scripting language that can be integrated into a C++
program that will use SDL?

The scripting language should be powerful, with objects, and should be
byte-code compilable for speed.

The language should support hash tables, dictionaries, and even complex
data structures.

Equally, the scripting language should be able to call functions
written in C++ that provide features such as pathfinding, LOS, LOF,
and global simulation environment data sharing.
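Whatever language wins, that requirement boils down to handing each script a small API object whose methods are implemented in C++ and exported through the language's embedding layer. A sketch of what the script-side surface might look like; `find_path` and `line_of_sight` are hypothetical names, and plain Python stands in for the C++ side:

```python
# Hypothetical script-side view of the engine binding.  In the real
# thing these methods would be C++ functions exported through the
# embedding API; plain Python stands in for them here.

class EngineAPI:
    def find_path(self, start, goal):
        # Stand-in for a C++ pathfinder; the real one would return
        # intermediate waypoints as well.
        return [start, goal]

    def line_of_sight(self, a, b):
        return True   # stand-in for the C++ LOS routine

def agent_script(api):
    """An agent script uses only what the engine API hands it."""
    if api.line_of_sight((0, 0), (3, 4)):
        return api.find_path((0, 0), (3, 4))
    return []

path = agent_script(EngineAPI())
```

Because the script only ever sees the `api` object, the engine controls exactly which calls exist and how fast they are.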

But above all... the scripts shall be viewed as agents that have no
more and no less access to game state data than a human player.  Think
BOT.  So you could code up an AI that would win or lose, but it could
not cheat, because it only has as much data as it is allowed to view...

The data sharing between the scripting language and the C++ game engine
MUST be fast... but it also MUST have some checks.

Example: the AI tries to access data on terrain that is hidden in the
fog of war... this is not allowed.
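One way to enforce that check is to never hand scripts the raw map at all: they go through a per-player view object that refuses to reveal hidden tiles, so the AI sees exactly what a human player would. A sketch, with invented names (`PlayerView`, `FogError`), in Python standing in for the engine side:

```python
# Fog-of-war check sketch: scripts never touch the raw map.  They go
# through a view object that refuses to reveal hidden tiles.

class FogError(Exception):
    pass

class PlayerView:
    def __init__(self, terrain, visible):
        self._terrain = terrain      # full map, engine-side only
        self._visible = visible      # set of tiles this player can see

    def terrain_at(self, tile):
        if tile not in self._visible:
            raise FogError(f"tile {tile} is hidden by fog of war")
        return self._terrain[tile]

terrain = {(0, 0): "grass", (5, 5): "forest"}
view = PlayerView(terrain, visible={(0, 0)})

seen = view.terrain_at((0, 0))       # allowed
try:
    view.terrain_at((5, 5))          # hidden: the engine says no
    cheated = True
except FogError:
    cheated = False
```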


Thanks,
-Tony

-----Original Message-----
From: sdl-admin at libsdl.org [mailto:sdl-admin at libsdl.org] On Behalf Of
Rainer Deyke
Sent: Sunday, September 01, 2002 11:39 PM
To: sdl at libsdl.org
Subject: Re: [SDL] I Need a Game Interpreter

Matthew Bloch wrote:
> From experience: don't *ever* use an automatic, garbage-collected
> runtime for a game.  I've spent many painful hours with Ruby trying
> to stop it from messing up my frame rate and only partially
> succeeded, much as I love the language for other jobs.  Once you've
> built your foundation on such a runtime there's very little you can
> do about it when you notice your game jerks every few frames.  Mark
> and sweep garbage collection involves forgetting about memory that
> you've allocated and every so often traversing an enormous tree of
> pointers to reclaim some memory.  If you don't do it, or delay it
> much, your program will run out of memory.  Do it as often as the
> language wants, and you'll never get a consistent frame rate.
> There's also nothing you can do to interrupt it-- once the recovery
> algorithm starts it must run to completion or start from scratch.

Do it once per frame, and your frame rate will be consistent (although
possibly too slow to be useful).
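In Python terms, that per-frame idea is a sketch like the following: disable the automatic collector and pay the collection cost at one fixed point in every frame, so the pause is at least predictable (the `run_frames` loop is illustrative, not from any real engine):

```python
import gc

# Pay the cyclic-GC cost at a fixed point in each frame instead of
# letting the collector fire whenever it likes mid-frame.

gc.disable()                       # no automatic cyclic collection

def run_frames(n):
    pauses = []
    for _ in range(n):
        # ... update(); render() ...
        collected = gc.collect()   # one deterministic pause per frame
        pauses.append(collected)
    return pauses

pauses = run_frames(3)
gc.enable()
```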

> Python uses reference counting to automatically free memory, and so
> the work of reclaiming memory is spread evenly over execution, rather
> than concentrated into long lumps as with GC.

Note that Python still uses garbage collection to deal with cyclic
references.


--
Rainer Deyke | root at rainerdeyke.com | http://rainerdeyke.com


_______________________________________________
SDL mailing list
SDL at libsdl.org
http://www.libsdl.org/mailman/listinfo/sdl