[SDL] endianness in SDL_audio.c
mandin.patrice at wanadoo.fr
Thu Jan 27 10:44:59 PST 2005
Le Thu, 27 Jan 2005 10:49:25 -0500
Albert Cahalan <albert at users.sf.net> wrote:
> The root of the problem is that developers use AUDIO_S16
> when they should be using AUDIO_S16SYS. I recently fixed
> Tux Paint, then patched up a wiki somewhere to warn about
> the issue. App developers are pretty much led into using
> AUDIO_S16 instead of the correct AUDIO_S16SYS.
> As far as I can tell, one should never use AUDIO_S16 in
> app code. It should be removed from the headers I think.
> SDL_mixer.h has a MIX_DEFAULT_FORMAT that really should be
> defined as AUDIO_S16SYS.
I agree. Seeing something like this in SDL_audio.h:
/* Audio format flags (defaults to LSB byte order) */
#define AUDIO_U16 AUDIO_U16LSB
#define AUDIO_S16 AUDIO_S16LSB
is not OK when AUDIO_[U|S]16SYS is already correctly defined for the
host's endianness. The comment about the default byte order should also
be removed. Why default to LSB byte order?
SDL_wave.c should also be patched to change AUDIO_S16 to AUDIO_S16LSB
for spec->format. Better: check that the MS_ADPCM_decode() and
IMA_ADPCM_decode() functions really decode to AUDIO_S16LSB, not to the
native format. I wonder whether, on a big-endian machine, they decode
(wrongly!) to AUDIO_S16MSB while spec->format is set to AUDIO_S16LSB.
Linux and Atari programmer
Speciality: development, games