K1n9_Duk3 wrote:VileRancour wrote:But even using 320x200 at 16 colors, VGA does let you customize the palette, while EGA doesn't.
Yes, I know. I've played Duke Nukem II, as you can see from my avatar and username. That game, along with The Blues Brothers and Prehistorik, was memorable for using EGA video mode 0x0D with a custom VGA palette on top. I usually think of EGA as 16 colors and VGA as 256 colors, although that is not entirely correct. But the point is that programming for 16-color video modes requires different code than (mode 0x13) VGA. Until that decision is made, there's no point in starting on the graphics routines.
When you talk about not being able to customize the EGA palette, do you mean an actual EGA card? Because even on an actual EGA card, you can set each of the 16 colors to any of the 64 available colors (6-bit color, two bits per RGB component). On a VGA card, you can still use the 16-color EGA mode, but you can also reprogram the DAC palette to pick any 16 colors out of the 262144 available. In some games this is the only difference between the EGA and VGA modes: both use a 16-color mode, but only the VGA version reprograms the DAC palette.
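To illustrate the 64-color part (the function name and comments are mine, not from any particular game): an EGA palette register holds a 6-bit value that packs two bits per RGB component, with the primary bits in the low three positions and the secondary (intensity) bits above them:

```c
#include <stdint.h>

/* Pack a 2-bit-per-component RGB color (each component 0..3) into the
   6-bit value an EGA palette register expects. Bit layout is 00rgbRGB:
   bits 2..0 hold the high (primary) bit of red, green and blue, and
   bits 5..3 hold the low (secondary) bit of each component. */
uint8_t ega_pal_value(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint8_t)((((r >> 1) & 1) << 2) | (((g >> 1) & 1) << 1) | ((b >> 1) & 1)
                   | ((r & 1) << 5)        | ((g & 1) << 4)        | ((b & 1) << 3));
}
```

For example, red = 2, green = 1, blue = 0 packs to 0x14, which is the default value of palette register 6 (brown).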
And some tricks can be played in 16-color modes. For example, some games split the screen into a top part (playfield) and a bottom part (inventory, status bar, etc.) and reprogram the palette mid-frame, so that the top and bottom parts each get their own 16-color palette, for 32 colors on screen.
K1n9_Duk3 wrote:
My point was that all images (such as sprites, tileset graphics or background images) take twice as much memory when using 256 colors instead of 16 colors. A game will usually have a lot of graphics, so you really should think twice about this.
Yes, as was discussed before, one 64-kilobyte segment for tiles buys you 256 16x16 tiles at 256 colors, or 512 16x16 tiles at 16 colors. And since the image data is halved, it can be copied twice as fast, so with 16-color tiles and a 16-color video mode you can either put twice as much on screen or get away with a CPU half as fast. I agree, this decision has a huge effect.
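The arithmetic above can be sketched as a small helper (the name is mine):

```c
#include <stdint.h>

/* How many 16x16 tiles fit in one 64 KB segment at a given bit depth?
   At 8 bpp a tile is 16*16 = 256 bytes; at 4 bpp it's 128 bytes. */
unsigned tiles_per_segment(unsigned bits_per_pixel)
{
    unsigned bytes_per_tile = 16u * 16u * bits_per_pixel / 8u;
    return 65536u / bytes_per_tile;
}
```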
There's also the possibility of using 16-color tiles in a 256-color video mode, so that each tileset (foreground, background, characters) can have its own 16-color palette. That uses less memory for tiles and looks better, but writing to video memory takes just as much bandwidth as in full 256-color mode.
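One way to do this, sketched here with my own naming: store the tiles packed two 4-bit pixels per byte, and when drawing, expand each byte into two 8-bit pixels while OR-ing in a palette base (a multiple of 16) that selects the tileset's 16-color slice of the 256-color palette:

```c
#include <stdint.h>

/* Expand one packed byte (two 4-bit pixels, high nibble first) into
   two 8-bit pixels, returned with the first pixel in the high byte
   and the second in the low byte. `base` must be a multiple of 16;
   it selects which 16-color slice of the 256-color palette this
   tileset uses. */
uint16_t expand_pair(uint8_t packed, uint8_t base)
{
    uint8_t first  = (uint8_t)(base | (packed >> 4));
    uint8_t second = (uint8_t)(base | (packed & 0x0F));
    return (uint16_t)(((uint16_t)first << 8) | second);
}
```

A drawing loop would call this once per source byte, writing both resulting pixels to the 8-bpp framebuffer.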
K1n9_Duk3 wrote:
But keep in mind that MIDI playback also needs to store additional data in memory, like OPL instruments, multiple track pointers and such. (In my example that's about 900 bytes for MIDI variables versus 25 bytes for IMF variables.) IMF doesn't need any of that, because it is basically a recording of the OPL output of a MIDI/CMF/whatever player. This makes IMF easy to implement and reduces CPU usage to a minimum during playback. You could also use DRO format if you don't like IMF, but I don't know much about that format.
I wrote a very barebones command-line player for IMF files in under 400 bytes 😀 Since some of the file and OPL chip code isn't purely IMF-specific, the IMF playing itself might be somewhere around 300 bytes. Yeah, playing MIDI could be too complex and time-consuming, unless the MIDI file is limited to a single track with a fixed instrument per MIDI channel.
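The core of an IMF player is just a loop over 4-byte records: OPL register, OPL data, then a 16-bit little-endian delay in ticks. As a testable sketch (assuming type-0 IMF data with no length header; the function name is mine), here is the parsing side, summing the song's duration in ticks; a real player would additionally write each register/data pair to the OPL ports and wait out each delay:

```c
#include <stdint.h>
#include <stddef.h>

/* Sum the tick delays in a type-0 IMF stream. Each record is 4 bytes:
   OPL register, OPL data, and a 16-bit little-endian delay in ticks.
   Divide the result by the tick rate (e.g. 700 Hz) to get seconds. */
uint32_t imf_total_ticks(const uint8_t *buf, size_t len)
{
    uint32_t ticks = 0;
    for (size_t i = 0; i + 4 <= len; i += 4)
        ticks += (uint32_t)buf[i + 2] | ((uint32_t)buf[i + 3] << 8);
    return ticks;
}
```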
The current DRO version is similar to IMF; there are only two differences. The most important is that IMF supports only a single OPL2 (Adlib), while DRO also supports OPL3 and dual OPL2 for stereo music; this is done via a register lookup table (indices 0-127 map to registers of the first OPL, and 128-255 to the second). The other difference is timing: DRO uses 1000 Hz ticks, while IMF has several default rates (most often 700 Hz, as used in Wolf3D for example). So in the end it doesn't matter much which of these formats is used. If I must say something negative, they are both about equally bad for storing music that was originally meant to play from a 60 Hz vertical retrace timer interrupt, because neither 1000 nor 700 is a multiple of 60. This is where RDOS RAW files excel.
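The register lookup can be sketched like this (a minimal sketch based on the description above; note that DRO 2.0's header also defines dedicated short/long delay codes, which a full parser has to treat specially before doing this lookup):

```c
#include <stdint.h>

/* Decode a DRO v2 register code: the low 7 bits index the per-file
   codemap of OPL register numbers, and the high bit selects the chip
   (0 = first OPL, 1 = second OPL / OPL3 high bank). The result is
   packed as (chip << 8) | register. */
uint16_t dro_decode(uint8_t code, const uint8_t *codemap)
{
    uint8_t chip = (uint8_t)(code >> 7);
    uint8_t reg  = codemap[code & 0x7F];
    return (uint16_t)(((uint16_t)chip << 8) | reg);
}
```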