First post, by Jo22
- Rank
- l33t++
Hi everyone,
I was going to reply to an interesting topic, but it vanished while I was writing/typing.
So I'm creating one myself here, since I think it's an interesting topic. Hope that's okay.
The question is: why do early VGA cards, IBM's original and the clones, have a lumpy 256 KB that's useless for anything except what we call "Standard VGA"?
You know, that pseudo NTSC TV resolution with merely a nibble of colour. 😉
Personally, I think that the 256 KB configuration of the early cards wasn't ideal. 640x480 @60 Hz, 31.5 KHz, in 256 colours was the real deal, what VGA was meant to be.
The best evidence is IBM PGA/PGC, Professional Graphics Controller.
It had those specifications and provided the foundation for VGA's analogue specification.
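Some quick back-of-envelope arithmetic (my own illustration, not from any spec) shows why 256 KB rules that mode out: at one byte per pixel, 640x480 in 256c simply doesn't fit.

```python
# Why 256 KB can't hold 640x480 at 256 colours (one byte per pixel),
# while the 16-colour variant fits with room to spare.
def framebuffer_bytes(width, height, bits_per_pixel):
    """Size of a packed framebuffer in bytes."""
    return width * height * bits_per_pixel // 8

vga_ram = 256 * 1024                        # 262,144 bytes on early cards

need_256c = framebuffer_bytes(640, 480, 8)  # 307,200 bytes
need_16c = framebuffer_bytes(640, 480, 4)   # 153,600 bytes

print(need_256c, need_256c <= vga_ram)      # 307200 False -> doesn't fit
print(need_16c, need_16c <= vga_ram)        # 153600 True  -> fits easily
```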
I assume it's all because of EGA. EGA's full video memory was 256 KB in capacity. That's why VGA used it as the default, the standard.
However, VGA seems to have provisions for more memory.
There's one register bit indicating "512KB or more", if memory serves.
Most EGA and VGA clones outperformed their IBM originals in terms of resolution/memory.
Super EGA cards existed, for example.
Many of them, if not all, incorporated standard VGA's 640x480 in 16c.
Because both EGA and VGA use the same palette in 16-colour mode, no special initialization is necessary and 256 KB of video RAM is sufficient.
That's why the original Super VGA mode, 800x600 @56Hz in 16c was so widespread.
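The reason 256 KB stretches that far is that the 16-colour modes are planar: four bit planes of 1 bpp each, addressed in parallel. A little sketch of the arithmetic (my own, not from any datasheet):

```python
# Why 800x600 in 16 colours still fits in 256 KB: the 16-colour
# EGA/VGA modes are planar, 4 bit planes at 1 bit per pixel each.
def plane_bytes(width, height):
    return width * height // 8   # 1 bit per pixel per plane

plane = plane_bytes(800, 600)    # 60,000 bytes per plane
total = 4 * plane                # 240,000 bytes across all 4 planes

print(plane <= 64 * 1024)        # True: each plane fits a 64 KB window
print(total <= 256 * 1024)       # True: all 4 planes fit in 256 KB
```

Since each plane also stays under 64 KB, the whole mode can be driven through the usual A000h window without bank switching, which is probably a big part of why it caught on.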
Except for the mode number, which varied among different models (until VBE came along), the initialization was the same.
Well, most of the time. Exceptions surely exist, but most VGA cards simply kept the normal memory layout of Standard VGA/EGA.
Funnily, there's one VBE mode in 256c that can be used by low-end VGAs:
640x400 @70 Hz in 256c - VESA VBE 1.0 mode 100h
Some games even used that, however, it has one disadvantage - it requires banked memory access, afaik.
I.e., the video mode's framebuffer exceeds 64 KB and must use some kind of bank switching (similar to EMS).
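The bank arithmetic is easy to sketch. Assuming a 64 KB window at A000h with 64 KB bank granularity (granularity actually varies per card and is reported by VBE), a pixel's linear address splits into a bank number and an offset:

```python
# Bank arithmetic for 640x400 at 256c (VBE mode 100h), assuming a
# 64 KB window with 64 KB granularity. Granularity is an assumption
# here; real code reads it from the VBE mode info block.
def bank_and_offset(x, y, width=640):
    linear = y * width + x                # linear byte address of the pixel
    return linear >> 16, linear & 0xFFFF  # (bank number, offset in window)

print(bank_and_offset(0, 0))              # (0, 0): first bank
print(bank_and_offset(639, 399))          # (3, 59391): last pixel, bank 3
# 640*400 = 256,000 bytes, so the mode spans 4 banks; a real program
# switches banks via the VBE window-control function whenever the
# bank number changes, which is exactly the overhead programmers hated.
```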
That's the main reason why game programmers were so obsessed with MCGA's mode 13h, 320x200 pels in 256c (320x400 actually, because line doubling is enabled).
If it wasn't for 320x200's ability to fit into a single x86 memory segment, with each pixel a directly addressable byte, PC gaming could have been so much higher end.
Or less popular, if we're pessimistic.
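Mode 13h's appeal boils down to one line of arithmetic (again, just my illustration): the whole screen is 64,000 bytes, which squeaks in under the 65,536-byte segment limit, so a pixel is just segment A000h plus y*320 + x.

```python
# Mode 13h in a nutshell: the whole screen fits in one 64 KB x86
# segment, so no bank switching and no plane masking is ever needed.
def mode13h_offset(x, y):
    return y * 320 + x           # one byte per pixel, chunky layout

screen_bytes = 320 * 200         # 64,000 bytes for the whole screen

print(screen_bytes <= 0x10000)   # True: fits the 65,536-byte segment
print(mode13h_offset(319, 199))  # 63999: the last addressable pixel
```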
Growing up with the shareware scene and Windows 3.1x, I later learned how pixelated and low-res the majority of (commercial) DOS games were by contrast.
While shareware games made by one-man teams were much simpler in design, they used the 16c modes to their fullest.
That's why, ironically, EGA was such a nice thing at the time.
Users/programmers with EGA PCs did value low-colour art much more than most of us did.
Like Japanese users/developers, they focused on using extra resolution rather than colour.
Games like Zeliard look much prettier in 640x350 or 640x200 than in 256c MCGA 320x200 (IMHO).
Standard EGA gives them some sort of grace, of elegance.
These are just my two cents, of course.
Any ideas welcome.
Edit: Typos fixed.
"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel
//My video channel//