Reply 20 of 110, by Jo22
HanSolo wrote on 2023-10-03, 18:54:
    rasz_pl wrote on 2023-10-03, 14:11:
        286 was EGA territory, nothing requiring VGA and more than 1MB would be playable, with one exception being Windows.
    I think I played many games on my 286 back then that supported VGA, so for me that was always the beginning of the VGA era. But the 286 covered a pretty long time span.
+1
According to my books, VGA-compatible cards hit the market in February 1988.
Considering the age of the 80286 (1982) and AT (1984),
they saw a whole range of standards come and go.
Though after 1987, a new star was born - VGA.
It was sought after equally among all users, including XT users.
That's why 8-bit ISA VGA cards were produced in notable numbers, I assume.
Though video game players were mainly interested in mode 13h (or MCGA, respectively), I suppose.
What's also notable: VGA was designed as an integrated circuit, which eased the production of graphics cards.
The industry adopted VGA and turned it into a de facto open standard, as had happened with Hercules compatibles. In other words, it was "cheap".
EGA, by comparison, was never really meant to be mainstream. It was highly complex and rather professional, and it required a special monitor when it was introduced.
EGA was like a prototype to VGA, maybe.
If we're looking at it from a technical point of view, then EGA and VGA share a lot of similarities.
CGA and Hercules, by contrast, are quite different from them (EGA has only very basic/artificial CGA support).
However, CGA and MCGA (mode 13h) are in turn similar to each other (especially the memory; edit: the framebuffer is directly accessible, for example, is what I mean).
This is very interesting, because mode 13h differs a lot from the other VGA modes.
It's as if MCGA was something alien that was grafted into VGA, like an earlier standard such as CGA.