VOGONS


Value in IBM AT vs a 386 for retro gaming?


Reply 80 of 84, by Eep386

Rank: Member

At least EGA's text modes and clarity were generally a darn sight improved over CGA's.

Life isn't long enough to re-enable every hidden option in every BIOS on every board... 🙁

Reply 81 of 84, by Deunan

Rank: Oldbie
Anonymous Coward wrote on 2021-05-11, 04:59:

EGA was pretty much a disaster too. Except for a few odd games, almost nothing makes use of the colour palette because you couldn't use it in the low resolution modes... and Windows doesn't use it in high res mode.
Also, the EGA display is capable of displaying all 64 colours on screen at the same time, but despite having up to 256 KB of display memory the EGA board offered no such mode.

In the context of the CGA->EGA->VGA progression, yes, the EGA was not all that great. But my point was - if EGA had been released 3 years earlier instead of CGA, the story would be different. Keep in mind the EGA could re-use CGA monitors, so I'm sure some compatibility decisions affected the design. It's almost funny how many more programs there were for CGA than EGA - or at least I remember some PCB routing software that was done in CGA. Later I also had a, uh, "trial" copy of OrCAD but that was already VGA (if not SVGA) and I can't remember anymore if there were EGA drivers included or not.

My guess is IBM people completely did not understand the concept of (personal, after all) computers being used for leisure by adults - games mostly. So they never looked at video output from that angle (which also meant poor overall performance). I mean sure, the text modes of the "enhanced CGAs" (unless in MDA emulation, but that's a different monitor) were still crap, but the 16 colors and 640x350 resolution really make all the difference. And it's still CGA, just more VRAM and one more bitplane available. It could have, and should have, been done by IBM from the start.
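As an aside, for anyone who wants to poke at this themselves, here is a minimal sketch of what that 16-from-64 palette looks like from software. It assumes a 16-bit DOS compiler in the Turbo C mould (dos.h, int86(), conio.h); the helper names are mine, not from any particular source:

/* Set the EGA 640x350 16-colour mode and remap one of the 16
   palette registers to any of the 64 EGA colours via the BIOS. */
#include <dos.h>
#include <conio.h>

void set_video_mode(unsigned char mode)
{
    union REGS r;
    r.h.ah = 0x00;              /* INT 10h, AH=00h: set video mode    */
    r.h.al = mode;
    int86(0x10, &r, &r);
}

void set_ega_palette_reg(unsigned char reg, unsigned char color)
{
    union REGS r;
    r.x.ax = 0x1000;            /* INT 10h, AX=1000h: set palette reg */
    r.h.bl = reg;               /* palette register 0..15             */
    r.h.bh = color;             /* EGA colour 0..63 (rgbRGB bits)     */
    int86(0x10, &r, &r);
}

int main(void)
{
    set_video_mode(0x10);            /* 640x350, 16 colours on screen */
    set_ega_palette_reg(1, 0x3F);    /* point entry 1 at bright white */
    getch();                         /* wait for a key...             */
    set_video_mode(0x03);            /* ...then back to text mode     */
    return 0;
}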

Frankly IBM dropped the ball again with MCGA, which as the name suggests was another CGA revival but done right - except of course way too late, and it went against the superior VGA. Eh, I could point out what I believe to be VGA shortcomings too, but at least clones popped up and then SVGA, and then we got VESA extensions to clean up the compatibility mess and all was good. Sorry for the OT by the way.

Reply 82 of 84, by douglar

Rank: Oldbie
Deunan wrote on 2021-05-11, 09:59:

In the context of the CGA->EGA->VGA progression, yes, the EGA was not all that great. But my point was - if EGA had been released 3 years earlier instead of CGA, the story would be different. Keep in mind the EGA could re-use CGA monitors, so I'm sure some compatibility decisions affected the design. It's almost funny how many more programs there were for CGA than EGA - or at least I remember some PCB routing software that was done in CGA. Later I also had a, uh, "trial" copy of OrCAD but that was already VGA (if not SVGA) and I can't remember anymore if there were EGA drivers included or not.

I think that's fair. Monochrome character mode was a legit improvement over home computers of the day that used TV screens, but CGA was not. It was up against commodity units like the NES and C64 and compared poorly.

Consider Wizard and the Princess: https://en.wikipedia.org/wiki/Wizard_and_the_Princess
The Commodore 64 version uses more on-screen colors, with solid fills and no dithering, and the items in the game are more detailed and higher resolution.
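For what it's worth, the dithering follows directly from how little CGA's 320x200 mode lets you choose: four colours on screen, the three foreground entries fixed by one of two palettes, and only the background free. A minimal sketch of the relevant BIOS calls, again assuming a Turbo C style 16-bit DOS compiler (dos.h, int86()):

/* Select CGA 320x200 mode 4 and one of its two fixed palettes.
   Only the background entry is freely choosable, which is why CGA
   art leaned so heavily on dithering. */
#include <dos.h>
#include <conio.h>

int main(void)
{
    union REGS r;

    r.x.ax = 0x0004;        /* INT 10h, AH=00h: mode 4, 320x200, 4 colours  */
    int86(0x10, &r, &r);

    r.h.ah = 0x0B;          /* INT 10h, AH=0Bh, BH=01h: pick a palette      */
    r.h.bh = 0x01;
    r.h.bl = 0x01;          /* 1 = cyan/magenta/white (0 = green/red/brown) */
    int86(0x10, &r, &r);

    r.h.ah = 0x0B;          /* AH=0Bh, BH=00h: set background/border colour */
    r.h.bh = 0x00;
    r.h.bl = 0x01;          /* blue background                              */
    int86(0x10, &r, &r);

    getch();                /* look at it, then restore text mode           */
    r.x.ax = 0x0003;
    int86(0x10, &r, &r);
    return 0;
}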

While I'm sure there are other opinions out there, my opinion is that PC games were more "entertaining curiosities" in the CGA era, kind of in a "look at that bear dancing in a tutu" sort of way. A dancing bear is an inferior dancer, no doubt about it, but the exciting part is that it's a bear trying to dance, and that keeps you watching. Once EGA came along, there was some separation in capabilities that made the PC interesting for gaming, especially for graphical adventure games that could really shine when paired with a hard drive, large amounts of RAM (640K!!!) and high-res graphics.

Reply 83 of 84, by BitWrangler

Rank: l33t++

I had a 5170, I don't miss it. 6 MHz snail. I agree with the people who say a Turbo XT can feel faster; my basis of comparison was an Amstrad 1640 which I had at the same time. Games I tried were a slideshow. Anything that ran fast enough not to be frustrating also ran just as fast on a 5160. It's the ideal second machine if you have a 5160 though: you can do stuff on one while you're waiting for stuff to load or complete on the other.

As for games, I just noticed a couple on the shelf, SimEarth and Frederik Pohl's Gateway, that have 10 MHz recommended on the specs. I think that was the only real step up between "runs on bog stock PC/XT" and "386 required". Test Drive II and Wolf3D will say they run on lower 286es, but until you get them on about 16 MHz they are not a great experience.

In theory the 286 was faster than the 8086 or 8088, but that was only when all its features were working. You know what you call a fully realised 286 that has all enhancements performing as designed? A 386. There was that really gnarly multiply instruction on the 8086 and 8088 that took something like a hundred-plus cycles, which got fixed in the 186, 188, V20, V30 etc. that many of the XT clone machines have, so in removing that particular way of the CPU shooting itself in the foot, they gained almost as much as the 286 did on generic 8086 code.

However, a lot of it was about implementation, and with well-designed motherboards and BIOSes you could get clone 286es that were much more optimised than IBM's. I too have noted that on faster 286es, standard mode in Windows can feel real snappy. I had my first web browsing experience on a 286/16 running Win 3.x in standard mode; Mosaic 0.8 or some other early beta ran on it. Wasn't a lot of images to load then. The thing was that tolerances and timings were probably baked into 286 boards to allow for 16 and 20 MHz chips, so they run suboptimally with the slower ones. The same goes for 386 boards: SX16s just feel lethargic, and you get the snap back at about 25-33 MHz.

Anyway, as far as games I've experienced go, there seems to be no compelling reason to have a 286 slower than about 12 MHz... if you've got a Turbo XT. If you haven't got a Turbo XT and a 10 MHz falls in your lap, take it; if you've got a 5160, stick a V20 in it. You can also get two of the later "generic standard" type desktop format cases in the space a 5170 takes up. If you really gotta have that "where it all came from" machine for the AT class, or you're an IBM nerd, the 5170 is the 5170 and nothing else is, but it's not really a gap filler as far as game performance goes in my opinion.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 84 of 84, by Eep386

Rank: Member

MCGA (memory controller gate array, or multicolor graphics adapter, depending on who you ask) was a deliberately cut-down VGA: essentially VGA with all EGA support stripped out, leaving only bare-bones support for Mode 13h and a 640x480 monochrome mode. IBM introduced both standards with their PS/2 line. MCGA was intended to be an 'entry-level' graphics option, slotting under the mid-range VGA option, which itself slotted under the ultra high-end 8514 (and later XGA) options. Or something to that effect. 😜
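To show just how bare-bones that surviving Mode 13h is, here is a minimal sketch (assuming a Turbo C style 16-bit DOS compiler with dos.h far pointers): the mode is nothing more than a linear 320x200, 256-colour framebuffer at segment A000h, one byte per pixel.

/* Set Mode 13h and fill the screen with a colour ramp by writing
   directly to the A000h framebuffer, one byte per pixel. */
#include <dos.h>
#include <conio.h>

int main(void)
{
    union REGS r;
    unsigned char far *screen = (unsigned char far *)0xA0000000L;
    unsigned int i;

    r.x.ax = 0x0013;                 /* INT 10h, AH=00h: set mode 13h */
    int86(0x10, &r, &r);

    for (i = 0; i < 320u * 200u; i++)
        screen[i] = (unsigned char)(i & 0xFF);   /* palette ramp */

    getch();                         /* wait, then back to text mode  */
    r.x.ax = 0x0003;
    int86(0x10, &r, &r);
    return 0;
}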

IBM didn't really 'drop the ball' with MCGA, they knew exactly what they were doing. It just sort of didn't pay off in the end, as non-IBM VGA-compatible chips arrived relatively quickly and even offered better performance than IBM's originals. But then the PS/2 line as a whole was a bit of a failed gambit in the consumer space, though its legacy was felt for many years (PS/2 keyboard and mouse ports, the familiar VGA monitor port, (S)VGA itself, etc.).

The 8086-fitted PS/2 Model 25 and 30 featured MCGA, whilst the 286-fitted versions (25-286 and 30-286 respectively) featured VGA. Thankfully you can use the same monitors with either standard, though the Model 25 twins had built-in monitors.

Life isn't long enough to re-enable every hidden option in every BIOS on every board... 🙁