bakemono wrote on 2023-04-28, 17:07:
Look what they did on MCGA. It has a 64KB framebuffer and they gave you 640x480 monochrome which only uses 37.5KB. If not for the sake of pixel aspect, surely 640x400 with 4 colors would have been better??
The Game Boy was fine with 4 shades of gray. 😉
Ok, just kidding.
For GUIs, yes, a lot, I think. The Olivetti M24 had a 640x400 monochrome mode that was a far cry from CGA's 640x200 mono mode.
Windows 3.0 looked beautiful on it compared to CGA's 640x200 (b/w). Text was much clearer, too.
It was possible because the M24 had a real monitor, not a glorified TV set.
If I had to decide between running Windows 3.0 or GEM in 4 shades of gray at 640x400 and running it black/white at 640x480, I might have opted for 640x400 and sacrificed those 80 lines.
Because the grayscales would avoid a lot of the dithering patterns (in theory, at least; Windows doesn't support 4-colour modes).
The letters could use an intermediate gray pixel for smoothing, like ClearType on modern Windows.
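For what it's worth, here's a minimal sketch of that idea (with a made-up 8x8 glyph, not anything taken from Windows or GEM): render the glyph at double resolution, then average each 2x2 block down to one of 4 shades, and the edges pick up those intermediate grays.

```c
#include <stdio.h>

/* Hypothetical 8x8 1-bpp glyph (an 'A'), one byte per row, MSB = leftmost. */
static const unsigned char glyph[8] = {
    0x18, 0x3C, 0x66, 0x66, 0x7E, 0x66, 0x66, 0x00
};

static int bit(int x, int y)
{
    return (glyph[y] >> (7 - x)) & 1;
}

int main(void)
{
    static const char shades[] = " .+#"; /* the 4 "gray" levels, as ASCII */
    int x, y;

    /* Average each 2x2 block of the 1-bpp glyph into one of 4 shades,
       clamping the 0..4 coverage count to the 4 available levels. */
    for (y = 0; y < 8; y += 2) {
        for (x = 0; x < 8; x += 2) {
            int cov = bit(x, y) + bit(x + 1, y)
                    + bit(x, y + 1) + bit(x + 1, y + 1);
            putchar(shades[cov > 3 ? 3 : cov]);
        }
        putchar('\n');
    }
    return 0;
}
```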
Originally, the main culprit for the low line count was the fake progressive mode
used in order to support those 15 kHz colour monitors (TV sets), I think.
These TV sets / video monitors support 500 to 600 lines (professional monochrome types up to 1000),
but only interlaced (alternating odd/even fields).
In order to simulate progressive scan,
home computers and the CGA card simply used one of them - odd or even, not both.
That causes the resolution to be halved, because one of the two field types is "dead".
So we end up with those lumpy ~200 lines. And visible scan lines.
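To put rough numbers on it: a standard TV signal scans about 15,700 lines per second at 60 fields per second, i.e. roughly 262 lines per field, of which only ~240 or so fall within the visible area. Use just one field type, and ~200 usable lines is about all that realistically remains.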
Thankfully, VGA was not like that anymore and departed from the old TV standards a bit.
Standard VGA uses 640x480 pels at 60 Hz / 31.5 kHz in progressive scan.
That's why it can display 320x200 pels as well as 320x400 pels - by disabling the line-doubling feature.
Line doubling is turned on by default because VGA uses progressive scan rather than interlacing.
The 320x200 pels resolution of MCGA mode 13h is in reality being automatically doubled to 320x400.
This results in the 200 lines of picture data being duplicated, which in turn removes visible scan lines (good).
If the feature is turned off, though, applications can theoretically use all of those 400 lines
for real picture information rather than duplicates.
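And that feature can indeed be switched off from software. A minimal sketch for DOS (Borland-style dos.h port I/O), along the lines of the classic "unchained" 320x400 tweak - the register values are the well-known ones from demo programming lore (the article linked at the end of this post covers it in depth), so treat this as an illustration rather than gospel:

```c
#include <dos.h>

void set_320x400(void)
{
    union REGS r;

    r.x.ax = 0x0013;        /* BIOS: set mode 13h (320x200, 256 colours) */
    int86(0x10, &r, &r);

    outport(0x3C4, 0x0604); /* Sequencer Memory Mode: Chain-4 off        */
    outport(0x3D4, 0x0014); /* CRTC Underline Location: doubleword off   */
    outport(0x3D4, 0xE317); /* CRTC Mode Control: byte mode on           */
    outport(0x3D4, 0x4009); /* CRTC Max Scan Line: cell height 1, i.e.   */
                            /* no more scan doubling -> 400 real lines   */

    /* Note: the extra planes contain garbage; clear them before drawing. */
}
```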
bakemono wrote on 2023-04-28, 17:07:
Of course there are other resolutions that could maintain square pixels on a 4:3 monitor and still fit in a power-of-two size framebuffer. 288x216, 416x312, 576x432, 832x624, etc. (1152x864 was a nice desktop res on 16-17" CRTs)
That's true, though on real CRTs, the pixel shape wasn't that important, maybe? 🤷‍♂️
I mean, it kind of was, because a low resolution looks blocky - and even more blocky if it's non-square, too. Like 320x200 pels.
But if the image quality was somewhat poor at a proper square-pixel resolution already (320x240 pels),
did it still matter if a higher, but non-square resolution was used (320x400 or 640x400)?
Because the monitor's tube was in a 4:3 form factor all the time (5:4 existed, but was less common).
So there was no need to figure out the geometry by doing math; it always was 4:3, no matter the digital aspect ratio of the source material.
The artists could thus depend on it; it was set in stone on the physical side.
PC users also had the ability to manually stretch the picture via the monitor knobs to fill the picture tube.
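As a worked example of what "non-square" means here: on a 4:3 tube, 320x200 yields a pixel aspect ratio of (4/3) / (320/200) = 5:6, so each pixel is drawn about 20% taller than it is wide, while 320x240 yields exactly (4/3) / (320/240) = 1:1 - square pixels.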
bakemono wrote on 2023-04-28, 17:07:
As for VGA, they should have opened up the whole 0xA0000 to 0xBFFFF range for gfx so we don't need to resort to bank switching just to do 320x240...
That's a good idea as such!
Because, no matter the technical explanations for the limits,
graphics fidelity needs a minimum amount of colour/resolution.
And those 64KB and those 200 lines just didn't cut it, period. 😣
Even minimally tweaked VGA modes in the form of 360x240 or 320x360 et cetera were a dramatic improvement in picture clarity, I think.
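To give an idea of what such tweaked (unchained, planar) modes involve on the programming side, here's a minimal sketch of a pixel write in a 320x240-style unchained mode, assuming the mode itself is already set up; Borland-style port I/O, and the function name is of course mine. The trick is that the four planes share the same 64KB window at A000:0000, so the full 75KB screen still fits the host's 64KB aperture:

```c
#include <dos.h>

#define SC_INDEX 0x3C4  /* VGA Sequencer index port     */
#define MAP_MASK 0x02   /* Sequencer: Map Mask register */

static unsigned char far *vram;

void putpixel(int x, int y, unsigned char color)
{
    if (vram == 0)
        vram = (unsigned char far *)MK_FP(0xA000, 0);

    outportb(SC_INDEX, MAP_MASK);         /* select the Map Mask register   */
    outportb(SC_INDEX + 1, 1 << (x & 3)); /* enable only this pixel's plane */

    /* 4 pixels per byte address: 320/4 = 80 bytes per row, per plane */
    vram[y * (320 / 4) + (x >> 2)] = color;
}
```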
Hm. I suppose the 64KB limit was originally chosen because of the 8086 segment size?
Using other memory models certainly was possible, though. DOS compilers had workarounds for such things (the different memory models).
Or, as an analogy: EMS 4 no longer required a 64KB window divided into four 16KB pages, for example.
Instead, EMS applications could access 256KB at once.
So it must have been possible to use a bigger framebuffer, despite the x86 segment limits.
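Just to illustrate the point (purely hypothetical, since stock VGA never offered a flat 128KB window): because real-mode segments overlap every 16 bytes, addressing a linear 128KB aperture at A0000-BFFFF would need no bank switching at all - a far pointer can be computed directly:

```c
#include <dos.h>

/* Map a linear offset (0..131071) into the hypothetical A0000-BFFFF
   window by folding it into a segment:offset pair (paragraph = 16 bytes). */
unsigned char far *flat_pel(unsigned long offset)
{
    unsigned seg = 0xA000 + (unsigned)(offset >> 4);
    return (unsigned char far *)MK_FP(seg, (unsigned)(offset & 0x0F));
}
```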
Or what if VGA had used the B segment up to the C segment, to allow for 704KB of DOS conventional memory? 🙂
In 1987 (VGA's year of release), conventional memory was already getting scarce.
That's one of the things I valued about CGA and Hercules, by the way: they didn't use the A segment.
The engineers of CGA were extra wise in choosing an even greater distance from the A segment, which makes 736KB of conventional memory possible. OS/2 2.x officially supported it for its DOS VMs.
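The arithmetic behind those figures: segment A000h begins at 0xA0000 = 640KB, B000h at 0xB0000 = 704KB, and CGA's B800h at 0xB8000 = 736KB - conventional memory can grow right up to wherever the first video aperture starts.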
Edit: I was merely thinking out loud when I wrote these lines (pun intended!).. It's not meant as a critique or anything along those lines.
Edit: Found this interesting article about VGA's internals.
https://www.phatcode.net/res/224/files/html/ch31/31-01.html
"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel
//My video channel//