VOGONS


First post, by Cyberdyne

Rank: Oldbie

So the memory is plentiful for that resolution, and text mode already uses that effective width (80 columns x 9-dot characters = 720). But all mainstream software used 640 in bitmap modes.

It has just always bothered me. I know that the BIOS does not offer this resolution, but it is so easy to achieve.

I am aroused by any x86 motherboard that has a fully functional ISA slot. I think I have a problem. Not really into that original (Turbo) XT, 286, 386 and CGA/EGA stuff. So just a DOS nut.
PS. If I upload a RAR, it is a 16-bit DOS RAR, version 2.50.

Reply 1 of 9, by Grzyb

Rank: Oldbie

640/480 = 4/3, i.e. square pixels on a 4:3 screen.

Also, are you sure that 720x480 is easy to achieve on early VGAs?
They were affected by severe bandwidth limitations, discussed in-depth here - VGA cards with little RAM (256KB): Why were they made in first place

As you can see, the difficulty is not only that you cannot climb my mountain, but also that I cannot come down to you whole, for in descending I lose along the way what I was meant to bring.

Reply 2 of 9, by vstrakh

Rank: Member

Yeah, the memory bandwidth.
720 is used in text modes, where the pixels come from font bytes at just one bit per pixel, as opposed to the 4 bits per pixel of the 16-color graphics modes, which have to transfer 4x more data.

But still, 720 pixels would need "just" 12.5% more bandwidth than the 640-pixel mode. Was the IBM card already at the limit of its memory chips even at 640 pixels?
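
Rough numbers, assuming a 16-color planar mode fetches width/2 bytes per scanline at the ~31.5 kHz line rate (my own back-of-the-envelope arithmetic, not datasheet figures):

/* Back-of-the-envelope scanline bandwidth, 16-color planar (4 bpp =
   width/2 bytes per scanline across the four planes), at the ~31.5 kHz
   VGA line rate. My own arithmetic, not datasheet figures. */
#include <stdio.h>

int main(void)
{
    const double lines_per_sec = 31469.0;          /* ~31.5 kHz line rate */
    double b640 = (640.0 / 2.0) * lines_per_sec;   /* bytes/sec, 640 wide */
    double b720 = (720.0 / 2.0) * lines_per_sec;   /* bytes/sec, 720 wide */

    printf("640 wide: %.1f MB/s\n", b640 / 1e6);   /* ~10.1 */
    printf("720 wide: %.1f MB/s\n", b720 / 1e6);   /* ~11.3 */
    printf("ratio: %.3f (+%.1f%%)\n", b720 / b640, (b720 / b640 - 1.0) * 100.0);
    return 0;
}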

Reply 3 of 9, by wbc

Rank: Member

actually 720x480 in 16 colors is possible on plain VGA, and it shares its CRTC parameters with 360x480 Mode-X (see the examples in Tweak 1.6b). On the other hand, it has no real advantages: it has a non-square pixel aspect ratio on 4:3 displays, and most LCD displays will sample it as 640x480, resulting in mangled pixels 😜

Grzyb wrote on 2023-04-26, 12:52:

They were affected by severe bandwidth limitations, discussed in-depth here - VGA cards with little RAM (256KB): Why were they made in first place

if I recall correctly, on the original IBM VGA implementation the memory controller is clocked from the VGA pixel clock; 720-wide modes use the 28 MHz clock, resulting in slightly shorter memory access cycles compared with 320/640-wide modes. If the memory chips can tolerate this (and they certainly do, else 360-wide Mode-X wouldn't work, and probably text mode as well), there is nothing to prevent a 720x480 mode from running without glitches.
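
To put numbers on "slightly shorter", assuming one fetch window per 8-dot character clock (the real fetch scheme on the IBM card may differ; this only shows the squeeze):

/* Cycle budget per 8-dot character at the two standard VGA dot clocks.
   Assumes one memory fetch window per character clock - the IBM card's
   real fetch scheme may differ; this only shows the ~12% squeeze. */
#include <stdio.h>

int main(void)
{
    double t25 = 8.0 / 25.175e6 * 1e9;   /* ns per character at 25.175 MHz */
    double t28 = 8.0 / 28.322e6 * 1e9;   /* ns per character at 28.322 MHz */

    printf("25.175 MHz: %.1f ns per 8-dot char\n", t25);   /* ~317.8 */
    printf("28.322 MHz: %.1f ns per 8-dot char\n", t28);   /* ~282.5 */
    return 0;
}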

--wbcbz7

Reply 4 of 9, by mkarcher

Rank: l33t
Grzyb wrote on 2023-04-26, 12:52:

Also, are you sure that 720x480 is easy to achieve on early VGAs?

Yes, it's easy to achieve. You just need to re-program the horizontal timings in the 640x480 graphics mode. It's straightforward and was well-known to graphics card gurus in the early 90s. I got the book "Die Programmierung der EGA/VGA-Karte" by Matthias Uphoff in 1992, and it already had example source code showing how to enter that mode. This book also mentioned that you could use the 720x480 frame for text modes: a 90x30 text screen when you downgrade from 9x16 to 8x16 characters, or 90x60 when you load the 8x8 font that is also used for the standard 80x50 text mode (in a 9x8 character box). The author further mentioned that you could design an 8x6 pixel font to reach a record-breaking 90x80 characters on a standard VGA card, but strongly suggested supplying a magnifying glass in the box of any software actually using this capability.
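
A minimal sketch of the register bashing in Borland-style DOS C, using the horizontal values from the usual tweak-mode listings (the same set wbc mentions for 360x480 Mode X; I haven't checked them against Uphoff's book, so verify on real hardware before relying on them):

/* Sketch: turn BIOS mode 12h (640x480x16) into 720x480x16 by widening
   the horizontal timings and selecting the 28.322 MHz clock.
   Register values follow the common tweak-mode listings; treat them as
   a starting point, not gospel. Borland-style DOS C (dos.h). */
#include <dos.h>

static void crtc(unsigned char idx, unsigned char val)
{
    outportb(0x3D4, idx);
    outportb(0x3D5, val);
}

void set_720x480_16(void)
{
    union REGS r;

    r.x.ax = 0x0012;            /* start from BIOS mode 12h, 640x480x16 */
    int86(0x10, &r, &r);

    outportb(0x3C2, 0xE7);      /* Misc Output: select 28.322 MHz clock */

    outportb(0x3D4, 0x11);      /* unlock CRTC registers 0-7 */
    outportb(0x3D5, inportb(0x3D5) & 0x7F);

    crtc(0x00, 0x6B);           /* horizontal total */
    crtc(0x01, 0x59);           /* horizontal display end: 90 chars = 720 */
    crtc(0x02, 0x5A);           /* start horizontal blanking */
    crtc(0x03, 0x8E);           /* end horizontal blanking */
    crtc(0x04, 0x5E);           /* start horizontal retrace */
    crtc(0x05, 0x8A);           /* end horizontal retrace */
    crtc(0x13, 0x2D);           /* offset: 45 words = 90 bytes per row */
}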

The reason this mode is not provided by default is likely that a lot of EGA graphics libraries had the line length of 80 bytes (640 pixels) hardcoded and could work out of the box on VGA cards (possibly adjusting the maximum Y clamping, which is straightforward if the library already supports EGA 640x200 and EGA 640x350). Going to 720x480 would thus provide marginal benefit at the expense of being less accessible to software and having non-square pixels, which is something IBM finally got rid of with the VGA card.

If I had a wish for an extra BIOS-supported VGA mode in 1990, it wouldn't have been 720x480 (although that one wouldn't hurt, of course), but 640x400, because you could do double buffering in it (the memory is just sufficient) and it would be a higher-resolution 70 Hz mode than 640x350.
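
The arithmetic: 640x400 at 4 bpp is 80 bytes/row x 400 rows = 32,000 bytes per plane per page, so two pages fit exactly in 256 KB of planar memory. A flip is then just a CRTC start-address write - a hedged sketch, with show_page as a made-up helper name, assuming a 640x400 16-color mode has already been set up:

/* Sketch: page flipping in a 640x400 16-color planar mode via the CRTC
   start address. Each page occupies 32000 bytes of per-plane address
   space, so two pages fit in 256KB of planar VRAM.
   Borland-style DOS C; show_page is a hypothetical helper. */
#include <dos.h>

void show_page(int page)                    /* page 0 or 1 */
{
    unsigned start = page ? 32000U : 0U;    /* per-plane byte address */

    while (!(inportb(0x3DA) & 0x08))        /* wait for vertical retrace */
        ;
    outportb(0x3D4, 0x0C);                  /* CRTC start address high */
    outportb(0x3D5, (start >> 8) & 0xFF);
    outportb(0x3D4, 0x0D);                  /* CRTC start address low */
    outportb(0x3D5, start & 0xFF);
}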

Reply 5 of 9, by Jo22

Rank: l33t++
mkarcher wrote on 2023-04-26, 17:43:

If I had a wish for an extra BIOS-supported VGA mode in 1990, it wouldn't have been 720x480 (although that one wouldn't hurt, of course), but 640x400, because you could do double buffering in it (memory is just sufficient) and it would be a higher resolution 70Hz mode than 640x350.

640x400 in 16 colors was a popular video mode on the PC-98 platform, I remember.
On western PCs, the VBE BIOS supports mode 100h at 640x400 pels in 256 colors.
It works with VGA video memory as low as 256KB.
The point&click game "Die Höhlenwelt Saga" uses it, too.

Edit: SVGA modes may not work on early ISA Super VGA boards if the DIP switches aren't configured for NEC or Multisync monitors (by default they aren't).
I had encountered this issue in the past myself.

Edit: I was just thinking out loud... speaking under correction, of course.
I'm not that familiar with VBE either, since back in the day my ATI VGA Wonder had no VBE BIOS in ROM.
There were VBE 1.x TSRs, of course, but I got my 286 PC second-hand without any driver diskettes.
Of course, the many shareware CDs had them hidden somewhere,
but I had no real need to run them each time and lose precious RAM.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 6 of 9, by VileR

Rank: l33t

It always seemed to me that 720x480 (or rather, 360x240) would provide better emulation of 200-line CGA/EGA modes, if you centered the active area within the slightly larger frame:

- Vertical refresh rate would be 60 Hz, the same as 'native' 200-line (15.7 kHz) modes
- You'd have a more authentic amount of overscan - closer to what these modes have on CGA/EGA, as opposed to VGA's puny hair-thin border area
- Pixel aspect ratio would still be pretty close to the original: 8:9 (0.888...) rather than 5:6 (0.8333...)

A reasonably compatible TSR to do that shouldn't be rocket science... if VGATV is a thing, this one would probably be easier.
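
For anyone who wants to check the ratio math: on a 4:3 screen the PAR is (4/3) divided by the raster's width/height ratio - a trivial snippet:

/* Quick check of the aspect-ratio figures: on a 4:3 display that the
   active raster fills, PAR = (4/3) / (width/height). */
#include <stdio.h>

int main(void)
{
    double par_cga = (4.0 / 3.0) / (320.0 / 200.0);  /* native 200-line: 5/6 */
    double par_360 = (4.0 / 3.0) / (360.0 / 240.0);  /* 360x240 raster:  8/9 */

    printf("320x200 native: %.4f (5:6)\n", par_cga);  /* 0.8333 */
    printf("360x240 raster: %.4f (8:9)\n", par_360);  /* 0.8889 */
    return 0;
}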

[ WEB ] - [ BLOG ] - [ TUBE ] - [ CODE ]

Reply 7 of 9, by Jo22

Rank: l33t++

Hm. Now that I think of it, 720x480 pels is one of the higher resolutions of digital NTSC (DVD, 1:0.9 pixel ratio?).

Its digital PAL counterpart is 720x576 pels with a 1:1.07 ratio. Full PAL would be even higher: 768×576 (1:1).

Too bad VGA was again kind of based on American NTSC (640x480 pels).
Years before, IBM was on the right track when it used the European 50 Hz timings for its MDA... *sigh*

Imagine if VGA had introduced a default resolution of 768×576 rather than NTSC's 640x480! The (PC) world would have been a better place. 🙂

Edit: I mean, let's imagine DOS games back in the day running at 360x288 or 384x288 pixels instead of 320x240 pixels (Mode X)!
That would have been fantastic! Much less pixelation! 😃

Edited.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 8 of 9, by bakemono

Rank: Oldbie

Hercules used 720x348, so it's not like 720 pixels was unheard of. I tend to think they favored 640x480 for the sake of square pixels. Look at what they did on MCGA: it has a 64KB framebuffer, and they gave you 640x480 monochrome, which only uses 37.5KB. If not for the sake of pixel aspect, surely 640x400 with 4 colors would have been better??
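
The framebuffer arithmetic behind that, for reference:

/* Framebuffer sizes behind the MCGA comparison above. */
#include <stdio.h>

int main(void)
{
    long mono = 640L * 480 / 8;       /* 1 bpp: 38400 bytes = 37.5 KB        */
    long four = 640L * 400 * 2 / 8;   /* 2 bpp: 64000 bytes, just fits 64 KB */

    printf("640x480 mono  : %ld bytes (%.1f KB)\n", mono, mono / 1024.0);
    printf("640x400 4-col : %ld bytes (%.1f KB)\n", four, four / 1024.0);
    return 0;
}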

Of course there are other resolutions that could maintain square pixels on a 4:3 monitor and still fit in a power-of-two size framebuffer. 288x216, 416x312, 576x432, 832x624, etc. (1152x864 was a nice desktop res on 16-17" CRTs)

As for VGA, they should have opened up the whole 0xA0000 to 0xBFFFF range for gfx, so we wouldn't need to resort to bank switching just to do 320x240...
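
To illustrate the "resorting": 320x240x256 is 76,800 bytes, which overflows the 64KB window, hence the plane games. A sketch, assuming an already-unchained Mode X style 320x240 (putpixel_modex is a made-up name):

/* Sketch: why 320x240x256 needs planar trickery. 320*240 = 76800 bytes,
   more than the 64KB window at A000:0000, so the unchained (Mode X style)
   mode puts pixel x on plane x%4, selected via the Sequencer Map Mask.
   Borland-style DOS C; assumes an unchained 320x240 mode is already set. */
#include <dos.h>

void putpixel_modex(int x, int y, unsigned char color)
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);

    outportb(0x3C4, 0x02);               /* Sequencer index: Map Mask  */
    outportb(0x3C5, 1 << (x & 3));       /* write-enable plane x mod 4 */
    vram[(unsigned)y * 80 + (x >> 2)] = color;  /* 80 bytes/row/plane  */
}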

again another retro game on itch: https://90soft90.itch.io/shmup-salad

Reply 9 of 9, by Jo22

Rank: l33t++
bakemono wrote on 2023-04-28, 17:07:

Look what they did on MCGA. It has a 64KB framebuffer and they gave you 640x480 monochrome which only uses 37.5KB. If not for the sake of pixel aspect, surely 640x400 with 4 colors would have been better??

The Game Boy was fine with 4 shades of gray. 😉
Ok, just kidding.

For GUIs, yes, a lot, I think. The Olivetti M24 had a 640x400 pels monochrome mode that was a far cry from CGA's 640x200 mono mode.
Windows 3.0 looked beautiful on it compared to CGA's 640x200 (b/w). Text was much clearer, too.
It was possible because the M24 had a real monitor, not a glorified TV set.

If I had to decide between running Windows 3.0 or GEM in 4 shades of gray at 640x400 and running it black/white at 640x480, I might have opted for 640x400 and sacrificed those 80 lines.

Because the grayscales would avoid a lot of dithering patterns (in theory; Windows doesn't support 4-colour modes).
The letters could use an intermediate gray pixel for smoothing, like "ClearType" on modern Windows.

Originally, the main culprit for the low line count was the fake progressive mode used in order to support those 15 kHz colour monitors (TV sets), I think.
These TV sets / video monitors support 500 to 600 lines (professional monochrome types up to 1000), but only interlaced (alternating odd/even fields).

In order to simulate progressive scan, home computers and the CGA card simply used only one of them. Odd or even, not both.

That halves the resolution, because one of the two field types is "dead".
So we end up with those lumpy ~200 lines. And visible scan lines.
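
The field arithmetic behind those ~200 lines, using the NTSC numbers:

/* Field arithmetic behind the ~200 visible lines (NTSC numbers). */
#include <stdio.h>

int main(void)
{
    double lines_per_field = 15734.0 / 60.0;   /* 15.734 kHz / 60 Hz */

    printf("lines per field: %.1f\n", lines_per_field);  /* ~262 */
    /* minus vertical blanking and overscan: roughly 200 usable lines */
    return 0;
}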

Thankfully, VGA was not like that anymore and departed from the old TV standards a bit.
Standard VGA uses 640x480 pels at 60 Hz / 31.5 kHz in progressive scan.
That's why it can display 320x200 pels as well as 320x400 pels - by disabling the line-doubling feature.

Line doubling is turned on by default because VGA uses progressive scan rather than interlacing.
The 320x200 pels resolution of MCGA mode 13h is in reality automatically doubled to 320x400.
This results in the 200 lines of picture data being duplicated, which in turn removes visible scan lines (good).
If the feature is turned off, though, applications can in theory use all of those 400 lines for real picture information rather than duplicates.
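
For the record, the usual recipe for that switch, as circulated in the tweak literature and Abrash's Mode X articles - a hedged sketch from memory:

/* Sketch: turn mode 13h (320x200, line-doubled) into unchained 320x400.
   Stop repeating scanlines (CRTC Maximum Scan Line), then unchain memory
   so all 256 KB is addressable, since 320*400 = 128000 bytes exceeds the
   64 KB window. Clear all four planes afterwards - the mode set leaves
   junk in the previously hidden memory. Borland-style DOS C. */
#include <dos.h>

static void crtc(unsigned char idx, unsigned char val)
{
    outportb(0x3D4, idx);
    outportb(0x3D5, val);
}

void set_320x400(void)
{
    union REGS r;

    r.x.ax = 0x0013;        /* start from mode 13h */
    int86(0x10, &r, &r);

    crtc(0x09, 0x40);       /* Maximum Scan Line: repeat count 0 -> 400 rows */

    outportb(0x3C4, 0x04);  /* Sequencer: Memory Mode */
    outportb(0x3C5, 0x06);  /* disable Chain-4, planar access */

    crtc(0x14, 0x00);       /* Underline Location: clear doubleword mode */
    crtc(0x17, 0xE3);       /* Mode Control: byte addressing mode */
}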

bakemono wrote on 2023-04-28, 17:07:

Of course there are other resolutions that could maintain square pixels on a 4:3 monitor and still fit in a power-of-two size framebuffer. 288x216, 416x312, 576x432, 832x624, etc. (1152x864 was a nice desktop res on 16-17" CRTs)

That's true, though on real CRTs the pixel shape wasn't that important, maybe? 🤷‍♂️

I mean, it kind of was, because a low resolution looks blocky, and even more blocky if it's non-square, too. Like 320x200 pels.

But if the image quality was already somewhat poor at a proper square-pixel resolution (320x240 pels),
did it matter anymore if a higher but non-square resolution was used (320x400 or 640x400)?

Because the monitor's tube was a 4:3 form factor all the time (5:4 existed, but was less common).
So there was no need to figure out the geometry by doing math; it was always 4:3 no matter the digital aspect ratio of the source material.

The artists could thus depend on it; it was set in stone on the physical side.
PC users also had the ability to manually stretch the picture with the monitor knobs to fill the tube.

bakemono wrote on 2023-04-28, 17:07:

As for VGA, they should have opened up the whole 0xA0000 to 0xBFFFF range for gfx so we don't need to resort to bank switching just to do 320x240...

That's a good idea as such!
Because no matter the technical explanations for the limits,
graphics fidelity needs a minimum amount of colour/resolution.
And those 64KB, or respectively 200 lines, just didn't cut it, period. 😣

Even minimally tweaked VGA modes in the form of 360x240 or 320x360 et cetera were a dramatic improvement in picture clarity, I think.

Hm. I suppose the 64KB limit was originally chosen because of the 8086 segment size?
Using other memory models certainly was possible, though. DOS compilers had workarounds for such things (different memory models).

Or, as an analogy: EMS 4 no longer required a 64KB window with 16KB pages; instead, EMS applications could access 256KB at once.
So it must have been possible to use a bigger framebuffer, despite the x86 segment limits.

Or what if VGA had used the B to C segment range, to allow for 704KB of DOS conventional memory? 🙂
In 1987 (VGA's year of release), conventional memory was already getting scarce.

That's one of the things I valued about CGA and Hercules, by the way: they didn't use the A segment.
The engineers of CGA were extra wise in choosing an even greater distance from the A segment, which makes 736KB of conventional memory possible. OS/2 2.x officially supported that for its DOS VMs.
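
The segment arithmetic, for reference (segment x 16 gives the linear address):

/* Conventional memory can run up to the first video region.
   Segment base * 16 / 1024 gives the boundary in KB. */
#include <stdio.h>

int main(void)
{
    printf("A000:0000 = %lu KB\n", 0xA000UL * 16 / 1024);  /* 640 KB (EGA/VGA)      */
    printf("B000:0000 = %lu KB\n", 0xB000UL * 16 / 1024);  /* 704 KB (MDA/Hercules) */
    printf("B800:0000 = %lu KB\n", 0xB800UL * 16 / 1024);  /* 736 KB (CGA)          */
    return 0;
}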

Edit: I was merely thinking out loud when I wrote these lines (pun intended!)... It's not meant as a critique or anything along those lines.

Edit: Found this interesting article about VGA's internals.
https://www.phatcode.net/res/224/files/html/ch31/31-01.html

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//