VOGONS


First post, by Rincewind42


My understanding is that CGA and EGA monitors are dead simple devices in the sense that they can't distinguish between the 320x200 and 640x200 graphics modes at the input signal level. From the "monitor's point of view" there is no difference—320x200 is simply "pixel-doubled" by the CGA/EGA card.

I'm not an analog electronics person, but this is explained in more technical detail in this comment (source: https://news.ycombinator.com/item?id=20207122):

For both 320x200 and 640x200, HSYNC ran at 15.75 kHz, VSYNC ran at 60 Hz. When you go from 320x200 to 640x200, all that happens is the pixel clock (the rate at which you read out from RAM) is doubled, so you get exactly two pixels horizontally packed in where there used to be one pixel. The older hardware, like EGA video cards, can only generate one other HSYNC speed: 21.8 kHz, for special 350-line modes. When VGA came out, it doubled the HSYNC frequency and, for these modes, would just read each row out twice.

My question: are the above statements correct? If yes, it follows that 320x200 graphics and the same graphics pixel-doubled to 640x200 and then displayed in a 640x200 mode should look exactly identical on a CGA/EGA monitor (because they're identical at the digital signal level).

I'm asking because this has implications for developing an authentic CGA/EGA CRT shader (i.e. the 320-pixel-wide modes always need to be pixel-doubled; the shader must always "see" 640 pixels per line, and that makes a quite noticeable difference in the sharpness of the resulting image).
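To make it concrete, this is roughly the pre-pass I have in mind (just a sketch in C; the 8-bit indexed framebuffer and the function name are my own illustration, not any particular emulator's API):

#include <stdint.h>

/* Toy pre-pass: width-double a 320-pixel-wide indexed framebuffer to 640
 * pixels before handing it to the CRT shader, so the shader always "sees"
 * 640 pixels per line regardless of the source mode. */
static void double_width_320_to_640(const uint8_t *src, uint8_t *dst, int height)
{
    for (int y = 0; y < height; ++y) {
        const uint8_t *src_row = src + y * 320;
        uint8_t *dst_row = dst + y * 640;
        for (int x = 0; x < 320; ++x) {
            dst_row[2 * x]     = src_row[x];  /* nearest-neighbour: repeat  */
            dst_row[2 * x + 1] = src_row[x];  /* each source pixel twice    */
        }
    }
}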

DOS: Soyo SY-5TF, MMX 200, 128MB, S3 Virge DX, ESS 1868F, AWE32, QWave, S2, McFly, SC-55, MU80, MP32L
Win98: Gigabyte K8VM800M, Athlon64 3200+, 512MB, Matrox G400, SB Live
WinXP: Gigabyte P31-DS3L, C2D 2.33 GHz, 2GB, GT 430, Audigy 4

Reply 1 of 38, by Tiido


These statements are not incorrect.

Pixel-doubled 320 will look the same as a 640-pixel line on the monitor. The digital nature of the connection is entirely lost on the monitor and only has meaning inside the video card's image generation pipeline. There is no pixel clock or anything; the RGBI output is simply seen by the monitor as a continuous signal, without any of the discrete aspects that modern connections provide.

VGA line-doubles the 200-line modes into 400 lines, but at 70 Hz instead of 60 Hz; the 350-line mode is fit into 400 lines with top and bottom borders, again at 70 Hz. For 60 Hz the line count must increase to 480, which stems from the fixed line frequency of the monitor (double that of NTSC, 31-something kHz) that cannot change (unless it is a multisync monitor).
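As a quick sanity check of those numbers, using the nominal VGA figures (449 and 525 total lines per frame are the standard totals including blanking):

#include <stdio.h>

int main(void)
{
    /* With the line rate fixed, the only way to change the refresh rate is
     * to change the total number of lines per frame (visible + blanking). */
    const double hsync_hz = 31469.0;   /* nominal VGA horizontal frequency      */
    const int total_400   = 449;       /* 400 visible lines + blanking (70 Hz)  */
    const int total_480   = 525;       /* 480 visible lines + blanking (60 Hz)  */

    printf("400-line modes: %.2f Hz\n", hsync_hz / total_400);  /* ~70.1 Hz */
    printf("480-line modes: %.2f Hz\n", hsync_hz / total_480);  /* ~59.9 Hz */
    return 0;
}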

T-04YBSC, a new YMF71x based sound card & Official VOGONS thread about it
Newly made 4MB 60ns 30pin SIMMs ~
mida sa loed ? nagunii aru ei saa 😜

Reply 2 of 38, by Rincewind42


Yep, I'm familiar with the scan-doubling and pixel-doubling behaviour of VGA; you really need to send 640x400 to the shader for the 320x200 13h screen mode, 640x480 for the 320x240 tweaked mode, 800x600 for the 400x300 VESA mode, and so on, for the output of the CRT shader to replicate the look you get on a real monitor.

Just wasn't sure about the CGA/EGA behaviour, but it all makes sense now. On CGA we have 640x200 then (320x200 is pixel-doubled), and the same deal on EGA (plus we also have 640x350 on EGA, of course). So I'll just need to make sure to always send 640x200 to the shader in the 200-line CGA/EGA modes.
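In other words, roughly this mapping (a sketch only; the names are made up, and the VGA cases assume the scan-doubling mentioned above):

/* Rough mapping of "what the CRT shader should see":
 *   CGA/EGA 320x200      -> 640x200   (pixel-doubled, scanlines stay visible)
 *   CGA/EGA 640x200      -> 640x200
 *   EGA 640x350          -> 640x350
 *   VGA 320x200 (13h)    -> 640x400   (pixel- and scan-doubled)
 *   VGA 320x240 (tweak)  -> 640x480
 *   VESA 400x300         -> 800x600
 */
typedef struct { int w, h; } Size;

static Size shader_input_size(int src_w, int src_h, int is_vga)
{
    Size out;
    out.w = (src_w < 640) ? src_w * 2 : src_w;             /* width-double narrow modes      */
    out.h = (is_vga && src_h <= 300) ? src_h * 2 : src_h;  /* VGA scan-doubles these heights */
    return out;
}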

Thanks!

DOS: Soyo SY-5TF, MMX 200, 128MB, S3 Virge DX, ESS 1868F, AWE32, QWave, S2, McFly, SC-55, MU80, MP32L
Win98: Gigabyte K8VM800M, Athlon64 3200+, 512MB, Matrox G400, SB Live
WinXP: Gigabyte P31-DS3L, C2D 2.33 GHz, 2GB, GT 430, Audigy 4

Reply 3 of 38, by Scali


Technically there *is* no horizontal resolution. It's just a continuous signal, only the start of a new scanline is specifically marked with a hsync pulse.
The resolution is dictated by the display hardware. The monitor merely needs to have a dot pitch that is small enough to display ~640 pixels (or more), else you won't actually be able to see the pixels in full detail.
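To put a very rough number on "small enough" (back-of-the-envelope only, with an assumed screen size):

#include <stdio.h>

int main(void)
{
    /* Assumptions, not from any datasheet: roughly 13 inches of visible
     * diagonal on a 4:3 tube. */
    const double diag_mm  = 13.0 * 25.4;          /* visible diagonal            */
    const double width_mm = diag_mm * 4.0 / 5.0;  /* 4:3 => width = 4/5 diagonal */

    /* ~0.41 mm per pixel, so the dot structure has to be at least that fine. */
    printf("approx. width per pixel at 640 dots: %.2f mm\n", width_mm / 640.0);
    return 0;
}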

On the Amiga with the ECS chipset this was actually exploited... They offered video modes of up to 1280 pixels horizontally, which you could display on any standard monitor or TV, as the timings were still standard PAL or NTSC.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 4 of 38, by Rincewind42

Scali wrote on 2023-06-26, 06:54:

Technically there *is* no horizontal resolution. It's just a continuous signal, only the start of a new scanline is specifically marked with a hsync pulse.
The resolution is dictated by the display hardware. The monitor merely needs to have a dot pitch that is small enough to display ~640 pixels (or more), else you won't actually be able to see the pixels in full detail.

Yep, fully get that. Where I'm coming from is this: a 320-pixel-wide image in a 320-pixel mode and the same image displayed in a 640-pixel-wide mode (but first resized horizontally by 200% using nearest-neighbour) are indistinguishable at the *signal* level that goes to the monitor.

Therefore, to replicate the same behaviour in a shader (so the two images are indistinguishable from each other), we must always send 640 pixel wide framebuffers to the shader.

JPJrdsn.png

The top image is 320x200 sent through the CRT shader; the bottom image is the same content at 640x200. I was a bit unhappy with my "EGA CRT shader" attempts because of the horizontal blurriness—I did not notice such blurriness in actual photos of EGA monitors; "pixels" on EGA monitors appear quite blocky horizontally, yet you still get visible scanlines. It turns out that if you pixel-double the 320x200 image before sending it to the shader, the results are close to what I can see in these real CRT photos.

Of course, you could always compensate in the shader itself to add some horizontal sharpness, and the whole shader business is just a crude approximation of all the things that go on in an actual CRT; I'm fully aware of that. But this gets me a little bit closer, I think. The "emulated CRT monitor" should only ever see 640-pixel lines.

Scali wrote on 2023-06-26, 06:54:

On the Amiga with ECS chipset this was actually exploited... They offered video modes of up to 1280 pixels horizontally, which you could display on any standard monitor or TV, as the timings were still standard PAL or NTSC.

Great to see a fellow Amiga enthusiast here 😎 🤘🏻 Nice blog too; I'm gonna read those polygon-filling posts of yours for a bit of Amiga/demoscene nostalgia.

DOS: Soyo SY-5TF, MMX 200, 128MB, S3 Virge DX, ESS 1868F, AWE32, QWave, S2, McFly, SC-55, MU80, MP32L
Win98: Gigabyte K8VM800M, Athlon64 3200+, 512MB, Matrox G400, SB Live
WinXP: Gigabyte P31-DS3L, C2D 2.33 GHz, 2GB, GT 430, Audigy 4

Reply 5 of 38, by Scali

Rincewind42 wrote on 2023-06-26, 07:17:

Therefore, to replicate the same behaviour in a shader (so the two images are indistinguishable from each other), we must always send 640 pixel wide framebuffers to the shader.

Yes exactly, you're basically emulating the dot pitch that way. Or in other words: you're emulating how the monitor converts that continuous scanline signal into individual pixels on screen.
The dot pitch is of course a fixed property of the monitor, it doesn't change with resolution.
So you indeed need to always apply the shader at the higher resolution, in this case 640 pixels.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 6 of 38, by Rincewind42

Scali wrote on 2023-06-26, 07:42:
Rincewind42 wrote on 2023-06-26, 07:17:

Therefore, to replicate the same behaviour in a shader (so the two images are indistinguishable from each other), we must always send 640 pixel wide framebuffers to the shader.

Yes exactly, you're basically emulating the dot pitch that way. Or in other words: you're emulating how the monitor converts that continuous scanline signal into individual pixels on screen.
The dot pitch is of course a fixed property of the monitor, it doesn't change with resolution.
So you indeed need to always apply the shader at the higher resolution, in this case 640 pixels.

Thanks for confirming, looks like I'm on the right track then!

DOS: Soyo SY-5TF, MMX 200, 128MB, S3 Virge DX, ESS 1868F, AWE32, QWave, S2, McFly, SC-55, MU80, MP32L
Win98: Gigabyte K8VM800M, Athlon64 3200+, 512MB, Matrox G400, SB Live
WinXP: Gigabyte P31-DS3L, C2D 2.33 GHz, 2GB, GT 430, Audigy 4

Reply 7 of 38, by mothergoose729


The pixel clock is the rate at which data is streamed to the display. A CRT draws in scanlines, but within a scanline the CRT gun can modulate color and brightness to paint the image. If you are sending useful horizontal data, then you do actually get more detail.

Doubling the pixel clock will also increase your horizontal sync frequency, which is why VGA and EGA cards use scan doubling - it's how you pad a low-resolution signal so that it matches the display timings of a high-resolution display.

Reply 8 of 38, by Jo22


Or in other words, a TV or video monitor draws in lines. Pixels don't exist to the monitor.
It's just a continuous stream (or beam) of varying intensity within each line.
That's why monochrome monitors/TVs, with their single-gun Braun tube, are so clean:
no shadow mask, and three times the resolution/precision per line of an RGB system (no sub-pixels).
Edited.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 9 of 38, by cinnabar


The CGA does not 'pixel double' anything. It sends pixels to the display at a rate typically called the 'dot clock' - in the high res 640x200 mode it is sending pixels at twice the rate, so twice as many appear on a scanline. In the low res mode it is sending pixels at half the high res rate. No 'doubling' of anything is going on.
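To put some numbers on that, using the usual CGA horizontal totals:

#include <stdio.h>

int main(void)
{
    /* Doubling the dot clock also doubles the number of dot clocks per
     * scanline, so the line timing the monitor sees stays the same. */
    const double lo_clock = 7159090.0;    /* Hz, dot clock in 320-wide modes */
    const double hi_clock = 14318180.0;   /* Hz, dot clock in 640-wide modes */
    const double lo_total = 456.0;        /* dot clocks per scanline         */
    const double hi_total = 912.0;

    printf("320-wide line rate: %.2f kHz\n", lo_clock / lo_total / 1000.0);  /* ~15.70 */
    printf("640-wide line rate: %.2f kHz\n", hi_clock / hi_total / 1000.0);  /* ~15.70 */
    return 0;
}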

Reply 10 of 38, by Jo22

mothergoose729 wrote on 2023-06-26, 08:22:

The pixel clock is the amount of data streamed to the display.
A CRT draws in scanlines, but within a scanline the CRT gun can modulate color and brightness to paint the image.
If you are sending useful horizontal data then you do actually get more detail.

Doubling the pixel clock will also increase your horizontal sync frequency, which is why VGA and EGA cards use scan doubling
- its how you pad a low resolution signal so that it matches the display timings of a high resolution display.

I think the same. It's just that VGA cards do an internal doubling of the low-resolution modes (320x200 -> 320x400)*, with real progressive scan (all lines of the video standard drawn in order).
With line doubling enabled, the card essentially draws each line twice, which provides a higher-quality picture (better readability) and makes things work with a single horizontal frequency (~31 kHz).
* Unless the extra lines are filled with useful pixel information: some 90s-era demoscene productions disabled line doubling and used 400 individual lines of actual graphics.
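In other words, something like this (a toy sketch of my own, not the card's real datapath):

#include <stdint.h>
#include <string.h>

/* "Drawing each line twice": every row of a 200-line source frame is
 * emitted twice, so a 320x200 image leaves the card as 400 scanlines at
 * the ~31 kHz line rate. */
static void scan_double_200_to_400(const uint8_t *src, uint8_t *dst, int width)
{
    for (int y = 0; y < 200; ++y) {
        const uint8_t *row = src + y * width;
        memcpy(dst + (2 * y)     * width, row, width);  /* the line...       */
        memcpy(dst + (2 * y + 1) * width, row, width);  /* ...and then again */
    }
}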

That being said, it's perhaps important to keep in mind that VGA was designed for 640x480 resolution and business applications in the first place.
Mode 12h (640x480 pels, 16c) at 60 Hz is the real, primary VGA graphics mode, just as mode 10h (640x350, 16c, 60 Hz) is for EGA.

The equally important VGA text mode uses 720×400 pels (at 70 Hz), but all in all it looks similar on the VGA monitor (the extra 80 pixels come from the 9-dot-wide text character cells, 80 × 9 = 720).

CGA cards, by contrast, always* use 200 lines. That's all they can do.
Otherwise, they'd be Hercules compatibles. 😉 - Because, both use the same text character generator chip.

The main difference with CGA is that it uses fake progressive scan.
Instead of properly displaying all the lines of its video standard (NTSC with interlacing, in this case), it just uses either the odd or the even fields.
It essentially leaves one of them blank. The video monitor/TV still thinks it's getting an interlaced signal, but half of it is dead.

This kind of signal is known as a low-definition (LD) signal, and it's the source of all kinds of problems nowadays.
It's the reason why modern TVs have trouble displaying 240p/288p signals, too.
LD is not a broadcast-safe signal, either: https://en.wikipedia.org/wiki/Broadcast-safe

Edit: I'm sorry, I've messed up the quotes. 😅

cinnabar wrote on 2023-06-26, 16:52:

The CGA does not 'pixel double' anything. It sends pixels to the display at a rate typically called the 'dot clock'
- in the high res 640x200 mode it is sending pixels at twice the rate, so twice as many appear on a scanline.
In the low res mode it is sending pixels at half the high res rate. No 'doubling' of anything is going on.

Right. Isn't that also the reason for CGA snow and other weird effects?
I would imagine the Motorola CRTC has a lot of work to do to display the 640x200 graphics mode / 80x25 text mode.
Those "hi-res" modes are quite demanding for the poor little thing.
Edit: Oh, my poor wording again.. I meant bandwidth, as such.
The infamous CGA snow is a bandwidth problem, too, but it's more related to the XT architecture/RAM interface, of course (dual-ported RAM or SRAM works wonders).

(*Edit: In reality, there were some proprietary CGA compatibles ("Super CGA") with 640x400 resolution, allowing better text, like the on-board video of the Olivetti M24. But those weren't graphics cards in the usual sense, not dedicated ones for the ISA bus.
They required a custom monitor at the time, which was no longer based on NTSC timings.)

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 11 of 38, by cinnabar


The VGA only has 25 MHz and 28 MHz physical clock chips on board, so this tends to dictate a lot of the decisions they made regarding keeping backward-compatible low-res modes on newer, higher-frequency displays. With the CGA, all the timing comes straight from the system board.

I didn't know about the 'fake' progressive scan; that's interesting. Any idea where I can find out more about this?

The CGA snow happens only in the 80-character alphanumeric mode, where the video RAM is under the most stress, since it's in hi-res and has to fetch an attribute byte for every character fetched. The 6845 itself is not under any 'stress': it's running at 1.79 MHz in 'hi-res', and the part is rated for 2 MHz.

The CPU and the video refresh are contending for RAM, and the CGA is designed such that the CPU always 'wins', so that data is not corrupted during a read or write. The refresh is still happening, though; it's just that the address and data lines are being driven by the CPU, so the character ROM and attribute handler end up seeing 'garbage' data, which is whatever the CPU was reading or writing from/to VRAM at the time.

The EGA is much more orderly: it has its 'sequencer', which allocates slots for the CPU and the video refresh. In the case of the EGA and onwards, the CPU must wait for access; the display has priority.
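A toy model of that arbitration, just to illustrate the difference (my own naming, nowhere near the real gate-level design):

#include <stdint.h>

/* On CGA the CPU "wins": if the CPU happens to own the video RAM bus during
 * a character fetch, the character generator sees whatever the CPU put on
 * the bus, which shows up as snow. On EGA the sequencer gives the display
 * priority and stalls the CPU instead, so the fetch is always clean. */
enum { CGA_STYLE, EGA_STYLE };

static uint8_t fetch_character(int style, int cpu_on_bus,
                               uint8_t vram_byte, uint8_t cpu_bus_byte)
{
    if (style == CGA_STYLE && cpu_on_bus)
        return cpu_bus_byte;   /* garbage fetch -> "snow" on screen          */
    return vram_byte;          /* EGA path: CPU was made to wait, clean data */
}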

Reply 12 of 38, by cinnabar


Jo22 wrote:

CGA cards, by contrast, always* use 200 lines. That's all they can do.
Otherwise, they'd be Hercules compatibles. 😉 - Because, both use the same text character generator chip.

This is what fascinates me about the CGA on the IBM PC. The CGA and the MDA arrived at more or less the same time, but there is a little archaeological evidence that the CGA was really 'first' by a little way, though the two projects were likely running side by side. We all think of the IBM PC now as a business-focused machine, but it feels like there were two competing forces at the time. It's hard to ignore that the original 5150 had a tiny amount of RAM and shipped with Cassette BASIC. But when you realize they clocked the master clock of the system at 4x the NTSC colorburst frequency (4 x 3.579545 MHz = 14.318 MHz), and that this base frequency, divided by three, is where our magic 4.77 MHz for the CPU comes from, you see that NTSC was *baked in* to the design. The ability to display on inexpensive monitors or televisions was key to the project. The 200 lines are also baked in as a result.

I think the MDA came slightly after, since if you read IBM's documentation on the MDA, they talk about the attribute bits referencing RGBI (which supposedly doesn't exist on an MDA), and of course it interprets its attribute byte differently than as color. And then when you look really deep into it (see John Elliott's page), there is unimplemented hardware: for instance, the MDA has a 'high resolution' bit that must be set, and there are traces leading to dead ends that suggest a low-resolution mode had been planned, which would likely have gone along with color, and even that some early MDAs supported color output pins.

My take is that the CGA was the 'original gangster' for the IBM PC.

https://www.seasip.info/VintagePC/mda.html

Reply 13 of 38, by rmay635703


There was a lot of unused stuff left in the MDA, some of which may have come from IBM's earlier 5100 system.

If I had to guess, it would have been a more fully featured card, with some sort of low-res monochrome graphics mode and basic support for an unreleased high-resolution color screen (text only).

It's possible IBM's brass cut off the program to focus the business line strictly on monochrome text, feeling the other features muddled the program's intent.

It's also possible the text-only adapter was meant to display text on whatever screen you had available, color or mono.

Reply 14 of 38, by VileR


There's nothing "fake" about CGA's progressive scan. It's non-interlaced, hence progressive. You may point out that it's half the resolution you'd get from 'standard' NTSC progressive scan, but then again it's also double the frame rate. And progressive-scan NTSC was certainly not 'standard' at the CGA's introduction either way.

@Rincewind42: others have already answered your original question, but nice work on the CRT shader!
Although since you were saying 'authentic', I should point out that the phosphor dot pitch appears a lot finer than that of a typical CGA/EGA CRT. 😉

[ WEB ] - [ BLOG ] - [ TUBE ] - [ CODE ]

Reply 15 of 38, by Rincewind42

VileR wrote on 2023-06-27, 05:40:

@Rincewind42: others have already answered your original question, but nice work on the CRT shader!
Although since you were saying 'authentic', I should point out that the phosphor dot pitch appears a lot finer than the typical CGA/EGA CRT. 😉

Thanks, and you're right. I've tweaked it further; how about these?

Unfortunately, I only have a couple of late-90s/early-2000s VGA monitors, but no EGA, so I need to rely on photo/video references alone, which is far from optimal. Also, even at 4K, trying to replicate the triad dot pattern is a fool's errand; it's just not going to happen, I think, when you want something that works well with different viewport sizes.

Ultimately, I just want something that I can slap on when emulating "true EGA" (with scanlines) and never think about it. Width-doubling 320-pixel content seems to be the ticket; now the shader works well with all EGA resolutions, as shown below (well, well enough for me at least).

BRsVjqF.png

YkuC5Ax.png

LluqZEv.png

9VBlLyY.png

DOS: Soyo SY-5TF, MMX 200, 128MB, S3 Virge DX, ESS 1868F, AWE32, QWave, S2, McFly, SC-55, MU80, MP32L
Win98: Gigabyte K8VM800M, Athlon64 3200+, 512MB, Matrox G400, SB Live
WinXP: Gigabyte P31-DS3L, C2D 2.33 GHz, 2GB, GT 430, Audigy 4

Reply 16 of 38, by Jo22

VileR wrote on 2023-06-27, 05:40:

There's nothing "fake" about CGA's progressive scan. It's non-interlaced, hence progressive. You may point out that it's half the resolution you'd get from 'standard' NTSC progressive scan, but then again it's also double the frame rate. And progressive-scan NTSC was certainly not 'standard' at the CGA's introduction either way.

I disagree. It's fake, simply because it doesn't comply with the NTSC standard.
Non-interlaced operation isn't specified/supported, so it's a form of hack.

Because it's not as if the monitor suddenly stops interlacing the signal. It still does.
This pseudo "progressive" scan is just a butchered signal, really.

It's as if someone rips the wings off an airplane and says "look, it's a car!" 🙄😂

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 17 of 38, by Scali

Jo22 wrote on 2023-06-27, 12:47:

Because, it's not as if the monitor suddenly stops to interlace the signal. It still does.

No, technically it doesn't.
With interlacing, you have two fields per frame.
The odd field starts at scanline 1.
The even field starts at scanline 2. In order to actually position it below scanline 1 on the screen, physically, both fields have an extra 'half' scanline.
So NTSC is 525 lines in total, which is an odd number. A single field is 262.5 lines.
The extra half scanline is not actually displayed, but it results in the odd field being shifted down vertically, because of how the horizontal and vertical sync pulses are timed. If the vsync starts in the first half of a scanline, the field is assumed to be odd; if it starts in the second half, it is assumed to be even.

That is what effectively makes it interlaced.

With a progressive image, you don't have even and odd fields. You only have a single type of field, which does not include the additional half scanline, so it is 262 lines exactly.
As a result, there is no vertical offset between even and odd fields on screen, and there is no actual interlacing (the vsync is always in the first half of a scanline, so all fields are considered odd). Every field is drawn in the exact same position on screen, so technically interlacing only happens when the image source specifically constructs a signal with even and odd fields and the correct timing required to display them in an interlaced manner. In other words, it's the source that interlaces, not your monitor. (Ironically enough, the 6845 is capable of generating interlaced timings, but the vsync logic is not implemented correctly on CGA to do so, resulting in a progressive signal even when the 6845 is set to interlacing.)

The only thing that is non-standard here is that the timing is slightly off compared to interlaced NTSC, because you transmit half a scanline less per field.
This means that effectively the framerate is slightly higher than on an interlaced screen, and one could argue that this is 'outside spec', although in practice the difference is marginal enough that it never caused any issues.
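The difference is easy to put a number on, using the nominal NTSC line rate:

#include <stdio.h>

int main(void)
{
    /* An interlaced field is 262.5 lines, a progressive (non-interlaced)
     * field is 262, so the progressive "frame" rate ends up slightly above
     * the interlaced field rate. */
    const double line_rate = 15734.26;                               /* Hz */
    printf("interlaced field rate:  %.2f Hz\n", line_rate / 262.5);  /* ~59.94 */
    printf("progressive field rate: %.2f Hz\n", line_rate / 262.0);  /* ~60.05 */
    return 0;
}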

So much for NTSC timing (when you use RGBI output on CGA/EGA).

In the case of composite output, there can be side-effects, because you are sending luma and chroma signals together, and the decoder in your TV/monitor has to separate them. This will lead to artifacts. Your TV/monitor may use something like a comb filter, assuming interlaced images, to reduce artifacts and increase quality.
Apparently, certain progressive-scan computers, such as the NES, actually send longer scanlines every other frame to get a better composite image, because the artifacts will move around instead of sitting statically on the screen. Like a very early form of temporal anti-aliasing.
This blog goes into more detail: https://sagargv.blogspot.com/2014/07/ntsc-dem … -demo-with.html

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 18 of 38, by Tiido


Yeah, it is not the monitor that interlaces; it is only a side effect of the half line before/at/after vsync that the device feeding the monitor is generating.
On the analog level, the exact split point of the line determines how far down the next field will shift, and ideally it is kept exactly in the middle of the line for an even spacing of the fields. In theory you can even interlace 3 fields into one frame, by splitting the line not at 1/2 but at 1/3 and 2/3 on each of the fields (and it will be very flickery 🤣).
As far as the NTSC standard goes, the line rate is identical in the CGA and EGA low-res modes, and that is the most critical part, along with of course the color subcarrier, which is what makes things NTSC and not PAL or SECAM or something else to begin with.

The simulated images look like those of a pretty good monitor. I am unsure whether the original EGA monitor looked that good, but I have never actually seen one, and photos on the internet are too poor to make any judgement 🤣

T-04YBSC, a new YMF71x based sound card & Official VOGONS thread about it
Newly made 4MB 60ns 30pin SIMMs ~
mida sa loed ? nagunii aru ei saa 😜

Reply 19 of 38, by Scali

Tiido wrote on 2023-06-27, 14:16:

As far as NTSC standard goes, the line rate is identical on CGA and EGA low res mode

Yes, EGA is a bit of a weird standard.
The 200-line modes are deliberately designed to be backward-compatible with CGA monitors. Which ironically means that you're stuck with the 16 fixed RGBI colours in 320x200 and 640x200, even on an actual EGA monitor which has 64 RGBrgb colours.
There's a DIP switch to indicate that you are using a CGA monitor, which disables the 350-line modes altogether (otherwise it boots up in a 350-line text mode).

EGA monitors are a bit of an early, hackish form of 'multiscan' or 'multisync'... The polarity of the sync signals is reversed for the 350-line modes, so the monitor can easily detect whether to switch to the CGA-compatible 200-line mode (which also makes it behave like a regular RGBI 16-colour monitor) or to the 350-line mode.
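The detection logic boils down to something like this (a sketch; which polarity selects which mode is an assumption on my part, the point is just that the sync polarity carries the mode information):

/* One bit of information in the sync signal selects both the scan rate and
 * the colour interpretation. */
enum scan_mode { SCAN_200_LINE_RGBI, SCAN_350_LINE_RGBrgb };

static enum scan_mode detect_scan_mode(int vsync_polarity_negative)
{
    return vsync_polarity_negative ? SCAN_350_LINE_RGBrgb
                                   : SCAN_200_LINE_RGBI;
}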

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/