VOGONS


First post, by LightStruk

User metadata
Rank Newbie

I've had laptops and desktops with composite video output over the years, but I have none of them anymore. I seem to remember text mode, as well as various DOS games, working fine on these video cards with TV out. Yet, weren't these running at 70 Hz, not 60 or 59.94? How did that even work?

Did the video card's TV-out output an interlaced 70 Hz signal and assume that an NTSC TV could sync to it? Did the video card buffer entire frames from the input with VSync off, and then output interlaced at 59.94 Hz with screen tearing during motion?

For that matter, what does the vintage CGA or Tandy 1000 graphics hardware do? They have composite video output. Do they drive a 70 Hz interlaced signal?

Reply 1 of 18, by leileilol

User metadata
Rank l33t++

I think my Voodoo3 dropped some frames with some mode 13h games when I used TV out with that card. However, there are some games where 70 Hz is retained (often EGA modes), and those end up scrambling the picture.

CGA (as in proper CGA hardware) should be 60 Hz, IIRC.


Reply 2 of 18, by Benedikt

User metadata
Rank Member

Both CGA and EGA are strictly 60 Hz only¹; 70 Hz text modes are a VGA thing.
The text mode they output to a composite screen typically uses 80x25 8x8 pixel character cells, which results in an effective resolution of 640x200 pixels, which in turn means that no interlacing is required.
However, my ATi EGA Wonder 800+ can output 640x350 pixel EGA modes to a CGA screen via interlacing. So if it had a composite output, a TV set would work, as well.

¹ Ignoring EGA with MDA screen.

Reply 3 of 18, by Jo22

User metadata
Rank l33t
Benedikt wrote on 2020-07-30, 15:10:

Both CGA and EGA are strictly 60 Hz only¹; 70 Hz text modes are a VGA thing.
The text mode they output to a composite screen typically uses 80x25 8x8 pixel character cells, which results in an effective resolution of 640x200 pixels, which in turn means that no interlacing is required.
However, my ATi EGA Wonder 800+ can output 640x350 pixel EGA modes to a CGA screen via interlacing. So if it had a composite output, a TV set would work, as well.

¹ Ignoring EGA with MDA screen.

Impressive. This makes me wish all EGA-compatible third-party boards were as well-thought-out as that particular model.
I mean, the CGA monitor is essentially nothing more than a glorified US TV set without a tuner, but with a pass-through to the RGB guns.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 4 of 18, by LightStruk

User metadata
Rank Newbie
Jo22 wrote on 2020-07-30, 19:00:

I mean, the CGA monitor is essentially nothing more than a glorified US TV set without a tuner, but with a pass-through to the RGB guns.

Uh, is it though?

CGA, EGA, and Tandy graphics use digital TTL signals for R, G, B, and intensity, so 4 bits of information, leading to a maximum of 16 possible colors, nothing at all like the analog composite, separate luma and chroma, or component YPrPb signals that an NTSC TV accepts. The same picture out of a CGA card sent to an RGBI monitor and to an NTSC TV looks drastically different, and not in a "the TV tint knob needs adjusting" way. The RGBI signal has a distinct "brown" requiring special circuitry in the monitor, while the composite signal gets a dark yellow.
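
To make the RGBI side concrete, here's a rough Python sketch of the commonly documented mapping from the four TTL bits to gun levels, including the brown fix. The 0xAA/0x55 levels are the usual idealized values, not measurements of real hardware:

```python
# Illustrative sketch of how a 5153-style RGBI monitor derives gun levels
# from the four TTL bits; the 0xAA/0x55 values are the usual idealized
# levels, not measurements.
def rgbi_to_rgb(r, g, b, i):
    # Each color bit contributes ~2/3 amplitude; intensity adds ~1/3 to all guns
    level = lambda bit: (0xAA if bit else 0x00) + (0x55 if i else 0x00)
    rgb = (level(r), level(g), level(b))
    # The special case: dark yellow (r=1, g=1, b=0, i=0) is remapped to brown
    # by extra circuitry that halves the green gun
    if (r, g, b, i) == (1, 1, 0, 0):
        rgb = (0xAA, 0x55, 0x00)
    return rgb
```

Without that special case, dark yellow would come out as (0xAA, 0xAA, 0x00); the monitor's circuit pulls green down to give the familiar brown.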

Reply 5 of 18, by Jo22

User metadata
Rank l33t
LightStruk wrote on 2020-07-31, 01:53:
Jo22 wrote on 2020-07-30, 19:00:

I mean, the CGA monitor is essentially nothing more than a glorified US TV set without a tuner, but with a pass-through to the RGB guns.

Uh, is it though?

CGA, EGA, and Tandy graphics use digital TTL signals for R, G, B, and intensity, so 4 bits of information, leading to a maximum of 16 possible colors, nothing at all like the analog composite, separate luma and chroma, or component YPrPb signals that an NTSC TV accepts. The same picture out of a CGA card sent to an RGBI monitor and to an NTSC TV looks drastically different, and not in a "the TV tint knob needs adjusting" way. The RGBI signal has a distinct "brown" requiring special circuitry in the monitor, while the composite signal gets a dark yellow.

OK, my English is not so good, so I'll try to rephrase what I said. I meant that CGA circuitry is based on the television standard (just like analog video monitors are).
The timings, the "clock" pin CGA derives from the PC/ISA bus, etc. - it's all based on bog-standard TV specs.

The RGB timing is nothing special, either. Every NTSC TV uses RGB internally (as does PAL/SECAM) - at least in the case of real CRT TVs (they have three separate electron guns for RGB).
And "intensity" is always used inside the TV, too. There's just a little circuit inside the CGA monitor that interfaces accordingly with the "Intensity" pin.

Strictly speaking, that's also true for the Commodore 128 and other old home computers with TTL RGB.
Also, TTL RGB is pretty much the same as analog RGB. Simply put, TTL RGB is a subset of analog RGB.

The difference is that TTL is digital - as in, discrete, predefined. It's like a filter.
It uses known, absolute values - or states - (on/off, square wave) and goes rail-to-rail, so to speak (max to min).

Analog is steady and continuous, with varying intensity.
Values are smooth by comparison (sine).

That's why you can feed a TTL video signal into an analog monitor (SCART TV etc.),
albeit with lower quality as a result (unwanted noise can mix in).
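
A minimal sketch of that "subset" idea - the 0.7 V peak level is the usual analog RGB convention, assumed here for illustration:

```python
# Sketch of the "TTL RGB is a subset of analog RGB" point: a TTL color
# line only ever sits at two of the levels an analog line can take.
# 0.7 V peak video is the usual analog RGB convention (an assumption here).
def ttl_to_analog_volts(bit, v_peak=0.7):
    # TTL's two logic states map onto the extremes of the analog range
    return v_peak if bit else 0.0
```

Any monitor that accepts the full 0 to 0.7 V range can therefore also display the two-level TTL subset.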

Edit: Some typos fixed.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 6 of 18, by LightStruk

User metadata
Rank Newbie
Jo22 wrote on 2020-07-31, 10:37:

Also, TTL RGB is pretty much the same as analog RGB. Simply put, TTL RGB is a subset of analog RGB.

I wonder if the RGB primaries are the same between a CGA / RGBI monitor and an NTSC TV. (For example, NTSC, PAL, and ATSC all have different RGB primaries - their colorspaces are not identical.) Another way of putting it would be - are all 16 colors of CGA / EGA in the NTSC gamut? Are the red, green, and blue phosphors in an RGBI monitor the same as the phosphors in an NTSC TV? Are the differences between the colors seen on an NTSC TV vs on an RGBI monitor coming from a CGA card caused by an imperfect composite conversion in the CGA card, or are the differences inherent to the monitors themselves?

Reply 7 of 18, by root42

User metadata
Rank Oldbie
Benedikt wrote on 2020-07-30, 15:10:

However, my ATi EGA Wonder 800+ can output 640x350 pixel EGA modes to a CGA screen via interlacing. So if it had a composite output, a TV set would work, as well.

I recently acquired an 800+ as well, but haven't tested it yet. Does it support the enhanced palette as well, then? Couldn't be, right? Because CGA monitors are RGBI only?

YouTube and Bonus
80386DX@25 MHz, 8 MiB RAM, Tseng ET4000 1 MiB, SnarkBarker & BlasterBoard, PC MIDI Card + SC55 + MT32, XT CF Lite, OSSC

Reply 8 of 18, by VileR

User metadata
Rank Oldbie
LightStruk wrote on 2020-07-31, 12:17:

I wonder if the RGB primaries are the same between a CGA / RGBI monitor and an NTSC TV. (For example, NTSC, PAL, and ATSC all have different RGB primaries - their colorspaces are not identical.) Another way of putting it would be - are all 16 colors of CGA / EGA in the NTSC gamut? Are the red, green, and blue phosphors in an RGBI monitor the same as the phosphors in an NTSC TV?

The color spaces are not identical - NTSC uses YIQ, not RGB. Of course when decoded for output, it's converted into voltage levels for R, G and B guns, but the gamut is different from what you'd get by directly feeding RGB component video. (paradoxically some NTSC colors can also be outside the nominal RGB space, since NTSC black is not quite zero voltage, and some YIQ combinations yield levels "blacker than black" for one or more of the electron guns).
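
A quick numeric illustration of the "blacker than black" point, using the standard NTSC YIQ-to-RGB decode matrix (the sample values are arbitrary, picked only to show the effect):

```python
# Standard NTSC YIQ -> RGB decode matrix. Some legal-looking YIQ triples
# decode to a negative gun level, i.e. "blacker than black".
def yiq_to_rgb(y, i, q):
    r = y + 0.956 * i + 0.621 * q
    g = y - 0.272 * i - 0.647 * q
    b = y - 1.106 * i + 1.703 * q
    return (r, g, b)

# A saturated chroma on a dark luma drives the blue gun below zero:
r, g, b = yiq_to_rgb(0.1, 0.4, -0.3)  # b comes out negative
```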

This doesn't have much to do with the phosphors, though - AFAIK the actual R, G and B primaries in the *monitors* are the same. There was some discussion about this in one of the old threads about composite CGA emulation for DOSBox. IIRC, there *is* a certain YIQ-to-RGB conversion standard that mandates correction for the primaries, but it turned out that it only applies to how stations encode signals for over-the-air broadcast. Or something like that - but nothing inherent to monitor design, anyway.
(Most professional-grade video monitors can decode YIQ/NTSC as well as both analog and digital RGB, all on the same phosphor-coated screen. If some monitors/TVs use a different set of RGB phosphors, then I'd imagine they have to correct for it internally.)

*Gamma* might be different but that's more down to monitor/TV design rather than the signal encoding.

Are the differences between the colors seen on an NTSC TV vs on an RGBI monitor coming from a CGA card caused by an imperfect composite conversion in the CGA card, or are the differences inherent to the monitors themselves?

More like the former. CGA generates digital color by combining R, G, B and I, but composite colors are generated by taking a 3.58 MHz waveform at a 50% duty cycle, and rotating it through a set of 6 possible phases - blue, green, cyan, red, magenta, yellow. Phase determines the I and Q components, then greys and intensity levels are created by adding Y (luminance).
(That goes for "direct" colors, aka the composite equivalents of the 16-color RGBI palette. "Artifact" colors are generated simply by switching between these basic waveforms at 2x or 4x the frequency, so when the higher-frequency components are removed, you have a "new" periodic 3.58 MHz signal.)
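
As a rough illustration of the phase scheme: the specific phase-to-hue assignment below is a guess for demonstration, not taken from the CGA schematic, but it shows how six phases of one square wave map to six hues:

```python
import math

# Rough model of CGA's composite hue generation: one 3.579545 MHz,
# 50% duty-cycle square wave, shifted into one of six phases.
# The phase-to-hue assignment here is illustrative, not from the schematic.
SUBCARRIER_HZ = 3_579_545
PHASES = {"blue": 0, "magenta": 1, "red": 2, "yellow": 3, "green": 4, "cyan": 5}

def chroma_wave(color, t):
    """Chroma square wave (0/1) for a given hue at time t, in seconds."""
    cycle = (t * SUBCARRIER_HZ + PHASES[color] / 6.0) % 1.0
    return 1.0 if cycle < 0.5 else 0.0

def decoded_phase(color, n=360):
    """Recover the hue phase by correlating one subcarrier period against
    sine and cosine - essentially what a TV's chroma demodulator does."""
    s = c = 0.0
    for k in range(n):
        t = k / (n * SUBCARRIER_HZ)
        v = chroma_wave(color, t) - 0.5          # remove DC
        s += v * math.sin(2 * math.pi * k / n)
        c += v * math.cos(2 * math.pi * k / n)
    return math.atan2(s, c)
```

Adjacent hues decode 60° (2π/6) apart, which is exactly what gives six distinct "direct" colors per luma level.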

I go into more details about this here: https://int10h.org/blog/2015/04/cga-in-1024-c … olors_composite

That's a pretty crude method of generating NTSC color. I guess IBM did this to cut costs, since composite was considered the crap option - resolution was low, and most consumers would've seen it only on TVs through RF converters, which made things even worse.

Funny thing is, using artifact colors, you can actually generate a custom composite "palette" that's not far from the RGBI one - much closer than the "direct" composite CGA palette, anyway. That's also something we briefly exploited in 8088 MPH: https://int10h.org/blog/2015/08/8088-mph-fina … n_with_palettes
You can't quite get all the way there, but that's mostly because CGA just doesn't give you the full flexibility of manipulating the composite waveform... at any rate, it certainly isn't due to monitor design 😀

web  /   blog   /   tube

Reply 9 of 18, by VileR

User metadata
Rank Oldbie
root42 wrote on 2020-07-31, 12:56:

I recently acquired an 800+ as well, but haven't tested it yet. Does it support the enhanced palette as well, then? Couldn't be, right? Because CGA monitors are RGBI only?

Yeah, it can't. The monitor would need to understand 6-pin "RGBrgb" signals to get the extended EGA palette.
Although I wouldn't put it past ATI to take advantage of the interlacing to create extra fake "colors" (temporal dithering) as well. 😉

Last edited by VileR on 2020-07-31, 14:08. Edited 1 time in total.


Reply 10 of 18, by Benedikt

User metadata
Rank Member

(VileR was apparently a bit faster, so the text below might repeat a few things.)

LightStruk wrote on 2020-07-31, 12:17:
Jo22 wrote on 2020-07-31, 10:37:

Also, TTL RGB is pretty much the same as analog RGB. Simply put, TTL RGB is a subset of analog RGB.

I wonder if the RGB primaries are the same between a CGA / RGBI monitor and an NTSC TV. (For example, NTSC, PAL, and ATSC all have different RGB primaries - their colorspaces are not identical.) Another way of putting it would be - are all 16 colors of CGA / EGA in the NTSC gamut? Are the red, green, and blue phosphors in an RGBI monitor the same as the phosphors in an NTSC TV? Are the differences between the colors seen on an NTSC TV vs on an RGBI monitor coming from a CGA card caused by an imperfect composite conversion in the CGA card, or are the differences inherent to the monitors themselves?

The color spaces of the color TV standards are irrelevant in this case, because TV sets and RGBI screens were literally using the same type of picture tube and that was not a coincidence.
By the 1980s, color TV sets were a mass market product, whereas computers with their own type of color screen were a niche product. Using ordinary TV picture tubes meant low prices and good availability.
Besides, IBM wanted to get the PC onto the market quickly. (Strictly speaking, the IBM 5153 is a 1983 product though, i.e. primarily meant for the IBM 5160.)
The discrepancy between CGA RGBI colors and CGA composite colors is meaningless, because no CGA card ever had a proper NTSC encoder.
They would only approximate a somewhat matching color signal with a bunch of resistors.
The artifact colors are even less meaningful, because they should not even exist. They are a side effect of how NTSC works.

root42 wrote on 2020-07-31, 12:56:
Benedikt wrote on 2020-07-30, 15:10:

However, my ATi EGA Wonder 800+ can output 640x350 pixel EGA modes to a CGA screen via interlacing. So if it had a composite output, a TV set would work, as well.

I recently acquired an 800+ as well, but haven't tested it yet. Does it support the enhanced palette as well, then? Couldn't be, right? Because CGA monitors are RGBI only?

They found a way to support all 64 colors. While the documentation does not say how they did it, I strongly assume that it is primitive PWM.
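
If it really is PWM, the idea would be something like this sketch - pure speculation to illustrate the guess, not how the 800+ is documented to work:

```python
# Speculative sketch of the PWM guess: rapidly toggling a TTL line at a
# given duty cycle averages out (on the phosphor and in the eye) to an
# intermediate gun level, approximating EGA's four levels per gun.
def pwm_level(duty, period=3):
    """Average level of a TTL bit driven high for `duty` of `period` ticks."""
    pattern = [1 if t < duty else 0 for t in range(period)]
    return sum(pattern) / period

# EGA's 2-bit levels per gun (0, 1/3, 2/3, 1) via duty cycles 0..3:
levels = [pwm_level(d) for d in range(4)]
```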

Reply 11 of 18, by maxtherabbit

User metadata
Rank Oldbie
Jo22 wrote on 2020-07-31, 10:37:

And "intensity" is always used inside the TV, too. There's just a little circuit inside the CGA monitor that interfaces accordingly with the "Intensity" pin.

that's not correct, the concept of "intensity" as a distinct signal is unique to digital TTL RGB implementations

analog RGB monitors simply gain up the analog R, G, and B signal inputs to drive the guns directly

Reply 12 of 18, by Benedikt

User metadata
Rank Member
maxtherabbit wrote on 2020-07-31, 14:16:
Jo22 wrote on 2020-07-31, 10:37:

And "intensity" is always used inside the TV, too. There's just a little circuit inside the CGA monitor that interfaces accordingly with the "Intensity" pin.

that's not correct, the concept of "intensity" as a distinct signal is unique to digital TTL RGB implementations

analog RGB monitors simply gain up the analog R, G, and B signal inputs to drive the guns directly

I'd have to disagree. Practically every analog color monitor has an intensity pot (not always externally accessible) that controls the gain of the amplifiers (i.e. a brightness knob).
You shouldn't need much more than a resistor to wire that up to the RGBI intensity signal.

Reply 13 of 18, by VileR

User metadata
Rank Oldbie
Benedikt wrote on 2020-07-31, 14:07:

The artifact colors are even less meaningful, because they should not even exist. They are a side effect of how NTSC works.

A side effect indeed, but as is widely known, on the Apple II it was the only method of generating color - so Apple accordingly advertised it as meaningful. 😉
IBM, in contrast, flatly ignored this method in all documentation - even though they must have been aware of how Wozniak accomplished it, and indeed published software titles that used the same method on CGA. The lack of acknowledgement/support by IBM was probably the major factor behind the erroneous perception that "CGA is even worse than Apple II video, lol".

They found a way to support all 64 colors. While the documentation does not say how they did it, I strongly assume that it is primitive PWM.

PWM might do the trick, too. I wonder if the crystal frequencies on the card may be a clue.
If anyone with an EGA Wonder also has a 'scope and some spare time, this might be interesting to find out...


Reply 14 of 18, by Benedikt

User metadata
Rank Member
VileR wrote on 2020-07-31, 14:24:

They found a way to support all 64 colors. While the documentation does not say how they did it, I strongly assume that it is primitive PWM.

PWM might do the trick, too. I wonder if the crystal frequencies on the card may be a clue.
If anyone with an EGA Wonder also has a 'scope and some spare time, this might be interesting to find out...

There are three big crystal oscillators on the ATi EGA Wonder 800+: 36.0000 MHz, 50.0000 MHz and 56.6440 MHz.
I should mention that it is technically a backwards-compatible VGA card without a RAMDAC and without a VGA port.
It does support 640x480 and 800x600 modes as well, but not on a CGA screen.

Reply 15 of 18, by maxtherabbit

User metadata
Oldbie
Benedikt wrote on 2020-07-31, 14:20:
maxtherabbit wrote on 2020-07-31, 14:16:
Jo22 wrote on 2020-07-31, 10:37:

And "intensity" is always used inside the TV, too. There's just a little circuit inside the CGA monitor that interfaces accordingly with the "Intensity" pin.

that's not correct, the concept of "intensity" as a distinct signal is unique to digital TTL RGB implementations

analog RGB monitors simply gain up the analog R, G, and B signal inputs to drive the guns directly

I'd have to disagree. Practically every analog color monitor has an intensity pot (not always externally accessible) that controls the gain of the amplifiers (i.e. a brightness knob).
You shouldn't need much more than a resistor to wire that up to the RGBI intensity signal.

OK, fair point, but the monitors were still never intended to treat intensity as a variable input signal - it's a set-it-and-forget-it internal adjustment.

Reply 16 of 18, by LightStruk

User metadata
Rank Newbie

So it's clear that folks here know a lot about colors, which is awesome and informative. That said, does anyone have a VGA card (or better) with composite or S-Video output that they can connect to a TV to see what happens in mode 13h (320x200x8bpp@70 Hz) and 80x25 text mode? Is it still interlaced? During fast motion, does tearing become visible?

Reply 17 of 18, by Tiido

User metadata
Rank Oldbie

All the video cards I have with composhit/S-video output always show 576i (50 Hz) or 480i (60 Hz), typically with tearing rather than skipped frames. The result is a blurry, flickery mess that doesn't look all that nice for the most part. I don't have access to a working TV right now to test the various cards I have here and report exactly what they do. Motion is always full of combing artifacts due to the interlaced output; I haven't seen a single card that showed 288p or 240p in the low-resolution modes - all always show 576i or 480i.

70 Hz is not normally possible on the TV outs (one could probably reprogram the relevant hardware on some cards), and the majority of TVs will not support such a signal even if the line rate is correct, since 70 Hz falls outside the range of the frame PLLs in most designs I have looked at (the majority center at 55 Hz with less than 10 Hz of lock-in range, which is enough for the 50 and 60 Hz that is ever expected to be encountered).
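
Putting numbers on that last point - the ±5 Hz figure below is just one reading of "less than 10 Hz of lock-in range", assumed for illustration:

```python
# Sanity check of the frame-PLL argument: a PLL centered at 55 Hz with
# roughly +/-5 Hz of capture range locks to the 50 and 60 Hz broadcast
# rates but not to VGA's 70 Hz. The +/-5 Hz figure is an assumed reading
# of "less than 10 Hz of lock-in range".
CENTER_HZ = 55.0
HALF_RANGE_HZ = 5.0

def pll_locks(vfreq_hz):
    return abs(vfreq_hz - CENTER_HZ) <= HALF_RANGE_HZ
```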

T-04YBSC, a new YMF71x based sound card & Official VOGONS thread about it
Newly made 4MB 60ns 30pin SIMMs ~
what are you reading? you won't understand it anyway 😜