VOGONS


First post, by maximus

Rank Member

Back in the day, video cards connected to CRT monitors using analog VGA connectors. Digital-to-analog conversion was the video card's job, and the monitor just displayed the analog signal it received. DAC quality varied greatly between video cards, with many cards producing dark or blurry signals.

Why was the DAC always on the video card? Why didn't cards send digital signals to CRT monitors and let the monitors do the digital-to-analog conversion? This would have made video cards simpler and cheaper and allowed high-end monitors to guarantee a clear picture. A digital connector would also have made interference less of a problem. Why didn't anyone try this?

PCGames9505

Reply 1 of 14, by Cobra42898

Rank Member

In a word, it's backwards compatibility. Video cards that didn't provide an analog signal would be useless without a matching monitor. Digital monitors would also have been much more expensive to make than their analog brethren, so nobody would want to buy one unless they bought a video card meant for it at the same time. Plus, nobody with an existing PC would buy such a card, since it wouldn't be compatible with existing monitors.

"We've always done it this way."

Searching for Epson Actiontower 3000 486 PC.

Reply 2 of 14, by root42

Rank l33t

One problem might have been transmission of a high-bandwidth digital signal over 1.5-2 m of video cable. The bandwidth of a VGA signal is significantly higher than that of an EGA or CGA signal. Also, it would have meant that the CRT would have needed some kind of framebuffer, or at the very least a linebuffer. This would have increased the unit cost even more.
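
To put rough numbers on that bandwidth jump, here is a quick back-of-the-envelope sketch in C. The timing totals (blanking included) are nominal figures and only meant for illustration:

    #include <stdio.h>

    struct mode {
        const char *name;
        double total_px_per_line;   /* active + blanking */
        double total_lines;         /* active + blanking */
        double refresh_hz;
    };

    int main(void)
    {
        /* Nominal timing totals -- approximate, for illustration only */
        struct mode modes[] = {
            { "CGA 640x200",    912.0, 262.0, 59.9 },
            { "VGA 640x480",    800.0, 525.0, 60.0 },
            { "SVGA 1024x768", 1344.0, 806.0, 60.0 },
        };
        int i;

        for (i = 0; i < 3; i++) {
            double dot_clock = modes[i].total_px_per_line *
                               modes[i].total_lines * modes[i].refresh_hz;
            printf("%-14s ~%4.1f MHz dot clock\n",
                   modes[i].name, dot_clock / 1e6);
        }
        return 0;
    }

That works out to roughly 14 MHz for CGA versus 25 MHz for plain VGA and about 65 MHz for 1024x768, which is the kind of rate a digital monitor link would have had to carry per colour channel.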

YouTube and Bonus
80486DX@33 MHz, 16 MiB RAM, Tseng ET4000 1 MiB, SnarkBarker & GUSar Lite, PC MIDI Card+X2+SC55+MT32, OSSC

Reply 3 of 14, by Scali

Rank l33t

To add to root42's arguments, back then the signals weren't standardized. There were various different refresh rates, and various different standards for colours.
Because VGA had an analog interface, it was easy to make multisync monitors, and monitors that could easily support millions of colours.
Had the RAMDAC been integrated into the monitor instead of the videocard, then you would have been limited to the capabilities of your monitor. For example, standard VGA only did three resolutions: 640x400, 640x480 and 720x400. And it only had an 18-bit RGB palette.
Later videocards also supported 1024x768 interlaced, and full 24-bit RGB palettes, even on standard VGA monitors.
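
For reference, that 18-bit limit comes straight from the DAC's registers: each channel is only 6 bits wide. A minimal sketch of programming a palette entry directly, assuming a DOS compiler that provides outp() in <conio.h> (Borland's equivalent is outportb() in <dos.h>):

    #include <conio.h>

    #define DAC_WRITE_INDEX 0x3C8
    #define DAC_DATA        0x3C9

    /* Each channel is only 6 bits wide (0..63) -- that is where the
     * 18-bit (3 x 6) palette limit comes from. */
    void set_palette_entry(unsigned char index,
                           unsigned char r, unsigned char g, unsigned char b)
    {
        outp(DAC_WRITE_INDEX, index);   /* select palette entry */
        outp(DAC_DATA, r & 0x3F);       /* red, 6 bits  */
        outp(DAC_DATA, g & 0x3F);       /* green, 6 bits */
        outp(DAC_DATA, b & 0x3F);       /* blue, 6 bits  */
    }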

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 4 of 14, by SirNickity

Rank Oldbie

Yeah, if you look at it, every bit of history was a minor innovation over the previous tech -- all the way from black and white broadcast TV through VGA. Early computer monitors were just TVs. Then TVs with special inputs. Then dedicated monitors (but still using established connectivity interfaces for compatibility.)

DVI was a pretty big departure from the status quo, and even THAT still (optionally) carried analog VGA on the side pins just in case.

Reply 5 of 14, by Jo22

Rank l33t++
root42 wrote:

Also, it would have meant that the CRT would have needed some kind of framebuffer, or at the very least a linebuffer.
This would have increased the unit cost even more.

Interestingly, the SECAM television system relied on some sort of framebuffer technique, too.
From what I remember, the Atari 2600 console suffered the most from it; only 8 colours remained because of it.
Nintendo circumvented this issue on the NES by using a PAL-to-RGB converter chip and including a EURO SCART cable.

SirNickity wrote:

Yeah, if you look at it, every bit of history was a minor innovation over the previous tech -- all the way from black and white broadcast TV through VGA. Early computer monitors were just TVs. Then TVs with special inputs.
Then dedicated monitors (but still using established connectivity interfaces for compatibility.)

Yup. 😁 In the early days there had been other experimental systems, like the Nipkow disk, but thankfully the world settled on a common standard,
like it did with AM and FM radio (more or less). At least for B/W TV, which all TV sets are backwards compatible with
(there had been a few different line standards, though).

Speaking of black and white, old camera tubes were based on the monochrome part of Composite Video (Luma).
So in one way or another, this means that B/W systems can natively handle VBS or Composite internally.
A TV set without a VHF/UHF modulator/demodulator is basically a classic video monitor (like a Commodore 170x, but of lower quality).
Depending on how we look at it, a plain old video monitor is/was the basis of analogue television. 😉
An RGB camera is basically a three-channel B/W camera (three tubes) with colour filters.

SirNickity wrote:

DVI was a pretty big departure from the status quo, and even THAT still (optionally) carried analog VGA on the side pins just in case.

In practice it was, for sure. In old laptops with LCDs, there was another digital interface used for connecting the VGA circuitry to the LCD panel. Not sure what it was called, though.

Another thing that comes to mind: digital is not the same as binary. I'm speaking under correction, but I believe
"digital" comes from the Latin "digitus", finger. So digital can be understood as "fingered", in the sense of the teeth of a comb.
So any signal with a defined set of discrete values can be "digital", even though it is not binary per se. 😉

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 7 of 14, by Scali

Rank l33t
keenmaster486 wrote:

CGA and EGA were digital out... weren't they?

Yes and no.
That is, they had binary signals for the colour (RGB and intensity signals).
But it was still a very 'analog' system in that the sync signals could be connected directly to the CRT circuitry, and the colour signals were also trivial to convert to CRT signals.
Basically VGA is exactly the same concept, except the intensity is 'encoded' inside the RGB signals because they are analog rather than just having a 0/1 (off/on) value.

DVI and HDMI are quite different: they are more like a network interface between devices (something like a bunch of RS-232 links, more or less). They send packets of binary data, which are then decoded on the receiving side and stored inside a frame buffer for further processing.
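
To illustrate the "intensity encoded inside the analog RGB" idea, here is a rough sketch of how a TTL RGBI colour could be expressed as three analog levels instead. The 2/3 + 1/3 weighting and the brown special case follow the classic IBM 5153 treatment; the exact voltages are only an approximation:

    #include <stdio.h>

    /* Convert a 4-bit IRGB colour (I=bit 3, R=bit 2, G=bit 1, B=bit 0)
     * into three analog levels: the colour bit contributes 2/3 of full
     * scale, the intensity bit another 1/3. Colour 6 (dark yellow) gets
     * its green halved, the way the IBM 5153 turns it into brown. */
    static void rgbi_to_analog(unsigned rgbi, double out[3])
    {
        int i = (rgbi >> 3) & 1;
        int r = (rgbi >> 2) & 1;
        int g = (rgbi >> 1) & 1;
        int b =  rgbi       & 1;
        const double full = 0.7;             /* nominal 0.7 V video level */

        out[0] = full * (r * 2.0 / 3.0 + i / 3.0);
        out[1] = full * (g * 2.0 / 3.0 + i / 3.0);
        out[2] = full * (b * 2.0 / 3.0 + i / 3.0);

        if (rgbi == 6)                       /* dark yellow -> brown */
            out[1] = full * (1.0 / 3.0);
    }

    int main(void)
    {
        double v[3];
        unsigned c;
        for (c = 0; c < 16; c++) {
            rgbi_to_analog(c, v);
            printf("colour %2u -> R=%.2fV G=%.2fV B=%.2fV\n",
                   c, v[0], v[1], v[2]);
        }
        return 0;
    }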

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 8 of 14, by keenmaster486

Rank l33t

Very interesting!

You know, it always amazes me how an LCD monitor can display such a sharp signal from a VGA input. If you think about it, there are two conversions happening here: the video card converts digital to analog, and the LCD converts analog back to digital... then it becomes obvious why monitors switched to DVI and HDMI. But still, the picture is surprisingly good on an LCD considering what the signal goes through to get there.
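
Just to illustrate that double conversion, here is a toy round-trip simulation in C. The 0.7 V full scale is the nominal video level; the noise amplitudes are arbitrary assumptions:

    #include <stdio.h>
    #include <stdlib.h>

    #define FULL_SCALE 0.7            /* nominal 0.7 V peak video level */

    /* 8-bit code -> analog level (card's DAC) -> cable noise ->
     * re-quantised 8-bit code (LCD's ADC) */
    static int round_trip(int code, double noise_v)
    {
        double analog = code / 255.0 * FULL_SCALE;
        double jitter = ((rand() / (double)RAND_MAX) - 0.5) * 2.0 * noise_v;
        int out = (int)((analog + jitter) / FULL_SCALE * 255.0 + 0.5);
        if (out < 0)   out = 0;
        if (out > 255) out = 255;
        return out;
    }

    int main(void)
    {
        double noise_mv[] = { 0.5, 2.0, 5.0 };   /* made-up noise levels */
        int n, code, changed;

        for (n = 0; n < 3; n++) {
            changed = 0;
            for (code = 0; code < 256; code++)
                if (round_trip(code, noise_mv[n] / 1000.0) != code)
                    changed++;
            printf("+/- %.1f mV noise: %3d of 256 codes land on a different value\n",
                   noise_mv[n], changed);
        }
        return 0;
    }

With quiet electronics the codes survive the trip untouched; crank the noise up and values start landing one step off, which is where banding and shimmer come from.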

World's foremost 486 enjoyer.

Reply 9 of 14, by Scali

Rank l33t
keenmaster486 wrote:

But still, the picture is surprisingly good on an LCD considering what the signal goes through to get there.

Yea, VGA can be quite good, when both the RAMDAC and the ADC in the LCD are doing a good job.
But it is also more susceptible to poor signal quality than a CRT is.
On many of my old VGA cards, I get blurry images and banding on LCD screens. But stick a Matrox in there, and you can't really tell the difference from DVI.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 10 of 14, by SirNickity

Rank Oldbie
keenmaster486 wrote:

CGA and EGA were digital out... weren't they?

This is actually the kind of confusion that stems from how we compartmentalize analog and digital signals. You have to understand, analog and digital are concepts, not absolutes. You can send a digital signal through an analog amplifier to regenerate its voltage levels. You can use binary switching (PWM) and low-pass filtering to generate an analog waveform. A digital signal is always analog; it's just interpreted as a two-state signal. Which is why it's a problem when the signal quality deteriorates to the point where those thresholds aren't clear.

That is to say, CGA and EGA colors are generated by simple color mixing. Red is on, green is on, you have yellow. Turn green off and you have red. Turn blue on and you get purple. There's not much granularity, so the signal can be generated by discrete logic. Nonetheless, the analog monitor interprets this as an analog signal. The "on" of red isn't an absolute; it's, e.g., 0.7 V on the line driving the red electron beam.

When VGA and then SVGA introduced wider color palettes, they just introduced more steps to those outputs. So now red isn't just off/on, but 0-63 on the VGA DAC (0-255 on later 8-bit DACs); and that can be mixed with a green signal and a blue one as well. Now you've got 262,144 (or 16M) distinct color combinations instead of 16. The analog monitor could go further still, though the electronics' noise floor, the phosphors, and our eyes limit how much extra resolution could meaningfully be generated or perceived. In reality, the signal is always wavering around those discrete values due to noise, conversion errors, and fluctuations in the supply voltage.

The world is fuzzy. 😀
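
A small numeric sketch of the PWM-plus-low-pass idea mentioned above; the supply voltage, duty cycle, RC constant and PWM frequency are all made-up example values:

    #include <stdio.h>

    int main(void)
    {
        const double vcc   = 5.0;      /* supply rail */
        const double duty  = 0.30;     /* 30% duty cycle -> expect ~1.5 V */
        const double rc    = 1e-3;     /* RC time constant: 1 ms (assumed) */
        const double f_pwm = 20e3;     /* PWM frequency: 20 kHz (assumed)  */
        const double dt    = 1e-7;     /* simulation step: 0.1 us          */

        double t, v_out = 0.0, period = 1.0 / f_pwm;

        for (t = 0.0; t < 0.05; t += dt) {                   /* 50 ms of settling */
            double phase = t - period * (long)(t / period);  /* position in cycle */
            double v_in  = (phase < duty * period) ? vcc : 0.0;
            v_out += (v_in - v_out) * (dt / rc);             /* first-order RC    */
        }
        printf("filtered output after 50 ms: %.3f V (duty * Vcc = %.3f V)\n",
               v_out, duty * vcc);
        return 0;
    }

The output is a two-state signal on the wire, yet what comes out of the filter is a steady analog level proportional to the duty cycle, give or take a little ripple.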

Reply 11 of 14, by Jo22

Rank l33t++
SirNickity wrote:

The world is fuzzy. 😀

You're speaking wise words of truth, my friend. 😀

I recall there are endless debates about whether or not telegraphy using Morse code (aka CW, continuous wave) is a digital mode.
Some people argue that it is, because it uses defined pulse lengths for dots and dashes (a dash is so-and-so long in comparison to a dot),
while others say it is not, because the tempo of transmission is not standardized and may vary depending on each operator (the human factor).

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 12 of 14, by SirNickity

Rank Oldbie

There are MANY digital protocols where the clock is encoded in the data. If that's the metric chosen to determine the nature of Morse code, I don't see how it would differ from RS-232 with baud rate detection. 😀
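
A little sketch of that "clock recovered from the data" idea applied to Morse; the key-down durations are invented, and the adaptive threshold is just one possible approach:

    #include <stdio.h>

    int main(void)
    {
        /* key-down times in milliseconds for "SOS", sent a bit unevenly */
        double marks[] = { 95, 110, 100, 310, 290, 330, 105, 90, 115 };
        int n = sizeof(marks) / sizeof(marks[0]);
        double unit = marks[0];      /* initial guess: first mark is a dot */
        int i;

        for (i = 0; i < n; i++) {
            int is_dash = marks[i] > 2.0 * unit;        /* dash ~ 3 units */
            printf("%s", is_dash ? "-" : ".");
            /* track the estimated dot length so a speeding-up or
             * slowing-down operator doesn't break the decoder */
            unit = 0.8 * unit + 0.2 * (is_dash ? marks[i] / 3.0 : marks[i]);
        }
        printf("\n");
        return 0;
    }

No fixed tempo anywhere: the decoder derives its timing reference from the marks themselves, much like auto-baud derives the bit clock from the incoming edges.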

Reply 13 of 14, by GordonFreeman

Rank Newbie

Like others mentioned, there was TTL RGB. Also, you’re talking about monitors, but there were some later CRT TVs that had HDMI inputs. These had a built-in DAC.

Somewhat related to the discussion of analog vs. digital: one of the first systems for storing PCM audio involved converting the data to analog video on a tape, then converting back to digital.

Reply 14 of 14, by retardware

Rank Oldbie
Jo22 wrote:
root42 wrote:

Also, it would have meant that the CRT would have needed some kind of framebuffer, or at the very least a linebuffer.
This would have increased the unit cost even more.

Interestingly, the SECAM television system relied on some sort of framebuffer technique, too.
From what I remember, the Atari 2600 console suffered the most from it; only 8 colours remained because of it.
Nintendo circumvented this issue on the NES by using a PAL-to-RGB converter chip and including a EURO SCART cable.

The storage device was an ultrasonic delay line, which stored 64 microseconds of phase information at a bandwidth of (iirc) about 800 kHz.
The horizontal and vertical sync signals were separated out by the sync separators, consuming 14 of the line's 64 microseconds.

PAL and SECAM differed in what they stored in the delay line. SECAM transmitted one color-difference signal in one line and the other in the next line, so all three colors could be recovered by subtraction. PAL transmitted the color information with normal phase in even lines and with inverted phase in odd lines, so phase errors evened out differentially. For this reason PAL was the analog color system with the least color distortion.
Both PAL and SECAM learned from NTSC ("Never The Same Color", as the joke goes), which did not use a memory device.
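
A quick numeric sketch of that phase-alternation trick; the chroma values and the 20-degree phase error are arbitrary example numbers:

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const double pi = 3.14159265358979;
        double U = 0.30, V = 0.20;              /* transmitted chroma (example) */
        double err = 20.0 * pi / 180.0;         /* 20 degree phase error        */
        double c = cos(err), s = sin(err);

        /* even line: (U, +V) rotated by the error */
        double u1 = U * c - V * s,  v1 = U * s + V * c;
        /* odd line:  (U, -V) rotated by the same error */
        double u2 = U * c + V * s,  v2 = U * s - V * c;

        /* receiver: re-invert V of the odd line (supplied by the delay
         * line), then average the two lines */
        double u_avg = (u1 + u2) / 2.0;
        double v_avg = (v1 - v2) / 2.0;

        printf("sent      U=%.3f V=%.3f\n", U, V);
        printf("recovered U=%.3f V=%.3f (both scaled by cos(err)=%.3f)\n",
               u_avg, v_avg, c);
        return 0;
    }

The recovered chroma comes out scaled by cos(err) but with the right U:V ratio, which is why a phase error on PAL shows up as slightly weaker color rather than a wrong hue.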

I mean, clock and data, data separators, sort of a serial protocol, isn't this digital? *confused*
And even on HDMI, the digital data that is transmitted... it looks very harmonic on the wire... *evenmoreconfused*