VOGONS


First post, by Wes1262

Rank: Member

I have an ATI 9100 card that apparently lacks a RAMDAC chip, and yet it offers both DVI and VGA ports.
I thought a RAMDAC was needed for converting the digital signal to VGA. What gives? I must be completely wrong then.
Thanks!

Reply 1 of 9, by Horun

Rank: l33t++

The R200 has a built-in RAMDAC. Don't forget that as far back as the S3 Trio32 (or before) there were built-in RAMDACs, AFAIK...

Hate posting a reply and then have to edit it because it made no sense 😁 First computer was an IBM 3270 workstation with CGA monitor. Stuff: https://archive.org/details/@horun

Reply 2 of 9, by Grzyb

Rank: l33t

It seems the first video chip with an integrated RAMDAC was the Acumos AVGA1 (1991), later rebranded as the Cirrus Logic GD5401.

We shall not abandon the land whence our root!

Reply 3 of 9, by Wes1262

Rank: Member

Why is it that for some people the dedicated RAMDAC chip is so important, then? I mean, they must have a reason. Thanks, folks!

Reply 4 of 9, by jakethompson1

Rank: l33t
Wes1262 wrote on 2024-07-22, 21:27:

Why is it that for some people the dedicated RAMDAC chip is so important, then? I mean, they must have a reason. Thanks, folks!

The lower-end cards got the integrated RAMDAC first, so the last cards with a discrete RAMDAC were higher-end ones that people are more passionate about. And the choice of RAMDAC was made by the board maker, not the SVGA chipset vendor, so it's an area where two apparently equivalent cards (going by the SVGA chip alone) can actually be quite different in quality.

Reply 5 of 9, by Wes1262

Rank: Member

but what is the difference in practice?

Reply 6 of 9, by mkarcher

Rank: l33t
Wes1262 wrote on 2024-07-22, 21:34:

but what is the difference in practice?

As the RAMDAC produces an analog signal, there is a wide spectrum between "good enough for some people" and "good enough to be indistinguishable from perfect for most people". The sharpness of the signal produced by a VGA card cannot be better than the sharpness produced by the RAMDAC. Higher-end RAMDACs produce clearer images, especially at high resolutions and refresh rates. The main specification for sharpness is the analog bandwidth, which ideally is somewhat higher than the pixel clock you want to use.

Bad design of a VGA card can degrade the image quality, so a good RAMDAC does not guarantee a good image - but you can't fix a mediocre RAMDAC by improving the signal routing on the VGA card.

Matrox Millennium cards were well known for their outstandingly good RAMDACs. On the other hand, as digital displays became common, the quality of the integrated RAMDACs in graphics chips went down quickly, especially in budget chips. This applies both to ATI (e.g. Radeon 9000-9250) and to nVidia (GeForce2 MX). They might still be good enough for 1280x1024 at 60 Hz, the highest resolution supported by consumer-class LCDs at that time, but on CRTs at higher rates (e.g. I had a 19" CRT at 1400x1050 at 75 Hz in those days), I noticed a clear difference between my Radeon 7200 (good) and an (admittedly cheap) Radeon 9250. The German PC magazine c't measured the signal quality at the time and showed that cards with those chips were generally producing sub-par analog output.

In the early Super-VGA days, the capability of a card to display high-color and true-color graphics modes depended solely on the RAMDAC. Cheap cards usually had a 256-color-only RAMDAC, while better cards might have a high-color DAC (Sierra was the most prominent high-color DAC manufacturer in those days, so you sometimes hear people say "Sierra DAC" when they mean "high-color DAC") or even a true-color DAC. This distinction mainly exists on low- to mid-level DRAM-based cards, whereas VRAM cards have (in their high-resolution modes) a data path from video memory directly to the DAC, often 32 or even 64 bits wide, to allow very high bandwidth without putting pressure on the VGA chip. Except for very early 8514/A-type cards, all VRAM-based cards with a 32-bit data path include high-color and true-color modes.
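
As a rough illustration of why those wide VRAM-to-DAC paths matter, here is a small Python sketch (the 1280x1024 @ 75 Hz true-colour mode and its ~135 MHz pixel clock are example figures of mine, not numbers from the thread):

```python
# Memory-to-DAC bandwidth needed for a true-colour mode, and the transfer
# rate the video memory must sustain for different data-path widths.
# The mode and pixel clock below are example assumptions.

pixel_clock_mhz = 135   # roughly 1280x1024 @ 75 Hz, including blanking
bytes_per_pixel = 3     # 24-bit true colour

bandwidth_mb_s = pixel_clock_mhz * bytes_per_pixel
for path_bits in (8, 32, 64):
    transfers_mhz = bandwidth_mb_s / (path_bits // 8)
    print(f"{path_bits:2d}-bit path to the DAC: ~{bandwidth_mb_s} MB/s "
          f"means ~{transfers_mhz:.0f} million transfers/s from video memory")
```

With a 32- or 64-bit path, the video memory only has to deliver a fraction of the pixel clock in transfers per second, which is why such cards could offer high-color and true-color modes without straining the VGA chip.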

Reply 7 of 9, by Wes1262

Rank: Member

Thanks, great explanation!

Reply 8 of 9, by dionb

Rank: l33t++
Wes1262 wrote on 2024-07-22, 21:34:

but what is the difference in practice?

That depends on the specific cards.

But to take an obvious example, compare the more consumer-oriented Matrox Mystique with the more workstation-oriented Matrox Millennium. Both date to 1996, which was right in the transition period between discrete and integrated RAMDAC.

Mystique: internal 170MHz RAMDAC
Millennium: external 220MHz RAMDAC

Clearly, the Millennium has the faster and probably better RAMDAC. Does it matter?

RAMDAC speed determines the maximum pixel clock and hence the maximum refresh rate at a given resolution. A rule of thumb is that the required RAMDAC bandwidth equals horizontal resolution x vertical resolution x refresh rate x 1.3. So 1024 x 768 x 85Hz x 1.3 (about the best you could expect to work on a mid-range 1996 monitor) = 87MHz - barely half of what the 'worse' Mystique could do. When would it matter? Even at 1280x1024 you could go up to an unlikely 100Hz. Only once you hit 1600x1200 would it make a difference: with 170MHz you top out at an eye-hurting 68Hz, whereas with 220MHz you could do 88Hz (i.e. 85Hz would work), which is pretty good.
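
Purely as an illustration, here is that rule of thumb as a tiny Python sketch (the 1.3 blanking-overhead factor, the modes, and the 170/220 MHz RAMDAC speeds are taken from the post; the function names are just made up for the example):

```python
# Rule of thumb from above: required RAMDAC bandwidth (pixel clock)
# ~= horizontal x vertical x refresh rate x 1.3 (blanking overhead).

def required_ramdac_mhz(h, v, refresh_hz, overhead=1.3):
    """Approximate pixel clock in MHz needed for a given mode."""
    return h * v * refresh_hz * overhead / 1e6

def max_refresh_hz(h, v, ramdac_mhz, overhead=1.3):
    """Approximate maximum refresh rate a RAMDAC can drive at a resolution."""
    return ramdac_mhz * 1e6 / (h * v * overhead)

print(f"1024x768 @ 85 Hz needs ~{required_ramdac_mhz(1024, 768, 85):.0f} MHz")
for name, mhz in [("Mystique, 170 MHz", 170), ("Millennium, 220 MHz", 220)]:
    print(f"{name}: 1280x1024 tops out near {max_refresh_hz(1280, 1024, mhz):.0f} Hz, "
          f"1600x1200 near {max_refresh_hz(1600, 1200, mhz):.0f} Hz")
```

Running it reproduces the ~87 MHz, ~100 Hz, and 68/88 Hz figures quoted above.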

So, reality check: in 1996 almost no one had a monitor that could do 1600x1200, let alone one that could do it at over 60Hz. But if you were one of the happy few who did, the faster discrete RAMDAC of the Millennium would be a vastly better choice. And if you could afford that screen, the higher price of the Millennium wouldn't be an issue either.

The Mystique was also slower than the Millennium in Windows applications as it used SGRAM instead of the more expensive WRAM. That - rather than the RAMDAC - was a more likely reason for the Millennium working better for the 99.9% of us without ultra high-end monitors. But people still harp on about the RAMDAC...

In older cards there were other RAMDAC-related issues, which occurred on both discrete and integrated ones, but as the integrated ones were more bottom-scraping back then, you saw it more there:
- a lot of RAMDACs (think: Cirrus Logic GD542x-series) didn't support extra colours or usable modes when upgrading to more video memory (usually 1MB->2MB)
- some RAMDACs simply didn't support more than 8-bit colour

Edit: looks like I type too slowly (or spend too much time researching/checking) 😉

Reply 9 of 9, by mkarcher

Rank: l33t
dionb wrote on 2024-07-22, 22:35:

- a lot of RAMDACs (think: Cirrus Logic GD542x-series) didn't support extra colours or usable modes when upgrading to more video memory (usually 1MB->2MB)

This is not primarily an issue with the RAMDAC, but with the data path to the RAMDAC. At 16bpp, twice as much data has to be sent to the RAMDAC as at 8bpp, and the GD542x just wasn't able to send data from video memory to the RAMDAC fast enough to make use of 2MB in non-interlaced modes. You do get an interlaced 1280x1024 @ 8bpp mode when you upgrade to 2MB. Some cards also offer a 1024x768 @ 16bpp interlaced mode after upgrading to 2MB, but this might not apply to the Cirrus cards. The 8-bit interface for sending pixel data to the RAMDAC, used on many entry- to mid-level RAMDACs at the time, might be part of the bottleneck (at 16bpp you need to send bytes at twice the actual pixel clock, at 24bpp at thrice the pixel clock), but usually the RAMDAC interface was not limiting the data rates any more than the internals of the graphics chips already did.
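
To make the arithmetic behind this concrete, here is a small illustrative Python sketch (the pixel clocks are rough assumptions for the sake of the example, not figures from any GD542x datasheet):

```python
# Bytes that have to cross the 8-bit pixel interface to the RAMDAC per second,
# and how much video memory a frame occupies, for a few example modes.
# Pixel clocks are ballpark assumptions, not datasheet values.

def frame_mib(h, v, bpp):
    """Frame buffer size in MiB for a resolution and colour depth."""
    return h * v * (bpp // 8) / (1024 * 1024)

def dac_bus_mb_per_s(pixel_clock_mhz, bpp):
    """Byte rate over an 8-bit RAMDAC interface: bpp/8 bytes per pixel."""
    return pixel_clock_mhz * (bpp // 8)

for h, v, bpp, pclk_mhz in [(1024, 768, 8, 65),    # 1024x768 @ 8bpp
                            (1024, 768, 16, 65),   # same mode at 16bpp: twice the bytes
                            (1280, 1024, 8, 80)]:  # 1280x1024 @ 8bpp (interlaced-class clock)
    print(f"{h}x{v} @ {bpp}bpp: {frame_mib(h, v, bpp):.2f} MiB per frame, "
          f"~{dac_bus_mb_per_s(pclk_mhz, bpp):.0f} MB/s into the RAMDAC")
```

The same pixel clock at 16bpp doubles the byte rate into the RAMDAC, and the 16bpp and 1280x1024 frames are also the ones that no longer fit in 1MB - which is the combination of limits described above.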