VOGONS


First post, by retro games 100

User metadata
Rank l33t

I'm trying to learn the basics of what a RAMDAC does. I vaguely remember from back in the day that 3dfx's early Voodoo cards were 16-bit, and that nVidia came out on top in the end, in terms of sales, because their early cards were 32-bit. Is that roughly correct?

I'm trying to understand the basics of color depth in terms of hardware. Is this handled by the card's RAMDAC? So, would those early Voodoo cards have a 16-bit RAMDAC, and those early nVidia cards a 32-bit RAMDAC? Is it as simple and as straightforward as that?

I was looking at an old 1993 VLB card, and its RAMDAC was 24-bit. How come that old card has a better RAMDAC than a later era PCI voodoo card? Or have I got this all wrong in my mind?

Thanks a lot for any comments / clarification on this topic! 😀

Reply 1 of 17, by root42

User metadata
Rank l33t

I am not an expert either, but there are two aspects to color depth and picture quality:

1. The video processor / GPU has to support the desired bit depth, i.e. 8, 15, 16, 24 or 32 bit, where 32 bit is mostly the same as 24 bit (8 bits per channel, but with either alpha information or a padding byte).

2. The RAMDAC has to support the bit depth on its input end and have a sufficiently high clock rate to generate lots of scanlines at a high refresh rate.

For the early Voodoo graphics cards, the GPU only supported 16-bit color depth, so the RAMDAC did not need to support more on its input end while generating the analogue output signal.
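To illustrate point 1 (a sketch of my own, not from any particular card's datasheet): 24-bit and 32-bit pixels carry the same 8:8:8 RGB data, and the extra byte in 32-bit mode is just alpha or padding.

```python
def pack_rgb888(r, g, b):
    """24-bit true colour: 8 bits per channel."""
    return (r << 16) | (g << 8) | b

def pack_argb8888(r, g, b, a=0):
    """32-bit: the same 8:8:8 RGB, plus an alpha/padding byte."""
    return (a << 24) | pack_rgb888(r, g, b)

# The colour information is identical; only the storage width differs.
assert pack_argb8888(0x12, 0x34, 0x56) & 0xFFFFFF == pack_rgb888(0x12, 0x34, 0x56)
```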

YouTube and Bonus
80486DX@33 MHz, 16 MiB RAM, Tseng ET4000 1 MiB, SnarkBarker & GUSar Lite, PC MIDI Card+X2+SC55+MT32, OSSC

Reply 2 of 17, by David_OSU

User metadata
Rank Newbie

The only difference between a RAMDAC and a plain DAC is the support for 8-bit palette modes. The "RAM" part is a 256-entry look-up table (LUT) that outputs RGB. The output bit depth of the LUT matches the bit depth of the DACs: a 24-bit LUT will output 8:8:8 (8 bits each of red, green and blue) to the DACs, while a 16-bit LUT typically outputs 5:6:5 (the green DAC is 6-bit, the other two are 5-bit). Note that there are three DACs, one for each color channel. In "true-color" modes, the LUT is not used and the RGB data is sent directly to the DACs.
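To make the 5:6:5 layout concrete, here is a small sketch (function names are my own) of splitting a 16-bit pixel into its channels and scaling them up to 8 bits per channel by replicating the top bits, a common trick for spreading values over the full 0-255 range:

```python
def unpack_rgb565(pixel):
    """Split a 16-bit 5:6:5 pixel into its raw channel values."""
    r5 = (pixel >> 11) & 0x1F   # top 5 bits: red
    g6 = (pixel >> 5) & 0x3F    # middle 6 bits: green
    b5 = pixel & 0x1F           # bottom 5 bits: blue
    return r5, g6, b5

def expand_to_888(r5, g6, b5):
    """Scale 5/6-bit channels up to 8 bits by replicating the top bits."""
    r8 = (r5 << 3) | (r5 >> 2)
    g8 = (g6 << 2) | (g6 >> 4)
    b8 = (b5 << 3) | (b5 >> 2)
    return r8, g8, b8

# White in 5:6:5 is 0xFFFF and expands to full 8-bit white:
assert expand_to_888(*unpack_rgb565(0xFFFF)) == (255, 255, 255)
```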

Reply 3 of 17, by dionb

User metadata
Rank l33t++

Also note that the speed of the RAMDAC determines the maximum pixel clock, and therefore the refresh rate at a given resolution.

This generally wasn't an issue when gaming, as CPU and 3D chip limitations kept resolutions low, but for high-res desktop work (or 2D gaming, think Civ2!) it could make a big difference. That was a big factor in why high-end VGA solutions were late in integrating the RAMDAC into the VGA core. If you compared that VLB RAMDAC to the Voodoo stuff, the Voodoo would probably have a much, much faster RAMDAC (at least if it was a Banshee/V3/4/5; the RAMDACs on the older cards were distinctly underwhelming).
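As a rough back-of-the-envelope check (my own numbers, assuming a typical ~25% blanking overhead rather than any particular timing standard):

```python
def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Very rough pixel clock estimate: active pixels per frame times the
    refresh rate, plus a typical allowance for blanking intervals."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

# 640x480 @ 60 Hz needs only ~23 MHz, within even a 35 MHz RAMDAC's budget,
# while 1280x1024 @ 75 Hz needs ~123 MHz, out of reach for early parts.
```

This is why a fast RAMDAC mattered far more on the desktop than in low-res gaming.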

Reply 4 of 17, by Zup

User metadata
Rank Oldbie
retro games 100 wrote:

I was looking at an old 1993 VLB card, and its RAMDAC was 24-bit. How come that old card has a better RAMDAC than a later era PCI voodoo card? Or have I got this all wrong in my mind?

"Bigger" is not always better.

You're looking at only one of the RAMDAC's specifications, not at the whole thing. As dionb said, other specs determine the maximum resolutions and refresh rates (although I think the early Voodoo supported lower resolutions than some SVGAs).

Also, RAMDACs are semi-analog chips, and the output quality will vary depending on how well they are built. When I was studying, I was told that in some cases it is better to use a DAC with more resolution than needed, to keep errors away from the last bits (i.e. using an 18-bit DAC when 16 bits are needed and discarding the last 2 bits).

I have traveled across the universe and through the years to find Her.
Sometimes going all the way is just a start...

I'm selling some stuff!

Reply 5 of 17, by root42

User metadata
Rank l33t
Zup wrote:

Also, RAMDACs are semi-analog chips, and the output quality will vary depending on how well they are built. When I was studying, I was told that in some cases it is better to use a DAC with more resolution than needed, to keep errors away from the last bits (i.e. using an 18-bit DAC when 16 bits are needed and discarding the last 2 bits).

Wouldn't that apply more to an ADC than a DAC? Or would the DAC simply always set the lowest bits of the input to 0?

YouTube and Bonus
80486DX@33 MHz, 16 MiB RAM, Tseng ET4000 1 MiB, SnarkBarker & GUSar Lite, PC MIDI Card+X2+SC55+MT32, OSSC

Reply 6 of 17, by retro games 100

User metadata
Rank l33t

Amazing info, people, thanks a lot!
Very roughly, when did integration of the RAMDAC into the VGA core become commonplace? Would that be after the VLB era, and roughly at the start of the PCI era? Thanks.

Reply 7 of 17, by gerwin

User metadata
Rank l33t
retro games 100 wrote:

Very roughly, when did integration of the RAMDAC into the VGA core become commonplace? Would that be after the VLB era, and roughly at the start of the PCI era? Thanks.

It differs:
- The common Cirrus Logic VLB cards, like the GD5426 and GD5428, have the RAMDAC integrated. There is only one main IC on the PCB, besides RAM and BIOS.
- S3-based VLB cards are normally found with a separate RAMDAC. With the Trio64 and Trio32 it became integrated.

--> ISA Soundcard Overview // Doom MBF 2.04 // SetMul

Reply 8 of 17, by retro games 100

User metadata
Rank l33t
gerwin wrote:
retro games 100 wrote:

Very roughly, when did integration of the RAMDAC into the VGA core become commonplace? Would that be after the VLB era, and roughly at the start of the PCI era? Thanks.

It differs:
- The common Cirrus Logic VLB cards, like the GD5426 and GD5428, have the RAMDAC integrated. There is only one main IC on the PCB, besides RAM and BIOS.
- S3-based VLB cards are normally found with a separate RAMDAC. With the Trio64 and Trio32 it became integrated.

Re: CL VLB card. I wondered where it was, 🤣! Thanks!

Reply 9 of 17, by dionb

User metadata
Rank l33t++
retro games 100 wrote:

Amazing info, people, thanks a lot!
Very roughly, when did integration of the RAMDAC into the VGA core become commonplace? Would that be after the VLB era, and roughly at the start of the PCI era? Thanks.

It started in the late ISA period and lasted until early AGP.

Basically it was the same story as with everything else: the lowest low end got integrated first, then things moved very fast during mainstream integration (around 1996), and the high end held out longest.

The latest card I have with a discrete RAMDAC is a Matrox G400 from 1999. It has an (excellent) integrated RAMDAC for the primary display, but an external one for the second display. Somewhat atypically, the integrated DAC is better than the external one in this case.

By chip vendor, the first chip with an integrated RAMDAC:
ATi Mach64
Cirrus Logic GD542x
Matrox Mystique (1064SG)
S3 Trio32
SiS 6202
Trident TVGA9000
Tseng ET6000

Reply 10 of 17, by root42

User metadata
Rank l33t

Which TVGA9000 variant had the integrated RAMDAC? I only have 9000s with external RAMDAC...

YouTube and Bonus
80486DX@33 MHz, 16 MiB RAM, Tseng ET4000 1 MiB, SnarkBarker & GUSar Lite, PC MIDI Card+X2+SC55+MT32, OSSC

Reply 11 of 17, by dionb

User metadata
Rank l33t++
root42 wrote:

Which TVGA9000 variant had the integrated RAMDAC? I only have 9000s with external RAMDAC...

The TVGA9000i certainly has an integrated RAMDAC:
[attached photo of a TVGA9000i card]

Reply 12 of 17, by root42

User metadata
Rank l33t

Ah yes. To quote Wikipedia:

TVGA9000 (low component version)
TVGA9000B (1992)
TVGA9000C (1992) - External RAMDAC
TVGA9000i - (rev. a/b/c, 512 KB, 9000 with on-chip 15/16bit DAC and clock generator)
TVGA9000i-1 (1994) - appeared on Trident's VC512TM ISA video cards
TVGA9000i-2 (1994)
TVGA9100B - Slightly faster 9000

YouTube and Bonus
80486DX@33 MHz, 16 MiB RAM, Tseng ET4000 1 MiB, SnarkBarker & GUSar Lite, PC MIDI Card+X2+SC55+MT32, OSSC

Reply 13 of 17, by BloodyCactus

User metadata
Rank Oldbie

I have a passel of different RAMDAC chips for a project I was working on. Early RAMDACs, like the original Inmos G171 and its clones, were internally 3x6-bit: even though they took 8-bit data in, their output DACs were 6-bit. They came in 35 or 50 MHz versions. If you step up to something like the Brooktree Bt485A, it came in 110-170 MHz versions and had 3x8-bit output for indexed RGB colour as well as 24-bit truecolour output.

--/\-[ Stu : Bloody Cactus :: [ https://bloodycactus.com :: http://kråketær.com ]-/\--

Reply 14 of 17, by root42

User metadata
Rank l33t

Standard VGA actually uses 3x6 bits for its palette entries, letting you choose 256 colors from a range of 262,144. So it makes sense that the first RAMDACs were 6-bit internally.
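The arithmetic behind those numbers, as a trivial sketch:

```python
BITS_PER_CHANNEL = 6                  # standard VGA DAC width per channel
levels = 2 ** BITS_PER_CHANNEL        # 64 intensity levels per channel
total_colours = levels ** 3           # 64 * 64 * 64 = 262,144 possible colours
palette_entries = 256                 # how many can be on screen at once
assert total_colours == 262_144
```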

YouTube and Bonus
80486DX@33 MHz, 16 MiB RAM, Tseng ET4000 1 MiB, SnarkBarker & GUSar Lite, PC MIDI Card+X2+SC55+MT32, OSSC

Reply 15 of 17, by BloodyCactus

User metadata
Rank Oldbie
root42 wrote:

Standard VGA actually uses 3x6 bits for its palette entries, letting you choose 256 colors from a range of 262,144. So it makes sense that the first RAMDACs were 6-bit internally.

Ah, but is that because the first RAMDAC, the Inmos G171, had internal 6-bit DACs, so IBM created the standard around that?

Here's a hint: the G171 was created before IBM wrote the VGA standard specifying a 3x6-bit palette... they just incorporated the Inmos hardware specs into the VGA standard.

--/\-[ Stu : Bloody Cactus :: [ https://bloodycactus.com :: http://kråketær.com ]-/\--

Reply 16 of 17, by root42

User metadata
Rank l33t

Yeah, I guess IBM looked at the market, saw which RAMDACs were available in quantity, and designed VGA around the one they picked as their "winner".

YouTube and Bonus
80486DX@33 MHz, 16 MiB RAM, Tseng ET4000 1 MiB, SnarkBarker & GUSar Lite, PC MIDI Card+X2+SC55+MT32, OSSC