VOGONS


First post, by Sunflux

Rank Newbie

In my quest to upgrade my 98SE system with better graphics, I’d decided on getting a GPU with DVI to hopefully provide both future compatibility with LCDs, and a better overall image quality.

But I’m realizing now that the vast majority - if not all - early DVI cards simply had a Silicon Image chip on them that took the potentially blurry RAMDAC analog output and re-digitized it. I’d probably rather stick to VGA and let my monitor do the conversion!

So my question is… when did this practice stop, and what’s the oldest card you can get with a true digital output that bypasses the whole analog phase?

EDIT: like, I’m looking at photos of the FX5900 and am still seeing that suspicious Silicon Image chip… so is my hunt a lost cause?

Reply 4 of 7, by darry

Rank l33t++
Sunflux wrote on 2022-03-15, 16:37:

But I’m realizing now that the vast majority - if not all - early DVI cards simply had a Silicon Image chip on them that took the potentially blurry RAMDAC analog output and re-digitized it. I’d probably rather stick to VGA and let my monitor do the conversion!

So my question is… when did this practice stop, and what’s the oldest card you can get with a true digital output that bypasses the whole analog phase?

EDIT: like, I’m looking at photos of the FX5900 and am still seeing that suspicious Silicon Image chip… so is my hunt a lost cause?

Respectfully, you are off the mark on that assumption.

The Silicon Image chip you are referring to is a TMDS transmitter ( https://en.wikipedia.org/wiki/Transition-mini … ntial_signaling ) that generates a DVI (or HDMI) signal from the internal, already digital, output of the GPU chip. This functionality, like that of the RAMDAC, was later also integrated into the GPU core. The RAMDAC and TMDS functions are distinct and independent of each other, whether integrated on the GPU chip or not.

Whether the TMDS functionality is integrated into the GPU or not does not influence picture quality (when working correctly). Additionally, some early GPU-integrated TMDS implementations were problematic (compatibility issues), so external TMDS chips were sometimes still used even when not strictly necessary. External TMDS chips were also used when the integrated TMDS was too limited (e.g. when dual-link DVI functionality was desired but the internal TMDS did not support it).
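To make the dual-link point concrete, here is a small back-of-the-envelope sketch (not from this thread; the 165 MHz figure is the DVI 1.0 single-link pixel clock limit, and TMDS encodes each 8-bit colour byte as 10 serial bits):

```python
# Rough TMDS link math: single-link DVI carries one pixel per clock over
# three data channels, each running at 10x the pixel clock (8b/10b-style
# TMDS encoding). Modes above the single-link ceiling need dual-link DVI.
SINGLE_LINK_MAX_HZ = 165_000_000  # DVI 1.0 single-link pixel clock limit

def tmds_bit_rate_per_channel(pixel_clock_hz: float) -> float:
    """Serial bit rate on each of the three TMDS data pairs."""
    return pixel_clock_hz * 10

def needs_dual_link(pixel_clock_hz: float) -> bool:
    """True if the mode exceeds what one TMDS link can carry."""
    return pixel_clock_hz > SINGLE_LINK_MAX_HZ

# 1600x1200@60 (~162 MHz) still fits on a single link:
print(needs_dual_link(162_000_000))   # False
# 2560x1600@60 (~268.5 MHz, reduced blanking) needs dual-link DVI:
print(needs_dual_link(268_500_000))   # True
```

This is why a card whose integrated TMDS only did single link might still carry an external Silicon Image chip for the second link.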

If your use case is purely Windows, using DVI or HDMI makes sense from an image quality perspective.

If you intend to run DOS games (whether under pure DOS or in Windows), you will find that real-world DVI/HDMI implementations have issues and limitations that usually make them either sub-optimal or even unusable. If you would like more information on this, please let us know (it has been discussed in several threads).

Regards,
Darry

Reply 5 of 7, by Sunflux

Rank Newbie

I guess I was led off the mark by this old thread:

Geforce 2 recommendation request

the problem is that the video signal from the DVI is general very poor because its goes through the RAMDAC and then to the SiliconImage Chip to get Digitalized again. like on most other cards from this time

My searches didn’t lead to anything more informative.

Anyhow, I assume if DVI proves problematic in DOS, even if the card has dual DVI-I (like some Quadros), I’ll just use one of the billion DVI to VGA adapters I seem to have (I assume such old cards would still have full analog support).

Reply 6 of 7, by agent_x007

Rank Oldbie

For the best passive DVI-to-HDMI compatibility, the GF6 series is best (and there are AGP 2x/3.3 V compatible models out there).
DVI to HDMI/DP on GF3/4/5 and older ATIs (up to the X8x0 series) isn't great; it usually works but can be weird (example: no video on the BIOS/POST screen, but it works in Windows itself [if drivers were installed]) 🙁

For DVI to DVI though, I'd go with GF4 Ti or FX.


Reply 7 of 7, by libv

Rank Newbie
Sunflux wrote on 2022-03-15, 21:11:

I guess I was led off the mark by this old thread:

Geforce 2 recommendation request

the problem is that the video signal from the DVI is general very poor because its goes through the RAMDAC and then to the SiliconImage Chip to get Digitalized again. like on most other cards from this time

My searches didn’t lead to anything more informative.

That guy has an issue with information literacy.

The typical PanelLink/DFP/DVI TMDS encoder of the day was the SiI164, and its datasheet was publicly available even in 2010. He could have just gone and looked it up and seen that it uses a 24-bit parallel input bus. My earliest card with a DFP connector is a Savage4 with a SiI140. You can find a VT8501 (MVP4) datasheet where the 24-bit connection to the encoder is listed as well.

The same is true for older cards, those with an external VGA RAMDAC, something which vanished in the mid-90s with the introduction of the Trio64V. My Tseng ET4000s have either an ST Micro ST1703, a Chrontel CH8398, or an ICS GENDAC ICS5342, all of which have 8/16-bit parallel pixel buses. Again, datasheets are available.

_All_ external TV, DFP, and DVI encoders, like TV/VGA decoder chips, talk to the display engine over a digital bus. The same is true for directly attached LCDs, for the VIP port, and for the VESA feature connector. Back in the 2000s, this was a parallel bus, usually 8/16/24 bits wide. It's only in the last 15-20 years, with PCIe and MIPI, that this turned into digital serial buses.

Digital parallel buses are easy, especially at the 100 or so MHz you need when the bus is full width and you are not running resolutions above 1920x1200 (single-link DVI tops out at a 165 MHz pixel clock, which covers 1920x1200 with reduced blanking). Analog buses are hard to route and shield, as any interference directly influences the displayed colours. Internally, on the die, it is the same: everything is digital, and it is the DAC part that is hard to keep linear and clean as processes shrink.
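The pixel-clock numbers above can be checked with a simplified version of the VESA CVT reduced-blanking (v1) formula. This sketch hardcodes the RB constants (160-pixel horizontal blanking, 460 µs minimum vertical blanking, 0.25 MHz clock granularity) and skips the spec's minimum-porch check, so treat it as an approximation rather than a full CVT implementation:

```python
import math

H_BLANK = 160         # CVT-RB fixed horizontal blanking, in pixels
MIN_V_BLANK_US = 460  # CVT-RB minimum vertical blanking time, microseconds
CLOCK_STEP_MHZ = 0.25 # pixel clock is truncated to this granularity

def cvt_rb(h_active: int, v_active: int, refresh_hz: float):
    """Return (h_total, v_total, pixel_clock_mhz) for a CVT-RB mode."""
    # Estimate the line period, then derive the vertical blanking line count.
    h_period_us = (1_000_000 / refresh_hz - MIN_V_BLANK_US) / v_active
    v_blank = int(MIN_V_BLANK_US / h_period_us) + 1
    h_total = h_active + H_BLANK
    v_total = v_active + v_blank
    clock_mhz = h_total * v_total * refresh_hz / 1e6
    # The spec truncates the clock down to the nearest 0.25 MHz step.
    clock_mhz = math.floor(clock_mhz / CLOCK_STEP_MHZ) * CLOCK_STEP_MHZ
    return h_total, v_total, clock_mhz

print(cvt_rb(1920, 1200, 60))     # (2080, 1235, 154.0) -- fits single link
print(cvt_rb(2560, 1600, 60)[2])  # 268.5 -- needs dual-link DVI
```

1920x1200@60 with reduced blanking lands at 154 MHz, comfortably under the 165 MHz single-link ceiling, which is exactly the regime where a full-width parallel bus in the low hundreds of MHz is enough.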

-- The guy who came up with modesetting, a misnomer for structured display driver development, which was born out of a desire to rid Linux display drivers of BIOS/Int10 dependence.