VOGONS


First post, by infiniteclouds

Rank Oldbie

I have seen a video of someone do this with a DisplayPort GPU and a VGA CRT, with the external DAC plugged directly into the CRT via a male-to-female VGA adapter. However, if you had a DVI or VGA GPU, is something like this possible, where you could get the best possible quality with no degradation by converting the analog signal at the GPU to digital and then back to analog at the CRT?

Reply 1 of 10, by darry

Rank l33t++
infiniteclouds wrote on 2020-06-14, 05:57:

I have seen a video of someone do this with a DisplayPort GPU and a VGA CRT, with the external DAC plugged directly into the CRT via a male-to-female VGA adapter. However, if you had a DVI or VGA GPU, is something like this possible, where you could get the best possible quality with no degradation by converting the analog signal at the GPU to digital and then back to analog at the CRT?

Conversion from analogue to digital and then to analogue again would likely generate more degradation than simply feeding the original analogue signal to the CRT with a high quality cable.

Reply 2 of 10, by infiniteclouds

Rank Oldbie

Does the conversion itself cause degradation, though? I thought it was the signal passing through the cable that caused the degradation -- so a longer cable would have more signal loss than a shorter one?

Reply 3 of 10, by darry

Rank l33t++
infiniteclouds wrote on 2020-06-14, 06:48:

Does the conversion itself cause degradation, though? I thought it was the signal passing through the cable that caused the degradation -- so a longer cable would have more signal loss than a shorter one?

Yes, the AD conversion is not a perfect process. It is a complex process to explain (and mostly outside my field of expertise), but it comes down to the inevitable quantization error and the fact that neither the ADC nor the DAC used is going to be perfect. This explains the concept, to a point: https://www.designnews.com/content/oversampli … /57569490133987
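
As a rough illustration of that quantization error, here is a minimal sketch of an ideal 8-bit round trip on a single analogue level. The 0.7 V full-scale range, the 8-bit depth and the sample value are assumptions for illustration, not the behaviour of any particular converter:

# Sketch: one analogue sample through an ideal 8-bit ADC and back through a DAC.
FULL_SCALE = 0.7          # assumed full-scale range in volts (typical VGA video level)
BITS = 8
STEPS = 2 ** BITS - 1     # 255 quantization steps

def adc(volts):
    # quantize the analogue level to the nearest of 256 codes
    return round(volts / FULL_SCALE * STEPS)

def dac(code):
    # convert the code back to a voltage
    return code / STEPS * FULL_SCALE

original = 0.31415                      # arbitrary analogue level
recovered = dac(adc(original))
error_mv = abs(recovered - original) * 1000
print(f"round-trip error = {error_mv:.2f} mV")   # bounded by half a step, ~1.4 mV here

Even in this idealised case the recovered level can be off by up to half a step; a real ADC/DAC pair adds noise, jitter and non-linearity on top of that.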

That said, there are three points to consider.

a) Whether transmitting an analogue VGA signal over 6 feet to a CRT using a high quality cable causes any perceptible degradation compared to using a theoretical zero-length cable.

b) Whether converting an analogue VGA signal to digital and back to analogue again causes any perceptible degradation compared to staying in the analogue domain and using a theoretical zero-length cable.

c) Which of a) or b) is more noticeable, which would take double-blind testing to determine.

This brings us back to the fundamental question of "Is there really a problem to be solved here?"

My uneducated guess is that there isn't, and that a 6 foot run through a high quality VGA cable is not going to have a significant impact on picture quality (especially on a CRT).
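
For a rough sense of what such a cable actually has to carry, here is a back-of-the-envelope pixel clock calculation. The 1024x768 @ 85 Hz mode and its blanking totals are just an assumed example:

# Approximate pixel clock for an assumed 1024x768 @ 85 Hz mode.
h_total = 1376        # pixels per line including blanking (approx. VESA timing)
v_total = 808         # lines per frame including blanking (approx. VESA timing)
refresh = 85          # Hz

pixel_clock_hz = h_total * v_total * refresh
print(f"pixel clock ~ {pixel_clock_hz / 1e6:.0f} MHz")   # ~95 MHz

A couple of metres of decent, properly shielded 75 ohm cable should pass a signal in that range without visible trouble; it is long, thin or poorly made runs where softness and ghosting start to show.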

Reply 4 of 10, by wiretap

Rank Oldbie

Even if you found the perfect conversion with no signal quality loss, you're just adding latency with each conversion. It is better to just run an analog connection to an analog display.
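
For a very rough sense of scale, the sketch below compares a converter that buffers a single scanline with a scaler that buffers a whole frame. The 1024x768 @ 85 Hz figures are assumed purely for illustration; real adapters vary:

# Illustrative latency comparison at an assumed 1024x768 @ 85 Hz.
refresh = 85                       # Hz
lines_per_frame = 808              # including blanking (approx.)

frame_time_s = 1 / refresh                       # ~11.8 ms
line_time_s = frame_time_s / lines_per_frame     # ~14.6 us

print(f"line-buffered converter: ~{line_time_s * 1e6:.0f} us added")
print(f"frame-buffered scaler:   ~{frame_time_s * 1e3:.1f} ms added")

Line-buffered converters add delay far below anything perceptible, while frame-buffered scalers can add a frame or more.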

My Github
Circuit Board Repair Manuals

Reply 5 of 10, by dionb

Rank l33t++
wiretap wrote on 2020-06-14, 09:56:

Even if you found the perfect conversion with no signal quality loss, you're just adding latency with each conversion. It is better to just run an analog connection to an analog display.

Is it?

There is no such thing as an analog source. Computer display signals start digital in all cases. With DP output, there's no AD-conversion, so with the proposed solution there's just a single DA-converter at the CRT end. Using VGA there's also just a single converter, in this case at the PC side. If the converters are comparable, latencies will be too.
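
To make the converter-counting concrete, here is a small sketch comparing the chains being discussed; the labels are just shorthand for the setups in this thread:

# Number of AD/DA conversions in each signal chain (illustrative).
chains = {
    "DP GPU -> external DAC -> VGA CRT":         ["DA in the adapter"],
    "VGA GPU -> VGA cable -> VGA CRT":           ["DA on the card"],
    "VGA GPU -> ADC -> DAC -> VGA CRT (the OP)": ["DA on the card", "AD in the converter", "DA in the converter"],
}
for chain, conversions in chains.items():
    print(f"{chain}: {len(conversions)} conversion(s)")

The first two chains each involve exactly one conversion; the round trip proposed in the first post involves three, which is why it cannot come out ahead on signal quality.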

As for analog signal quality, that's a matter of the quality of the external DA converter vs the internal DA converter plus analog cable. I'd be inclined to suspect that the internal DA converters will tend to be better than external ones, but that is by no means inevitable, particularly as analog VGA out on cards has become a decidedly low-end feature, so every possible cost-cutting measure will be applied on a card new enough to have DP out but also VGA.

Personally I'd say that if you have this choice, you're using the wrong card and you'd be better off with an older, higher-end card with DVI-I out.

Reply 6 of 10, by wiretap

Rank Oldbie
dionb wrote on 2020-06-14, 12:14:
wiretap wrote on 2020-06-14, 09:56:

Even if you found the perfect conversion with no signal quality loss, you're just adding latency with each conversion. It is better to just run an analog connection to an analog display.

Is it?

There is no such thing as an analog source. Computer display signals start digital in all cases. With DP output, there's no AD-conversion, so with the proposed solution there's just a single DA-converter at the CRT end. Using VGA there's also just a single converter, in this case at the PC side. If the converters are comparable, latencies will be too.

As for analog signal quality, that's a matter of the quality of the external DA converter vs the internal DA converter plus analog cable. I'd be inclined to suspect that the internal DA converters will tend to be better than external ones, but that is by no means inevitable, particularly as analog VGA out on cards has become a decidedly low-end feature, so every possible cost-cutting measure will be applied on a card new enough to have DP out but also VGA.

Personally I'd say that if you have this choice, you're using the wrong card and you'd be better off with an older, higher-end card with DVI-I out.

I was going off the original post in the thread, where the question was about converting VGA to a digital medium and back to VGA. But yes, it does add latency. The amount of latency depends on the type of adapter used.

My Github
Circuit Board Repair Manuals

Reply 7 of 10, by imi

Rank l33t

I've seen a CRT recently with a DVI input ^^

Reply 8 of 10, by cyclone3d

Rank l33t++
imi wrote on 2020-06-14, 12:52:

I've seen a CRT recently with a DVI input ^^

Was it DVI-D or DVI-I?

If DVI-D, wouldn't that mean that there was a DAC built into the monitor?

Yamaha modified setupds and drivers
Yamaha XG repository
YMF7x4 Guide
Aopen AW744L II SB-LINK

Reply 9 of 10, by maxtherabbit

Rank l33t

Avoid DACs and ADCs wherever possible. If your video card only has analog output, use a quality VGA cable (with real 75 ohm mini coax) and don't worry about it.

If your target device is a CRT, you should be driving it with the video card's DAC and a good VGA cable; using an external DAC is not going to be better. By the time video cards started shipping with digital output connectors, the onboard DACs were already high quality.
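
As a rough illustration of why the "real 75 ohm" part matters, the sketch below computes the fraction of the signal reflected at an impedance step; the 50 ohm figure is just an assumed example of a mismatched cable:

# Reflection coefficient at an impedance mismatch (sketch, ideal lumped model).
def reflection_coefficient(z_cable, z_termination=75.0):
    # fraction of the incident signal reflected where the cable meets the load
    return (z_cable - z_termination) / (z_cable + z_termination)

print(f"75 ohm coax into a 75 ohm input:  {reflection_coefficient(75.0):+.2f}")   # 0, no reflection
print(f"50 ohm cable into a 75 ohm input: {reflection_coefficient(50.0):+.2f}")   # -0.20, ~20% reflected

Reflections like that tend to show up as ghosting and ringing on sharp transitions, which is the visible difference between proper mini coax VGA cables and cheap ribbon-style ones.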

Reply 10 of 10, by imi

Rank l33t
cyclone3d wrote on 2020-06-14, 15:15:
imi wrote on 2020-06-14, 12:52:

I've seen a CRT recently with a DVI input ^^

Was it DVI-D or DVI-I?

If DVI-D, wouldn't that mean that there was a DAC built into the monitor?

just looked into it, I think it was an IBM P275, seems to be DVI-A only, what a bummer.

I guess it would have made little sense, as during that time pretty much every card available could still output analog, even if it only had DVI connectors, so why put a DAC into the monitor.