infiniteclouds wrote on 2020-06-14, 06:48:
Does the conversion itself cause degradation, though? I thought it was the signal passing through that caused the degradation -- so a longer cable would have more signal loss than a shorter one?
Yes, the AD conversion is not a perfect process. This is a complex process to explain (and mostly outside my field of expertise), but it comes down to the inevitable quantization error and the fact that neither the ADC nor the DAC used is going to be perfect. This article explains the concept, up to a point: https://www.designnews.com/content/oversampli … /57569490133987
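To make the quantization part concrete, here is a minimal Python sketch (my own illustration, not from the linked article, and assuming 8-bit converters, which is typical for these boxes): it quantizes a full-scale test tone to 8 bits and compares the measured signal-to-noise ratio against the textbook 6.02N + 1.76 dB figure.

```python
import numpy as np

# Minimal sketch of quantization error: sample an "analogue" test tone,
# quantize it to 8 bits, and measure the resulting signal-to-noise ratio.
fs = 1_000_000                               # sample rate (arbitrary for the demo)
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 1000 * t)        # full-scale 1 kHz test tone

bits = 8
levels = 2 ** bits
# Map [-1, 1] onto the integer codes 0..255, round, and map back.
quantized = np.round((signal + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1

error = signal - quantized
snr_db = 10 * np.log10(np.mean(signal**2) / np.mean(error**2))
print(f"Measured SNR   : {snr_db:.1f} dB")              # ~49.9 dB
print(f"Theoretical SNR: {6.02 * bits + 1.76:.1f} dB")  # 6.02N + 1.76 rule
```

The error never goes away, it only gets smaller as the bit depth grows; a real converter adds its own non-linearities on top of this floor.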
That said, there are three points to consider:
a) Whether transmitting an analogue VGA signal over 6 feet to a CRT using a high-quality cable causes any perceptible degradation compared to using a theoretical zero-length cable.
b) Whether converting an analogue VGA signal to digital and back to analogue again causes any perceptible degradation compared to staying in the analogue domain and using a theoretical zero-length cable.
c) Which of a) or b) is more noticeable, which would take a double-blind comparison to determine (see the sketch below).
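On point c): the standard way to run such a comparison is an ABX test, where X is randomly A or B on each trial and the subject has to say which it is. The switching hardware is the hard part; the bookkeeping is trivial, as this hypothetical Python sketch shows.

```python
import random

def run_abx_trials(n_trials, get_guess):
    """Hypothetical ABX bookkeeping: X is secretly A or B on each trial."""
    correct = 0
    for _ in range(n_trials):
        x = random.choice("AB")      # hidden assignment for this trial
        if get_guess() == x:         # subject answers "A" or "B", blind to x
            correct += 1
    return correct

# A subject who genuinely can't tell the difference just guesses:
print(run_abx_trials(16, lambda: random.choice("AB")))
# 12 or more correct out of 16 would suggest a real difference
# (p < 0.05 under a fair-coin null hypothesis).
```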
This brings us back to the fundamental question of "Is there really a problem to be solved here?"
My uneducated guess is that there isn't, and that a 6-foot run through a high-quality VGA cable is not going to have a significant impact on picture quality (especially on a CRT).
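As a rough sanity check on that guess: decent 75-ohm mini-coax is typically specified at roughly 3-4 dB of attenuation per 100 ft around 100 MHz, which is in the pixel-clock range of common VGA modes. Scaling that to 6 feet (the attenuation figure is an assumed datasheet-style value, not a measurement):

```python
# Back-of-envelope cable loss. Coax attenuation specs are quoted in
# dB per 100 ft at a given frequency, and loss scales roughly
# linearly with length at a fixed frequency.
ATTEN_DB_PER_100FT = 3.5   # assumed typical 75-ohm mini-coax figure at ~100 MHz

def loss_db(length_ft, atten_per_100ft=ATTEN_DB_PER_100FT):
    return length_ft / 100.0 * atten_per_100ft

loss = loss_db(6)
print(f"6 ft run: {loss:.2f} dB loss")                          # ~0.21 dB
print(f"Remaining amplitude: {10 ** (-loss / 20) * 100:.1f}%")  # ~97.6%
```

A fifth of a decibel is well below anything you would notice on a CRT, which is why I doubt the conversion box is solving a real problem here.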