Reply 40 of 123, by spiroyster
For edge-detection, it's important to note that the actual colour of a pixel doesn't matter, since the filter looks at the intensity difference between adjacent pixels. DVI is a lot crisper/sharper, so it will band where VGA doesn't. The pattern is much finer though, and has a distinctly different character from the JPG imo (the JPG ones are 2B pencil sketches, DVI is 2H!). VGA presents a softer image which cannot reproduce adjacent intensities as sharply as DVI can. In this case the texture/pattern being analysed is a cloud/noise pattern, so many adjacent pixels can differ dramatically; VGA struggles to reproduce that as a distinct sharp line, so its gradient changes are softer overall and produce a noisier edge-detection result. Loss of signal quality is a bit like free anti-aliasing (can't turn it off though 🙁).
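To illustrate what I mean (this isn't from the screenshots, just a rough sketch with made-up data): a bare-bones Sobel-style edge detector over intensity values, using only numpy. The texture, kernel and threshold are all hypothetical; the point is simply that only intensity differences between neighbouring pixels matter, and a slightly softened copy of the same noise pattern lights up far fewer edge pixels.

```python
# Minimal sketch (not from the actual comparison shots): Sobel-style edge
# detection on a grayscale intensity image. Colour is irrelevant -- it gets
# collapsed to intensity before any gradients are taken.
import numpy as np

def to_intensity(rgb):
    """Collapse colour to intensity; the actual colour of a pixel doesn't matter here."""
    return rgb[..., :3] @ np.array([0.299, 0.587, 0.114])

def sobel_edges(gray, threshold=0.1):
    """Gradient magnitude from 3x3 Sobel kernels, thresholded to an edge mask."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = gray.shape
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    # Naive convolution over the interior (skip the 1-pixel border for brevity).
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = gray[y - 1:y + 2, x - 1:x + 2]
            gx[y, x] = np.sum(patch * kx)
            gy[y, x] = np.sum(patch * ky)
    mag = np.hypot(gx, gy)
    return mag > threshold * mag.max()

# Hypothetical usage: a noisy cloud-like texture, once sharp (DVI-ish) and once
# slightly blurred (VGA-ish). The softened copy yields fewer/weaker edges --
# the "free anti-aliasing" effect described above.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
soft = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)
        + np.roll(sharp, (1, 1), (0, 1))) / 4
print("edge pixels (sharp):", sobel_edges(sharp).sum())
print("edge pixels (soft): ", sobel_edges(soft).sum())
```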
Personally, I don't think there is anything wrong with the GeForce or Quadro in the shots provided; IQ looks fine. What we see here is the limit of the texture quality itself imo: a slightly more natural/anti-aliased/softer rendition through VGA versus DVI's synthetic/sharp reproduction. I can see slight banding in your DVI shots, which, now that I know about it, I can't 'not see', and it pisses me off ever so slightly. VGA is da way imo... until greater than 1920x1080 is required, that is o.0