VOGONS

Reply 20 of 29, by Kreshna Aryaguna Nurzaman

Rank: l33t

Hmmm...

Some A/V receivers can up-convert their analog video inputs to 1080p and output it over HDMI.

So, which one will be less blurry?

Scenario 1:
[GTX 280] ---direct DVI or HDMI---> [Monitor]

Scenario 2:
[GTX 280] ---DVI to VGA---> ---VGA to S-Video---> [A/V Receiver] ---HDMI---> [Monitor]

Yup, between the video card and the A/V receiver there's a convoluted conversion from VGA to S-Video, but the A/V receiver will then upscale the analog video signal to 1080p and send the resulting digital signal to the monitor via HDMI.

Last edited by Kreshna Aryaguna Nurzaman on 2014-07-26, 12:54. Edited 1 time in total.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 22 of 29, by Kreshna Aryaguna Nurzaman

Rank: l33t
Mau1wurf1977 wrote:

Is this a trick question?

1 of course!

It's not a trick question; I'm just wondering how good a receiver's upscaling is. Is it good enough to outweigh the signal degradation from converting DVI out to VGA and then to S-Video?

Too bad no AV receiver has a direct VGA input, which would avoid the conversion from VGA to S-Video.

Has anyone ever tried such a convoluted experiment before, just out of curiosity? I don't really mind a convoluted configuration as long as it yields the best result.


Reply 23 of 29, by Mau1wurf1977

Rank: l33t++
Kreshna Aryaguna Nurzaman wrote:

Is it good enough to outweigh the signal degradation from converting DVI out to VGA and then to S-Video?

Technically impossible...

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 24 of 29, by obobskivich

Rank: l33t
Mau1wurf1977 wrote:
Kreshna Aryaguna Nurzaman wrote:

Is it good enough to outweigh the signal degradation from converting DVI out to VGA and then to S-Video?

Technically impossible...

Agreed. It's the conversion to S-Video that will destroy your signal (especially since s-video is very resolution limited). Going from YPbPr at high resolution (like 1080p) and letting the receiver do an A/D conversion wouldn't be a bad connection though - similar to VGA into your LCD monitor (although the AVR may give you more control/adjustment options).

Reply 25 of 29, by Kreshna Aryaguna Nurzaman

Rank: l33t
obobskivich wrote:
Mau1wurf1977 wrote:
Kreshna Aryaguna Nurzaman wrote:

Is it good enough to outweigh the signal degradation from converting DVI out to VGA and then to S-Video?

Technically impossible...

Agreed. It's the conversion to S-Video that will destroy your signal (especially since s-video is very resolution limited). Going from YPbPr at high resolution (like 1080p) and letting the receiver do an A/D conversion wouldn't be a bad connection though - similar to VGA into your LCD monitor (although the AVR may give you more control/adjustment options).

I see, thanks!

Problem is, even if I'm using YPbPr instead of S-Video, I'll be starting from a low resolution anyway - 640x480 and the like. 😁


Reply 26 of 29, by obobskivich

Rank: l33t

YPbPr will have much better color separation/definition than S-video, so it would be worth setting it up that way vs S-video if possible (although an old graphics card with YPbPr may not be as easily found as one with S-video). Colorspace converters exist to translate VGA (RGB) to YPbPr, which may be a reasonable consideration if you want to go through a receiver.
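For reference, a VGA (RGB) to YPbPr colorspace converter is essentially performing a fixed matrix multiply per pixel. Here's a minimal Python sketch of that math, assuming BT.601 coefficients (the SD standard; HD gear typically uses the slightly different BT.709 matrix):

```python
def rgb_to_ypbpr(r, g, b):
    """Convert normalized RGB (each 0.0-1.0) to YPbPr using BT.601 coefficients."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b       # luma (brightness)
    pb = -0.168736 * r - 0.331264 * g + 0.5 * b   # blue-difference chroma
    pr =  0.5 * r - 0.418688 * g - 0.081312 * b   # red-difference chroma
    return y, pb, pr
```

Pure white (1, 1, 1) maps to full luma with zero chroma on both difference channels, which is why luma carries all the detail and the chroma channels can survive more abuse - the same reason S-Video (which shares this luma/chroma split but combines Pb/Pr into one modulated signal) loses so much color resolution.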

Reply 27 of 29, by Kreshna Aryaguna Nurzaman

Rank: l33t
obobskivich wrote:

YPbPr will have much better color separation/definition than S-video, so it would be worth setting it up that way vs S-video if possible (although an old graphics card with YPbPr may not be as easily found as one with S-video). Colorspace converters exist to translate VGA (RGB) to YPbPr, which may be a reasonable consideration if you want to go through a receiver.

Interesting, and it may be worth trying. 😀

Anyway, would an Atlona VGA to HDMI scaler yield a better result than (a) converting VGA to YPbPr, then (b) upscaling and converting YPbPr to HDMI on a receiver?


Reply 28 of 29, by obobskivich

Rank: l33t
Kreshna Aryaguna Nurzaman wrote:

Anyway, would an Atlona VGA to HDMI scaler yield a better result than (a) converting VGA to YPbPr, then (b) upscaling and converting YPbPr to HDMI on a receiver?

That's a trickier question - it depends on how good the Atlona is at A/D conversion and scaling, I would guess. Newer graphics cards can generally output YPbPr natively, so this would only be relevant for an old card that needs conversion one way or another; then it would just come down to which device has the better conversion quality - the VGA to YPbPr adapter or the Atlona HDMI adapter.

Reply 29 of 29, by Holering

Kreshna Aryaguna Nurzaman wrote:

Anyway, would an Atlona VGA to HDMI scaler yield a better result than (a) converting VGA to YPbPr, then (b) upscaling and converting YPbPr to HDMI on a receiver?

Yes. If you want the cleanest signal, you need the least conversions possible.

EDIT:
You still have the issue of figuring out which device does the oversampling. If you don't want blurriness, you need oversampling.
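The blurriness point comes down to scale factors: when each source pixel maps to a whole number of output pixels, the scaler can duplicate pixels cleanly, but a non-integer factor forces interpolation (blur). A quick sketch of the arithmetic for scaling up to 1080p (the 1920x1080 target is an assumption matching the receivers discussed above):

```python
def scale_factors(src_w, src_h, dst_w=1920, dst_h=1080):
    """Return the horizontal and vertical scale factors to reach the target
    resolution, plus whether each factor is an integer (clean pixel doubling)."""
    fx, fy = dst_w / src_w, dst_h / src_h
    return fx, fy, fx.is_integer(), fy.is_integer()

# 640x480 -> 1080p: horizontal factor is 3.0 (clean), but the vertical
# factor is 2.25, so the scaler must interpolate vertically.
```

This is why 640x480 to 1080p can never be perfectly sharp without cropping or borders: the 2.25x vertical factor guarantees interpolation somewhere in the chain.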