VOGONS


First post, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t

When you're using a non-native resolution on an LCD screen - say, 640x480 on a 1600x1200 monitor - you suffer some blurriness. This is a problem for those of us who play old games, whose resolutions often don't match the monitor's native resolution.

Some VGA to HDMI converters, on the other hand, advertise such optimistic features as "scaling up any VGA input resolution to the target monitor's native resolution."

So, how good is a VGA to HDMI scaler compared to a typical monitor's built-in scaler?

Let's consider this example. My video card has both VGA and HDMI output. I'm playing System Shock - a 640x480 game - on a 1280x1024 monitor. My monitor has both VGA and HDMI input.

Scenario 1:
video card's VGA out ----> HDMI converter ----> monitor's HDMI in.

Scenario 2:
video card's HDMI out ----> monitor's HDMI in.

Scenario 3:
video card's DVI out ----> monitor's DVI in.

In Scenario 1, I let the HDMI converter scale up the image. In Scenarios 2 and 3, I let the monitor's built-in scaler scale up the image. Which scenario gives the least blurry image?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 1 of 29, by laxdragon

User metadata
Rank Member

#2 & #3 should be the same, both are a digital signal that your monitor upscales. Since it is digital, you will experience no loss of data. In #1, the signal is going from analog to digital, and some of the original data could be lost due to the usual analog signal stuff such as interference.

As far as which will look better, that is subjective. But as far as blurriness goes, you should see no blur in a digital signal, unless your scaler is programmed to add some blur.

laxDRAGON.com | My Game Collection | My Computers | YouTube

Reply 2 of 29, by keropi

User metadata
Rank l33t++

I believe that unless you pay a hefty amount for a good scaler you are better off with the monitor's built in one... cheap ones will just add processing lag and quality loss 😵

🎵 🎧 MK1869, PCMIDI MPU , OrpheusII , Action Rewind , Megacard and 🎶GoldLib soundcard website

Reply 3 of 29, by obobskivich

User metadata
Rank l33t
laxdragon wrote:

#2 & #3 should be the same, both are a digital signal that your monitor upscales.

*Should* but not always - some displays treat them as different inputs in terms of scaling options and what resolutions they will report to the attached devices. 😵 A lot of modern LCDs will look very sharp (usually to the point of being hard to discriminate from DVI/HDMI) with VGA inputs, even at high (1600x1200 or higher) resolutions, and for early-DVI era hardware (around 2000-2003) you may actually get a better/more reliable picture with VGA (and you will have higher resolution options) because the TMDS transmitters on those cards tended to be both limited to single-link, and in some cases were barely capable of meeting the bandwidth requirements for that.
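
For a rough sense of that bandwidth squeeze (my own back-of-the-envelope check, not from the hardware discussed above): single-link DVI tops out at a 165 MHz pixel clock, while the standard VESA timing for 1600x1200@60 already needs about 162 MHz. A small Python sketch using the published DMT totals:

# Pixel clock = total pixels per frame (including blanking) x refresh rate.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# VESA DMT totals for 1600x1200@60: 2160 x 1250 including blanking
clock = pixel_clock_mhz(2160, 1250, 60)
print(f"1600x1200@60 needs ~{clock:.1f} MHz (single-link DVI limit: 165 MHz)")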

As far as the original question: it depends on the graphics card, the game, and the monitor. If the monitor treats all of its inputs the same in terms of resolution/scaling then go with whatever is best common between the monitor and computer and then either use the GPU to scale (ideal) or if it's an old computer, let the monitor scale it (preserving aspect ratio if possible). Modern GPUs will do a great job of scaling their outputs (usually better than what you will find in a monitor). However not all monitors treat their inputs identically - for example I've observed on my Samsung monitors that VGA is a better choice for 4:3 resolutions that are lower than native, as they don't just stretch everything out to fit but will instead preserve the AR automatically (whereas on DVI everything is just stretched to fill the screen).

Cheapo "VGA to HDMI" boxes may incur lots of noise due to cheap internals, bad processing, etc so you should be cognizant there and test against whatever the monitor or computer is capable of doing.

It's also worth considering from what resolution you're starting and where you're going - even with name-brand scaling hardware jumping something like 640x480 up to something like 2560x1600 or 3840x2400 will not look great, but 1024x768 to 1600x1200 shouldn't be the end of the world. Ideally you would be outputting at least in the same AR (sacrificing total resolution to preserve AR if necessary) as whatever the display is doing (even if that means pillar boxed final product), to prevent stretching/distortion (and IME usually 1:1 scaling looks better than having to convert AR as well).
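
A minimal sketch of that "preserve AR, pillarbox/letterbox if necessary" approach in Python (the function and names are just for illustration, not from any particular scaler):

def fit_preserving_ar(src_w, src_h, dst_w, dst_h):
    """Fit the source into the target without stretching; return size and centering offsets."""
    scale = min(dst_w / src_w, dst_h / src_h)      # never exceed either target dimension
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return out_w, out_h, (dst_w - out_w) // 2, (dst_h - out_h) // 2

# 640x480 (4:3) onto a 1280x1024 (5:4) panel -> 1280x960 image with 32-pixel bars
print(fit_preserving_ar(640, 480, 1280, 1024))     # (1280, 960, 0, 32)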

Reply 4 of 29, by Mau1wurf1977

User metadata
Rank l33t++

I vote for the monitor. I've tried many models, and those that scale correctly (with a 4:3 aspect ratio) give a nice picture. The image will always be a bit softer, but it's really not a big deal.

Is there any chance of playing this game at a native resolution? 1280 x 1024 for a 19" or 1600 x 1200 for a 24"

You can always get a CRT 😀

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 5 of 29, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
obobskivich wrote:
laxdragon wrote:

#2 & #3 should be the same, both are a digital signal that your monitor upscales.

*Should* but not always - some displays treat them as different inputs in terms of scaling options and what resolutions they will report to the attached devices. 😵 A lot of modern LCDs will look very sharp (usually to the point of being hard to discriminate from DVI/HDMI) with VGA inputs, even at high (1600x1200 or higher) resolutions, and for early-DVI era hardware (around 2000-2003) you may actually get a better/more reliable picture with VGA (and you will have higher resolution options) because the TMDS transmitters on those cards tended to be both limited to single-link, and in some cases were barely capable of meeting the bandwidth requirements for that.

As far as the original question: it depends on the graphics card, the game, and the monitor. If the monitor treats all of its inputs the same in terms of resolution/scaling then go with whatever is best common between the monitor and computer and then either use the GPU to scale (ideal) or if it's an old computer, let the monitor scale it (preserving aspect ratio if possible). Modern GPUs will do a great job of scaling their outputs (usually better than what you will find in a monitor). However not all monitors treat their inputs identically - for example I've observed on my Samsung monitors that VGA is a better choice for 4:3 resolutions that are lower than native, as they don't just stretch everything out to fit but will instead preserve the AR automatically (whereas on DVI everything is just stretched to fill the screen).

Well, there are cases when a game - usually with an unofficial patch - can scale up to modern resolutions. WarCraft III and Crimson Skies, for example, can scale up to 1280x1024 internally. MDK, on the other hand, is fixed at 640x480, and so is System Shock I, whose highest resolution is 640x480.

All else being equal, which is better between DVI and HDMI?

obobskivich wrote:

Cheapo "VGA to HDMI" boxes may incur lots of noise due to cheap internals, bad processing, etc so you should be cognizant there and test against whatever the monitor or computer is capable of doing.

I see. Are there video scalers - VGA to HDMI or otherwise - that work better than a monitor's internal scaler? And at what cost, approximately? 😁

obobskivich wrote:

It's also worth considering from what resolution you're starting and where you're going - even with name-brand scaling hardware jumping something like 640x480 up to something like 2560x1600 or 3840x2400 will not look great, but 1024x768 to 1600x1200 shouldn't be the end of the world. Ideally you would be outputting at least in the same AR (sacrificing total resolution to preserve AR if necessary) as whatever the display is doing (even if that means pillar boxed final product), to prevent stretching/distortion (and IME usually 1:1 scaling looks better than having to convert AR as well).

Yes, I always want to preserve AR. Two vertical black bars are always better than stretching/distortion.

Unfortunately, not all scaling ratios are integers. From 800x600 you can scale up to 1600x1200 (or 1920x1600 with vertical black bars) without losing clarity, but some old games are fixed at 640x480.
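
For illustration, integer scaling is just pixel repetition, so nothing is invented or blended away, while a 2.5x jump like 640x480 to 1600x1200 forces some interpolation. A minimal numpy sketch (the blank array is only a stand-in for a real frame):

import numpy as np

frame = np.zeros((600, 800, 3), dtype=np.uint8)          # an 800x600 frame
doubled = frame.repeat(2, axis=0).repeat(2, axis=1)      # exact 2x: 1600x1200, razor sharp
print(doubled.shape)                                      # (1200, 1600, 3)

print(1600 / 640, 1200 / 480)                             # 2.5 2.5 - not an integer ratio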

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 6 of 29, by Jepael

User metadata
Rank Oldbie

I don't think there is a single answer. Just try different monitors and/or different separate scalers and/or different settings to get what you think is best.

VGA-to-DVI/HDMI conversion and scaling are two completely separate things; they just sometimes exist in one chip.

First there is the analog-to-digital conversion, which can sample a 640-pixel by 480-line analog signal at any rate it wants. In the digital domain there are definitely 480 lines, but the number of digital pixels per line could be anything: the signal might be sampled directly to the target number of pixels, say 1920, so no horizontal scaling is needed at all, or the converter might determine that it is a 640-pixel format, sample 640 pixels, and scale them separately. The lines themselves, of course, still need to be scaled, from 480 to 1080 for example.
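
A rough illustration of that point in Python (the timing figures are the standard 640x480@60 VGA mode: 25.175 MHz pixel clock, 800 total pixels per line, 640 of them active):

LINE_RATE_HZ = 25_175_000 / 800            # ~31.5 kHz horizontal line rate
ACTIVE_FRACTION = 640 / 800                # share of each line that carries picture

def active_samples_per_line(sample_clock_hz):
    # The ADC just counts samples during the active part of each analog line,
    # so the "width in pixels" is whatever the sampling clock makes it.
    return round(sample_clock_hz / LINE_RATE_HZ * ACTIVE_FRACTION)

print(active_samples_per_line(25_175_000))       # 640  - sampled at the original pixel rate
print(active_samples_per_line(3 * 25_175_000))   # 1920 - sampled straight to the target width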

Then there is the scaler chip. Some are cheap and thus low quality, some are expensive and studio quality.

Usually monitors have a decent scaler, and because non-native resolutions need to be scaled anyway, it does not depend on whether the video source is analog VGA or digital DVI/HDMI.

You asked about the HDMI/DVI difference: there really is very little, although sometimes monitors handle the connectors differently.
The physical signals are compatible; it's just that DVI is purely 8-bit RGB video, while HDMI also allows audio and metadata about the transmitted video to be sent, so it can support more than 8 bits and colorspaces other than RGB.

For example, I have some experience with one Samsung SyncMaster we had at work. It was one of those with a 1680x1050 resolution, and it also doubled as a TV, as it had composite, S-Video and component connectors as well. If I connected it to the PC via the DVI connector, I got a pixel-perfect native resolution of 1680x1050 but no sound. If I connected it via the HDMI connector, I got sound, but the monitor's HDMI port does not advertise the native resolution, so 1920x1080 and many other resolutions are available but everything from the PC gets scaled.

And my friend had a monitor that allowed labeling the HDMI input as something like "DVD", "GAME" or "PC". When connected to a Raspberry Pi, with every image-processing feature turned off in the monitor menu, the picture still looked like it went through edge enhancement and whatnot, which is usually done for video sources, not graphics sources. You had to label the connected HDMI device as PC to get rid of the video-specific enhancements and see pure, native graphics.

Oh, and scaling is much more than doubling/repeating pixels. It does not matter whether the scaling ratio is an integer or not; it is still against signal theory to just repeat pixels. They are interpolated through a filter, and the higher quality the scaler, the better/bigger the filter it uses. So even if you send 1920x1080p to a 4K TV, where the pixels could simply be doubled in the TV's scaler to the 3840x2160 native panel resolution, if you look closely it is still scaled through a filter and the PC desktop looks somehow blobby. Then again, a 4K monitor could be better than a 4K *TV*. I wish there were a setting to disable the scaling filter.
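
The pixel-repeat versus filtered-scaling difference is easy to reproduce on a PC. A small sketch with Pillow (assumes a reasonably recent Pillow; the file names are hypothetical):

from PIL import Image

src = Image.open("screenshot_640x480.png")    # hypothetical test capture

repeated = src.resize((1280, 960), resample=Image.Resampling.NEAREST)  # pure pixel repetition
filtered = src.resize((1280, 960), resample=Image.Resampling.LANCZOS)  # windowed-sinc filter

repeated.save("nearest.png")   # blocky but perfectly sharp
filtered.save("lanczos.png")   # smoother edges, slightly soft flat areas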

Reply 7 of 29, by obobskivich

User metadata
Rank l33t
Kreshna Aryaguna Nurzaman wrote:

Well, there are cases when a game - usually with an unofficial patch - can scale up to modern resolutions. WarCraft III and Crimson Skies, for example, can scale up to 1280x1024 internally. MDK, on the other hand, is fixed at 640x480, and so is System Shock I, whose highest resolution is 640x480.

All else being equal, which is better between DVI and HDMI?

Zero difference between DVI and HDMI - both use the same digital signal for video. HDMI can carry audio and lacks the ability to have VGA (DVI-I can carry VGA). Adapting either way is a non-issue, and adapters are generally very cheap (under $10).

Also note that 1280x1024 is not a 4:3 resolution - it's 5:4. If you're generating an image in 4:3 to scale higher I would suggest 1280x960 instead.
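
A quick arithmetic check of those ratios:

from math import gcd

for w, h in [(1280, 1024), (1280, 960), (640, 480)]:
    g = gcd(w, h)
    print(f"{w}x{h} = {w // g}:{h // g}")
# 1280x1024 = 5:4, while 1280x960 and 640x480 are both 4:3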

I see. Are there video scalers - VGA to HDMI or otherwise - that work better than a monitor's internal scaler? And at what cost, approximately? 😁

Highly variable - depends on the scaler, number of inputs, resolution I/O needs, etc.

Reply 8 of 29, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
obobskivich wrote:
Kreshna Aryaguna Nurzaman wrote:

Well, there are cases when a game - usually with an unofficial patch - can scale up to modern resolutions. WarCraft III and Crimson Skies, for example, can scale up to 1280x1024 internally. MDK, on the other hand, is fixed at 640x480, and so is System Shock I, whose highest resolution is 640x480.

All else being equal, which is better between DVI and HDMI?

Zero difference between DVI and HDMI - both use the same digital signal for video. HDMI can carry audio and lacks the ability to have VGA (DVI-I can carry VGA). Adapting either way is a non-issue, and adapters are generally very cheap (under $10).

Also note that 1280x1024 is not a 4:3 resolution - it's 5:4. If you're generating an image in 4:3 to scale higher I would suggest 1280x960 instead.

Indeed it's not. Maybe I'll have horizontal black bars when displaying in the correct aspect ratio. The black bars don't matter, though - it's the loss of clarity I'm concerned about. Even at 1920x1600 with two vertical black bars, 640x480 still needs to be scaled up, so I guess I have to deal with some loss of clarity, especially since scaling from 640x480 to 1600x1200 isn't a round-number multiplication.

obobskivich wrote:

I see. Are there video scalers - VGA to HDMI or otherwise - that work better than a monitor's internal scaler? And at what cost, approximately? 😁

Highly variable - depends on the scaler, number of inputs, resolution I/O needs, etc.

Let's say I only want VGA and DVI input, with 1600x1200 max resolution output, and with better quality scaling than a typical monitor's built-in scaler. How much does it cost, approximately?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 9 of 29, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
Jepael wrote:

Then there is the scaler chip. Some are cheap and thus low quality, some are expensive and studio quality.

So a studio-quality scaler is supposed to be better (at least less blurry) than a monitor's built-in scaler, am I correct? Wonder how much one would cost.... 🤣

Jepael wrote:

You asked about the HDMI/DVI difference: there really is very little, although sometimes monitors handle the connectors differently.
The physical signals are compatible; it's just that DVI is purely 8-bit RGB video, while HDMI also allows audio and metadata about the transmitted video to be sent, so it can support more than 8 bits and colorspaces other than RGB.

For example, I have some experience with one Samsung SyncMaster we had at work. It was one of those with a 1680x1050 resolution, and it also doubled as a TV, as it had composite, S-Video and component connectors as well. If I connected it to the PC via the DVI connector, I got a pixel-perfect native resolution of 1680x1050 but no sound. If I connected it via the HDMI connector, I got sound, but the monitor's HDMI port does not advertise the native resolution, so 1920x1080 and many other resolutions are available but everything from the PC gets scaled.

I see. I had a weird experience with a Samsung LED television. When using VGA, my laptop screen was displayed at full resolution. But when using HDMI, the resolution was smaller. I don't remember how much smaller, but my desktop area became smaller - with bigger icons and the like.

Jepael wrote:

And my friend had a monitor that allowed labeling the HDMI input as something like "DVD", "GAME" or "PC". When connected to a Raspberry Pi, with every image-processing feature turned off in the monitor menu, the picture still looked like it went through edge enhancement and whatnot, which is usually done for video sources, not graphics sources. You had to label the connected HDMI device as PC to get rid of the video-specific enhancements and see pure, native graphics.

Oh, and scaling is much more than doubling/repeating pixels. It does not matter whether the scaling ratio is an integer or not; it is still against signal theory to just repeat pixels. They are interpolated through a filter, and the higher quality the scaler, the better/bigger the filter it uses. So even if you send 1920x1080p to a 4K TV, where the pixels could simply be doubled in the TV's scaler to the 3840x2160 native panel resolution, if you look closely it is still scaled through a filter and the PC desktop looks somehow blobby. Then again, a 4K monitor could be better than a 4K *TV*. I wish there were a setting to disable the scaling filter.

I see.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 11 of 29, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
Mau1wurf1977 wrote:

TVs are different because there is overscan. You need to disable overscan in the TV and you need to disable underscan in the Video driver 😀

Huh? An LED TV still does overscan? I thought overscan was a CRT thing.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 12 of 29, by Mau1wurf1977

User metadata
Rank l33t++
Kreshna Aryaguna Nurzaman wrote:

Huh? An LED TV still does overscan? I thought overscan was a CRT thing.

Yup. Most TVs have a menu option though to change it to 1:1 pixel mapping. Best is to consult the manual.

And video cards underscan to compensate. Nvidia and AMD have a slider in the driver to adjust this. Set it to 0%.
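
For a rough sense of what overscan hides (the ~5% figure below is only a commonly quoted ballpark, not something from this thread):

def visible_area(width, height, overscan_pct=5.0):
    # The TV crops roughly this much off the edges and stretches the rest back out,
    # which is why a 1:1 pixel-mapping mode is needed for a sharp desktop.
    crop_w = round(width * overscan_pct / 100)
    crop_h = round(height * overscan_pct / 100)
    return width - crop_w, height - crop_h

print(visible_area(1920, 1080))   # ~(1824, 1026) of the frame actually survives the crop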

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 13 of 29, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
Mau1wurf1977 wrote:
Kreshna Aryaguna Nurzaman wrote:

Huh? An LED TV still does overscan? I thought overscan was a CRT thing.

Yup. Most TVs have a menu option though to change it to 1:1 pixel mapping. Best is to consult the manual.

And video cards underscan to compensate. Nvidia and AMD have a slider in the driver to adjust this. Set it to 0%.

I see, thanks! So if you disable overscan on LED TV, and disable underscan on the video card, then the TV will be just like a computer monitor, right?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 15 of 29, by obobskivich

User metadata
Rank l33t
Kreshna Aryaguna Nurzaman wrote:

Indeed it's not. Maybe I'll have horizontal black bars when displaying in the correct aspect ratio. The black bars don't matter, though - it's the loss of clarity I'm concerned about. Even at 1920x1600 with two vertical black bars, 640x480 still needs to be scaled up, so I guess I have to deal with some loss of clarity, especially since scaling from 640x480 to 1600x1200 isn't a round-number multiplication.

I've never had issues with non-round-numbers multiplication as long as it's square pixels and you aren't trying to generate 99% of the image through interpolation (e.g. scaling an 80x24 term window up to an IBM T220). You will, likely, still notice the image is not as sharp as if it were natively drawn, but that's to be expected (if you had a low resolution source and put it onto a big low resolution display it would also have "loss of clarity" unless you sat pretty far back).

Highly variable - depends on the scaler, number of inputs, resolution I/O needs, etc.

Let's say I only want VGA and DVI input, with 1600x1200 max resolution output, and with better quality scaling than a typical monitor's built-in scaler. How much does it cost, approximately?

Hundreds, probably, especially if you want to support non-TV resolutions (because it means you can't go cheap and use an AVR or some such). A nice new DVDO or similar starts around $400. 😊

Kreshna Aryaguna Nurzaman wrote:

So a studio-quality scaler is supposed to be better (at least less blurry) than a monitor's built-in scaler, am I correct? Wonder how much one would cost.... 🤣

I'd love to know what's meant by "studio quality" and how that's quantitatively/objectively defined first... 😵 (that's quickly becoming one of those meaningless phrases like "paradigm shifting" or "professional grade" - I blame Dr Dre)

Any sort of stand-alone scaler, even expensive DVDO/DCDi/JVC/etc boxes (which are mostly designed with projectors in mind) are designed for playback, not production - if video is being re-rendered at a different resolution (e.g. post production) that's done in software on a computer (or more likely a farm of computers, because nobody has time to wait for a single machine to complete its magical journey through time), or film itself can be enlarged through optical projection (and of course, there are limits in both situations where things "break down" and you just get a mess; the software is essentially replacing what an enlarger did years ago).

A proper square-pixel input should not look blurry on an LCD, unless the signal itself is blurry (e.g. low-fi ancient FMVs) - even old games should still look "crisp" being scaled, but of course won't have the same quality as a modern "HD" game with very high resolution textures, lighting, etc. The only thing I can think of that would produce a blurry image would be if the monitor isn't auto-adjusting to the analog input like it should (and the end result is an image that looks like the registration (I'm sure there's a proper term when referencing monitors, but I don't know it) is off - or "blurry"). A lot of newer monitors will auto-adjust automatically - that is, whenever the input via VGA changes or re-syncs they will re-adjust to help mitigate this, whereas older monitors tend to require manual control, such that if you change resolutions they may not have a perfect picture until you recycle the adjustment.

I see. I had a weird experience with a Samsung LED television. When using VGA, my laptop screen was displayed at full resolution. But when using HDMI, the resolution was smaller. I don't remember how much smaller, but my desktop area became smaller - with bigger icons and the like.

It's likely the TV (more specifically the panel) isn't actually delivering whatever resolution was stamped on the box - so say it says 1920x1080, it can take 1920x1080 as an input (via HDMI/DVI/whatever) but the panel is, say, only 1024x768 (and believe it or not there are TVs that really exist like this - modern HDTVs are notorious for out and out lying in their specs). It probably exposes the "real" resolution via the VGA connector, either because they were too cheap to spring for digitization of analog input into the scaler, or because it transmits honest EDID for VGA on the assumption it's going to hook up to a PC and nothing else (the reasoning for showing itself as a 1920x1080 device via HDMI, even if the panel isn't that resolution, is because source devices are designed on the assumption that they only have to drive a small handful of modes, instead of a PC which probably has a display mode list a mile long).
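
If you want to see what a display actually advertises, you can read its raw EDID and decode the preferred (usually native) mode from the first detailed timing descriptor. A hedged Python sketch (the sysfs path is a Linux-specific, illustrative example; other OSes expose EDID differently):

def preferred_mode(edid: bytes):
    dtd = edid[54:72]                            # first 18-byte detailed timing descriptor
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)   # lower 8 bits + upper 4 bits
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:   # illustrative path
    print(preferred_mode(f.read()))              # e.g. (1920, 1080) - or something much lower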

Mau1wurf1977's point about over/underscan is also valid depending on the TV, computer, etc and should be investigated as well (and is yet another reason that "TVs as monitors" can be a pain in the neck).

Reply 16 of 29, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t

Thank you for the replies, people. Many interesting points have been made, and there are many new things to learn. I've bookmarked this thread for future reference.

For now, it seems a monitor's built-in scaler should work fine.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 17 of 29, by obobskivich

User metadata
Rank l33t
Kreshna Aryaguna Nurzaman wrote:

For now, it seems a monitor's built-in scaler should work fine.

Honestly I'd be inclined to say "it should" as long as the monitor is new-ish (so you aren't using your LCD from 2001 or something), and actually has some sort of scaler (some super-high resolution (2560x1600 or above) monitors have no scaler, but they usually say so in their manual). 😀

Reply 18 of 29, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
obobskivich wrote:
Kreshna Aryaguna Nurzaman wrote:

For now, it seems a monitor's built-in scaler should work fine.

Honestly I'd be inclined to say "it should" as long as the monitor is new-ish (so you aren't using your LCD from 2001 or something),

It's a Dell P1914S, actually. And nope, this time I'm not paying a humongous S/H cost, since I bought the item locally. 🤣

obobskivich wrote:

and actually has some sort of scaler (some super-high resolution (2560x1600 or above) monitors have no scaler, but they usually say so in their manual). 😀

Huh? But why?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 19 of 29, by Holering

User metadata
Kreshna Aryaguna Nurzaman wrote:

So, how good is a VGA to HDMI scaler compared to a typical monitor's built-in scaler?

It can look better if, and only if, it does oversampling. I've never seen a monitor or GPU, or even software, that does oversampling. If a scaler does oversampling, you should get better results.

Let me explain a little why oversampling can make non-native resolutions look almost native (in some cases transparent). Oversampling upscales the original resolution by whatever multiple it's capable of, using nearest-neighbor pixels (e.g. 320x200 oversampled 10x becomes a native 3200x2000 without filtering). At that point it resamples (downscales) with a high-quality filter to achieve high-quality results at the target resolution (e.g. 3200x2000 gets downscaled to 1680x1050 with Lanczos resampling). So basically, if the scaler does 10x oversampling and you want 640x480 to fill up your 1600x1200 native display, you'd get: 640x480 scaled 10x becomes 6400x4800, which gets downscaled to 1600x1200. Higher oversampling gives better (more accurate) results.
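
A rough sketch of that oversample-then-downscale idea with Pillow (a recent Pillow is assumed, and the 10x factor, filter choice and file names are just examples):

from PIL import Image

def oversample_scale(img, target, factor=10):
    # Step 1: blow the image up by a clean integer factor with nearest neighbor,
    # so no information is invented or blended away (e.g. 640x480 -> 6400x4800).
    big = img.resize((img.width * factor, img.height * factor),
                     resample=Image.Resampling.NEAREST)
    # Step 2: downscale to the target with a high-quality filter.
    return big.resize(target, resample=Image.Resampling.LANCZOS)

src = Image.open("game_640x480.png")                    # hypothetical capture
oversample_scale(src, (1600, 1200)).save("oversampled_1600x1200.png")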

I think one of the biggest setbacks with digital displays (besides the lack of oversampling) is the lack of overscan and underscan options. What if a user wants 320x240 to fill up a native 640x400 screen? He/she should have the option to overscan those extra 80 lines. You might have 80 lines shaved off, but at least you'll have a razor-sharp screen with no filtering when scaling. This is easily possible, and takes much less processing than scaling. Heck, sometimes two pixels can make the biggest difference between razor sharp and ugly interpolation.
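
That overscan-instead-of-filtering idea in miniature, as a numpy sketch (the blank frame is only a stand-in for real 320x240 output):

import numpy as np

frame = np.zeros((240, 320, 3), dtype=np.uint8)          # 320x240 source

doubled = frame.repeat(2, axis=0).repeat(2, axis=1)      # 640x480 by pixel repetition - razor sharp
cropped = doubled[40:-40, :, :]                          # let 40 lines top and bottom run off a 640x400 panel
print(cropped.shape)                                     # (400, 640, 3)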