Kreshna Aryaguna Nurzaman wrote:
Indeed it's not. Maybe I'll have horizontal black bars when displaying in the correct aspect ratio. Black bars don't matter though - it's the loss of clarity I'm concerned about. Even at 1920x1600 with two vertical black bars, 640x480 still needs to be scaled up, and I guess I'll have to deal with loss of clarity, especially since scaling from 640x480 up to 1600x1200 isn't a round-number multiplication.
I've never had issues with non-round-number multiplication as long as it's square pixels and you aren't trying to generate 99% of the image through interpolation (e.g. scaling an 80x24 term window up to an IBM T220). You will likely still notice the image is not as sharp as if it were natively drawn, but that's to be expected (if you took a low-resolution source and put it onto a big low-resolution display it would also have "loss of clarity" unless you sat pretty far back).
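To put some numbers on it, here's a quick sketch in plain Python, using only the resolutions mentioned above, if you want to see exactly what the scale factors work out to:

[code]
from fractions import Fraction

# Resolutions from the posts above: the game's mode and the 4:3 target area.
src_w, src_h = 640, 480
dst_w, dst_h = 1600, 1200

scale_x = Fraction(dst_w, src_w)   # 5/2 = 2.5
scale_y = Fraction(dst_h, src_h)   # 5/2 = 2.5

print(f"horizontal scale: {scale_x} ({float(scale_x)}x)")
print(f"vertical scale:   {scale_y} ({float(scale_y)}x)")
print("aspect ratio preserved:", scale_x == scale_y)
print("integer multiple:", scale_x.denominator == 1)

# Integer-multiple sizes that would avoid interpolation entirely:
for factor in (2, 3):
    print(f"{factor}x integer scale -> {src_w * factor}x{src_h * factor}")
[/code]

Both axes come out to exactly 2.5x, so the aspect ratio is preserved - it's just not an integer multiple, which is why each source pixel can't map to a clean NxN block of screen pixels.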
Highly variable - depends on the scaler, number of inputs, resolution I/O needs, etc.
Let's say I only want VGA and DVI input, with 1600x1200 max resolution output, and with better quality scaling than a typical monitor's built-in scaler. How much does it cost, approximately?
Hundreds, probably, especially if you want to support non-TV resolutions (because it means you can't go cheap and use an AVR or somesuch). To buy a nice new DVDO or similar starts around $400. 😊
Kreshna Aryaguna Nurzaman wrote:
So a studio-quality scaler is supposed to be better (at least less blurry) than a monitor's built-in scaler, am I correct? I wonder how much one would cost.... 🤣
I'd love to know what's meant by "studio quality" and how that's quantitatively/objectively defined first... 😵 (that's quickly becoming one of those meaningless phrases like "paradigm shifting" or "professional grade" - I blame Dr Dre)
Any sort of stand-alone scaler, even the expensive DVDO/DCDi/JVC/etc boxes (which are mostly designed with projectors in mind), is designed for playback, not production. If video is being re-rendered at a different resolution (e.g. post production), that's done in software on a computer (or more likely a farm of computers, because nobody has time to wait for a single machine to complete its magical journey through time), or film itself can be enlarged through optical projection (and of course, there are limits in both situations where things "break down" and you just get a mess; the software is essentially replacing what an enlarger did years ago).
A proper square-pixel input should not look blurry on an LCD unless the signal itself is blurry (e.g. low-fi ancient FMVs) - even old games should still look "crisp" when scaled, though of course they won't have the same quality as a modern "HD" game with very high resolution textures, lighting, etc. The only thing I can think of that would produce a blurry image is the monitor not auto-adjusting to the analog input like it should (the end result is an image where the registration (I'm sure there's a proper term when referencing monitors, but I don't know it) is off - i.e. "blurry"). A lot of newer monitors will auto-adjust automatically - that is, whenever the input via VGA changes or re-syncs they will re-adjust to help mitigate this - whereas older monitors tend to require manual control, such that if you change resolutions they may not have a perfect picture until you re-run the adjustment.
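If you want to see the difference for yourself, here's a minimal sketch (assuming Python with Pillow installed, and "game.png" as a hypothetical 640x480 screenshot, not anything from a real setup) comparing a hard nearest-neighbour scale against the smoother filters a monitor's built-in scaler approximates:

[code]
# Quick side-by-side of scaling filters at the 2.5x step discussed above.
# Assumes Pillow is installed (pip install Pillow) and game.png is 640x480.
from PIL import Image

src = Image.open("game.png")    # 640x480 square-pixel source
target = (1600, 1200)           # 2.5x on both axes

# Nearest-neighbour keeps hard pixel edges, but at 2.5x the pixel widths
# come out uneven (alternating 2- and 3-pixel-wide columns).
nearest = src.resize(target, Image.Resampling.NEAREST)

# Bilinear/Lanczos smooth out the non-integer step, trading a little
# sharpness for evenness - roughly what a built-in scaler does.
bilinear = src.resize(target, Image.Resampling.BILINEAR)
lanczos = src.resize(target, Image.Resampling.LANCZOS)

nearest.save("game_nearest.png")
bilinear.save("game_bilinear.png")
lanczos.save("game_lanczos.png")
[/code]

The filtered versions are what "not as sharp as natively drawn, but not a blurry mess" looks like in practice; the nearest-neighbour one shows the unevenness you'd get by refusing to interpolate at a non-integer factor.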
I see. I had a weird experience with a Samsung LED television. When using VGA, my laptop screen is displayed at full resolution, but when using HDMI, the resolution is smaller. I don't remember how much smaller, but I remember my desktop area became smaller - with bigger icons and the like.
It's likely the TV (more specifically the panel) isn't actually delivering whatever resolution was stamped on the box - so say it claims 1920x1080: it can take 1920x1080 as an input (via HDMI/DVI/whatever), but the panel is, say, only 1024x768 (and believe it or not, TVs like this really exist - modern HDTVs are notorious for out-and-out lying in their specs). It probably exposes the "real" resolution via the VGA connector, either because they were too cheap to spring for digitization of the analog input into the scaler, or because it transmits honest EDID for VGA on the assumption it's going to hook up to a PC and nothing else (the reasoning for presenting itself as a 1920x1080 device via HDMI, even if the panel isn't that resolution, is that source devices are designed on the assumption that they only have to drive a small handful of modes, unlike a PC, which probably has a display mode list a mile long).
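If you're curious what the set is actually reporting on each connector, you can read the raw EDID and decode the preferred timing yourself. A rough sketch in Python, assuming a Linux box - the sysfs connector path is just an example and will differ per machine (check /sys/class/drm/):

[code]
# Pull the preferred (native) mode out of an EDID blob to compare what a TV
# advertises over HDMI vs VGA. Connector path below is hypothetical.
from pathlib import Path

EDID_PATH = Path("/sys/class/drm/card0-HDMI-A-1/edid")  # example path only

def preferred_mode(edid: bytes) -> tuple[int, int]:
    """Return (width, height) from the first detailed timing descriptor."""
    if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID block")
    dtd = edid[54:72]                          # first 18-byte descriptor
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

edid = EDID_PATH.read_bytes()
w, h = preferred_mode(edid)
print(f"display claims a native mode of {w}x{h}")
[/code]

Running it against the HDMI and VGA connector nodes (or against EDID dumps grabbed any other way) would show whether the TV is advertising different "native" modes on each input.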
Mau1wurf1977's point about over/underscan is also valid depending on the TV, computer, etc., and should be investigated as well (it's yet another reason that "TVs as monitors" can be a pain in the neck).