Reply 20 of 23, by VileR
wrote:
wrote: 'Blurry' (bilinear/trilinear/etc.) interpolation doesn't smooth out curves in any case - it softens the edges of every sampled point, which is a completely different thing.
Technically, yes. Perceptually, no, which is the important thing when discussing image quality.
For me there's quite a difference perceptually. If I play an older 3D game these days, my preference is to use a newer source port if one is available and dial it up to my monitor's native resolution. When it comes to lines and curves, the benefit is that they're ultra-sharp and fine - with blurry interpolation, the effect I get is the opposite. Of course nearest-neighbor scaling can't achieve that either, because it retains the quantized aliasing of the lower-res original, but when native resolution isn't available I don't see myself preferring the blur in any way, at least not with the common implementations I tend to see.
wrote:You're right, but that wasn't the point I was sarcastically making here. It was that if you prioritize sharpness above all else, that's not going to get you good image quality. For scaling algorithms, you have to consider what would produce a better image, not what would produce more sharpness.
"Better" is a function of your use-case. For instance, If I want to enlarge a photographic still image, I probably wouldn't use nearest-neighbor even at integer factors. For a 3D game however, which (a) was rendered as an array of discrete pixels in the first place and (b) is full of constantly-moving detail, I can't be convinced that a constant blur factor applied to the resulting 2D surface (as opposed to say, individual textures) is better. And that's besides the fact that I find it subjectively unpleasant and disorienting; there's a reason people get prescription glasses when they need them.
wrote:Not that even your technically correct way of scaling the image would do what the OP wants: replicate a native-resolution picture. That sharpness also comes from the pixel structure of the display. CRTs, for all the praise they get for being native resolution all the time, basically do the same thing as blurry upscaling on a high-DPI monitor: they add a ton of image information that wasn't actually there.
Yes, and for games that were released in the CRT era you could make the case that this was at least the expected (if not intended) appearance. The thing is that the effect of a CRT display isn't really similar to bi-/tri-/*-linear interpolated scaling. In the vertical domain, scanlines make sure that pixels are even *more* cleanly separated from each other, not less. And while they do get blended horizontally (and even vertically at higher resolutions, if the beam isn't fine enough), the blending is additive rather than averaging.
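As a rough illustration of what I mean by additive blending, here's a toy one-scanline sketch of my own (nothing like a proper CRT shader): each lit pixel contributes a little spot of light, and where neighboring spots overlap, the light sums instead of being pulled toward an average the way a bilinear filter does.

```python
# Toy one-scanline sketch (my own illustration, not a real CRT model): each
# source pixel is drawn as a Gaussian "beam spot", and overlapping spots ADD
# light, instead of being averaged the way a bilinear filter averages samples.
import numpy as np

def crt_row_additive(row: np.ndarray, k: int, spot_sigma: float = 0.45) -> np.ndarray:
    """Upscale one scanline by factor k: every source pixel contributes a
    Gaussian spot of light centred on its position; contributions accumulate."""
    w = row.size
    x_out = (np.arange(w * k) + 0.5) / k          # output positions in source units
    out = np.zeros(w * k)
    for i, value in enumerate(row):
        centre = i + 0.5
        out += value * np.exp(-0.5 * ((x_out - centre) / spot_sigma) ** 2)
    return out                                     # overlaps can sum past 1.0

row = np.array([0.0, 1.0, 1.0, 0.0])               # two adjacent lit pixels
print(np.round(crt_row_additive(row, 4), 2))
# Between the two lit pixels the output exceeds either input value, because the
# spots add; an averaging filter can never exceed the brighter of its inputs.
```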
If someone really wanted to go that route, those CRT simulation shaders/filters are pretty good these days and do take care to do it properly. The problem is that they require very high scaling factors to do their job well...