Reply 20 of 23, by VileR

Rank: l33t
ZellSF wrote:
VileRancour wrote:

'Blurry' (bilinear/trilinear/etc.) interpolation doesn't smooth out curves in any case - it softens the edges of every sampled point, which is a completely different thing.

Technically, yes. Perceptually, no, which is the important thing when discussing image quality.

For me there's quite a difference perceptually. If I play an older 3D game these days, my preference is to use a newer source port if one is available, and dial it up to my monitor's native resolution. When it comes to lines and curves, the benefit is that they're ultra-sharp and fine - with blurry interpolation, the effect I get is the opposite. Of course nearest neighbor scaling can't achieve that either, because it retains the quantized aliasing of the lower-res original, but when native resolution isn't available I don't see myself preferring the blur in any way, at least not the common implementations I tend to see.

ZellSF wrote:

You're right, but that wasn't the point I was sarcastically making here. It was that if you prioritize sharpness above all else, that's not going to get you good image quality. When choosing a scaling algorithm, you have to consider what would produce a better image, not what would produce more sharpness.

"Better" is a function of your use-case. For instance, If I want to enlarge a photographic still image, I probably wouldn't use nearest-neighbor even at integer factors. For a 3D game however, which (a) was rendered as an array of discrete pixels in the first place and (b) is full of constantly-moving detail, I can't be convinced that a constant blur factor applied to the resulting 2D surface (as opposed to say, individual textures) is better. And that's besides the fact that I find it subjectively unpleasant and disorienting; there's a reason people get prescription glasses when they need them.

ZellSF wrote:

Not that even your technically correct way of scaling the image would do what the OP wants: replicate a native resolution picture. Because that sharpness also comes from the pixel structure of the display. CRTs, for all the praise they get about always being at native resolution, basically do the same thing as blurry upscaling on a high-DPI monitor: they add a ton of image information that wasn't actually there.

Yes, and for games that were released in the CRT era you could make the case that this was at least the expected (if not intended) appearance. The thing is that the effect of a CRT display isn't really similar to bi-/tri-/*-linear interpolated scaling. In the vertical domain, scanlines make sure that pixels are even *more* cleanly separated from each other, not less. And while they do get blended horizontally (and even vertically at higher resolutions, if the beam isn't fine enough), the blending is additive rather than averaging.
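As a very rough illustration of that additive behaviour (Python with numpy assumed; the scanline brightness values, scale factor and beam width are made up), each source scanline can be modelled as a narrow Gaussian "beam" on a taller output grid, with overlapping beams summed rather than averaged the way bilinear filtering averages neighbouring samples:

import numpy as np

src = np.array([0.0, 1.0, 0.2, 1.0])      # brightness of 4 source scanlines
scale = 8                                  # output lines per source scanline
beam_sigma = 1.5                           # beam spot size in output lines

out_h = len(src) * scale
y = np.arange(out_h)
out = np.zeros(out_h)

for i, level in enumerate(src):
    centre = i * scale + scale / 2         # where this scanline lands
    profile = np.exp(-((y - centre) ** 2) / (2 * beam_sigma ** 2))
    out += level * profile                 # additive blending of the beams

print(np.round(out, 2))  # bright bands with dark gaps = visible scanlines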

If someone really wanted to go that route, those CRT simulation shaders/filters are pretty good these days and do take care to do it properly. The problem is that they require very high scaling factors to do their job well...


Reply 21 of 23, by ZellSF

Rank: l33t
VileRancour wrote:

And that's besides the fact that I find it subjectively unpleasant and disorienting; there's a reason people get prescription glasses when they need them.

While it might not be very relevant to the point you were making, I still want to point out a technical (sort of) difference, since, well, you did that to me earlier. A lot of people get prescription glasses because they actually can't make out details at all, which isn't the case when we're talking about the difference between scaling algorithms.

Yes, I know some people get glasses because of eye strain issues too, but I'm not sure that's linked to differences between scaling algorithms either.

VileRancour wrote:

The thing is that the effect of a CRT display isn't really similar to bi-/tri-/*-linear interpolated scaling.

That's a weirdly specific statement. A CRT display isn't similar to integer scaling either; why not include that in the list?

VileRancour wrote:

If someone really wanted to go that route, those CRT simulation shaders/filters are pretty good these days and do take care to do it properly. The problem is that they require very high scaling factors to do their job well...

That's far from the only problem; the biggest problem is that there aren't a whole lot of ways to apply CRT shaders to PC games, and the ones that exist are problematic.

ReShade has a CRT shader, but it needs the game to be DX9+/OpenGL, and it can only work as a post-processing effect, meaning a 640x480 game would only have 640x480 pixels to create that CRT effect with (hint: it doesn't work). And if you prescale the image with some other software (which has all sorts of issues of its own), the CRT shader isn't aware of what you've done and can't work properly.
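A back-of-the-envelope way to see that scale-factor point (hypothetical Python helper, just for illustration): a CRT effect needs several output lines per source scanline to draw the beam shape and the gaps between scanlines, and a post-process shader that only sees the game's own framebuffer gets exactly one.

def lines_per_scanline(game_height, output_height):
    # how many output lines each source scanline has available
    return output_height / game_height

print(lines_per_scanline(480, 480))    # 1.0 -> post-processing at native res: nothing to work with
print(lines_per_scanline(480, 1440))   # 3.0 -> barely enough for a crude scanline
print(lines_per_scanline(480, 2160))   # 4.5 -> better, but still coarse for a beam profile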

Other than that we have what, GeDoSaTo, which is D3D9 only, and a CRT shader with zero customizability in dgVoodoo2.

It's not great, and there isn't even a single solution yet for simulating a low-resolution (1920x1080 and below) LCD monitor, which is what a lot of the games you'd have "problems" with on a 1440p monitor were designed for. Not that a 1440p display would be adequate for even the crudest approximation of a low-resolution monitor.

VileRancour wrote:

"Better" is a function of your use-case

Subjectively, of course. But there are objective criteria too, like, as I said, what looks the closest to how the image is meant to look. Plenty of people prefer to deviate from that, me included (eye strain means I have adjusted brightness/contrast so any really bright scene looks rather dull).

Reply 22 of 23, by SPBHM

Rank: Oldbie
ZellSF wrote:
SPBHM wrote:

1440 SHOULD be a good res to scale, 2x 720, 3x 480, 1.5x 960

What? No. The others are fine, but a 1.5x scale gives you no advantages whatsoever. Also worth noting that 16x9 480p is slightly problematic (2560 / 3 doesn't work). Not a big deal if you can use custom resolutions though.

SPBHM wrote:

but who knows if the monitor is going to scale it properly or just blur everything (more likely)

What the proper scaling method is depends on the content (as explained, "blur everything" is pretty nice for a lot of content), and I don't think there's a single monitor that allows you to change scaling modes to suit the content.

Which is why I use GPU scaling + software-based scaling to get some flexibility, but since the OP's using XP, his options are very limited.

Have you tried 1.5x? It looks pretty good to me - not as good as 2x, but much better than the regular blurry scaling you get.

For me, proper scaling is handling 800x600 on a 1600x1200 screen as 2x, without the typical bilinear filtering or whatever they do to make the image look blurry. That's OK for video, but it's not my preferred method for gaming. The nice thing is always to have options and not just one method.
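As a quick sketch of that "2x without filtering" idea (hypothetical Python helper, just to show the arithmetic): find the largest integer factor that fits the display and report the leftover border, which is roughly what driver-level integer scaling does.

def integer_fit(src_w, src_h, dst_w, dst_h):
    # largest integer factor that fits, plus the unused border in pixels
    factor = min(dst_w // src_w, dst_h // src_h)
    return factor, (dst_w - src_w * factor, dst_h - src_h * factor)

print(integer_fit(800, 600, 1600, 1200))   # (2, (0, 0))     -> exact 2x, no border
print(integer_fit(1280, 720, 2560, 1440))  # (2, (0, 0))     -> exact 2x
print(integer_fit(854, 480, 2560, 1440))   # (2, (852, 480)) -> the 16x9 480p problem mentioned earlier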

Also in fresh news, Nvidia is adding integer scaling to their driver (Intel announced they are doing the same a few months back):
https://www.techpowerup.com/258440/nvidia-pre … raphics-drivers

For Windows 10 borderless windowed mode, this already works OK:
https://store.steampowered.com/app/993090/Lossless_Scaling/

Reply 23 of 23, by ZellSF

Rank: l33t
SPBHM wrote:

Have you tried 1.5x? It looks pretty good to me - not as good as 2x, but much better than the regular blurry scaling you get.

I don't particularly have to; as I pointed out, there's nothing magic about a 1.5x scale. You'll get the same inconsistencies you get from any non-integer scale.

But I did try it now (because making theoretical arguments without verifying them when I can is a bit dumb), and yup: it looks awful. Scaling 1280x960 to 3840x2160 looked cleaner, despite that supposedly being a worse scale factor if you go by the argument that *.5x scale factors are somehow better.
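A small sketch of where those inconsistencies come from (hypothetical Python helper): at a 1.5x nearest-neighbor scale, source pixels map to alternating 1- and 2-pixel-wide columns, so equal-sized details end up unequal on screen, while at 2x every source pixel gets exactly two columns.

def column_widths(src_width, factor):
    # output width of each source pixel column after nearest-neighbor scaling
    edges = [int(x * factor) for x in range(src_width + 1)]
    return [b - a for a, b in zip(edges, edges[1:])]

print(column_widths(8, 1.5))  # [1, 2, 1, 2, 1, 2, 1, 2] -> uneven pixel sizes
print(column_widths(8, 2.0))  # [2, 2, 2, 2, 2, 2, 2, 2] -> uniform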

SPBHM wrote:

Also in fresh news, Nvidia is adding integer scaling to their driver

For Turing cards only. Not that I've ever found a game needing it that doesn't already have an existing integer scaling solution.