VOGONS

Reply 20 of 54, by dr_st


The theory behind avoiding anything but native resolution is obvious, given that an LCD has a fixed pixel grid.

No matter what process the GPU uses to produce the image, at the end it outputs a matrix of pixels of a given size to the screen; at that point, if the size of the image matrix does not match the size of the LCD pixel matrix, the screen has to interpolate, which means that the final image you see is not what the GPU sent, but rather one where additional interpolation artifacts were introduced.

This is most evident when working with text, where crispness is paramount; it is less evident, and less important, with games or movies, since interpolation and smoothing are already inherent to the image-generation process, so yet another level introduced by the LCD is typically not destructive.

The theory behind integer scaling is also obvious - the LCD can simply display every logical pixel in a group of 4 (or 9, or 16) physical pixels, thus avoiding any interpolation artifacts. However, the theory breaks down when subpixel rendering is used (which is almost all the time when working with text on modern operating systems).
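To make that concrete, here is a minimal Python sketch of the mapping (an illustration only, not code from any actual scaler; the names are made up):

```python
def integer_scale(image, k):
    """Nearest-neighbour integer scaling: physical pixel (x, y) simply copies
    logical pixel (x // k, y // k), so no new colour values are ever invented."""
    height, width = len(image), len(image[0])
    return [[image[y // k][x // k] for x in range(width * k)]
            for y in range(height * k)]

# A 2x2 "logical" image scaled by k=2: every value becomes a 2x2 block of itself.
logical = [[1, 2],
           [3, 4]]
print(integer_scale(logical, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```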

I don't know whether subpixel rendering applies to games/movies and to what extent, but since the problem is less acute there to begin with, it may not be a dealbreaker in any case.

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 21 of 54, by BeginnerGuy

ZellSF wrote:
BeginnerGuy wrote:
ZellSF wrote:

Not a typo. There's no reason to stick to common resolutions on LCDs.

Why is that? I prefer native 1440 over interpolated 1800P,

There is a reason to stick to native resolution, but there is no reason to stick to other common resolutions is what I meant.

At any rate, an 1800p image scaled to 2160p still has more detail than a native 1440p image. Not saying you're wrong in preferring a lower-resolution image, of course.

BeginnerGuy wrote:

What does it have to do with AAA titles?

You can't play the latest AAA titles at 4K, which is why people are saying 4K is stupid, but there's more to games than that, as you mention: older games or indie titles that you can easily play in 4K. Yes, I'll admit I lost the thread of the conversation here; we might have been replying to different things.

BeginnerGuy wrote:

I perceive the difference, which goes right back to why I said YMMV. I can see it, many others can see it. Also, if the resolution of the monitor and the textures were high enough, why would you need interpolated scaling? The point of the highest resolutions is to make aliasing imperceptible, is it not?

If your hardware isn't good enough or your software doesn't support the higher resolution. Once you're not going for native resolution, what's important is going as high a resolution as you can while not sacrificing anything else.

I'm hoping you're misunderstanding what you're replying to here, though: are you saying you could see the difference between an interpolated 1080p>2160p scale and an interpolated 1200p>2160p scale, and that the 1080p>2160p scale would be cleaner because of a better scaling ratio? I don't believe you (as for the many others, it's the first time I've heard anyone claim this, and I really like discussions like this).

Only reason to stick to perfect scaling ratios is for integer scaling (which is not what I'm talking about above). Which is nice when you have pixel art based games and awful when you have low-res (1080p) 3D rendered games.

Hmm no, you'll have to forgive me, as my head is buried in a thick fog of flu medicine... It seems we agree on pretty much everything, and somehow I had it backwards.

- I always argue that when 4K at 60 FPS (and higher) is affordable and viable, all arguments against 4K will vanish. So yep, I agree with you about AAA titles: it's a beautiful resolution to play at once you remove cost (and refresh rates, which will come soon enough). In fact, what I was saying to the OP originally was that even at 27" I would look forward to resolutions above 4K; even something as high as 8K *may* be noticeable to me.

As for the scaling, I think I had even mentioned before that on many of the 4K monitors I've seen, even 1080p scales up poorly WITH pixel art, because the monitor isn't doing the 4:1 integer scaling it could be doing; there still seems to be interpolation happening. On my LG, the moment you drop the resolution to anything under 3840x2160, there is an immediate and extremely noticeable loss in clarity that worsens the further you go. I'm not sure of the proper term for it; it's almost like when I'm at the eye doctor and they flick one lens too far on the phoropter and everything goes from sharp back to blurry. 1080p doesn't look like native 1080p to me, though in theory it could.

What I was trying to say is that native looks better to me on any monitor. I have right in this house a 1080P 144hz monitor, 1440P 60hz (IPS type), 4k 60hz (IPS), all 27". My 1080P monitor is ugly in general (older TN high refresh) so I'll leave that out. At 1440p, side by side with a 4k monitor, the 1440 native monitor looks far sharper to me, the difference is significant.

Maybe where you don't believe me (and it could just be placebo) is that I will generally choose the 1440p native monitor over 1800P on my 4k monitor. Admittedly there could be variables to that because I'm able to use AA and higher texture quality and lighting at that resolution, though I do have it in my head that it just plain looks better. I'm quite convinced it does, but maybe a higher end 4k monitor would scale better - I have a "budget" LG 27UD58P, the cheapest 4k IPS I could find by far.

Sup. I like computers. Are you a computer?

Reply 22 of 54, by ZellSF

BeginnerGuy wrote:

At 1440p, side by side with a 4k monitor, the 1440 native monitor looks far sharper to me, the difference is significant.

Well obviously. They're both running at the same resolution, but one is running at native resolution which is an advantage. I never said native resolution wasn't an advantage, I'm saying it's much less of an advantage than the significant resolution difference between 1440p and 1800p.

BeginnerGuy wrote:

Admittedly there could be variables to that because I'm able to use AA and higher texture quality and lighting at that resolution

Applying different levels of antialiasing would screw up the comparison.

BeginnerGuy wrote:

though I do have it in my head that it just plain looks better.

It can subjectively look better even if it's objectively lower quality.

dr_st wrote:

The theory behind integer scaling is also obvious - the LCD can simply display every logical pixel in a group of 4 (or 9, or 16) physical pixels, thus avoiding any interpolation artifacts. However, the theory breaks down when subpixel rendering is used (which is almost all the time when working with text on modern operating systems).

I don't know whether subpixel rendering applies to games/movies and to what extent, but since the problem is less acute there to begin with, it may not be a dealbreaker in any case.

Well it's easy to test how a 3D game would look integer scaled vs an interpolated scale:

[Attached screenshots: nearest.png (nearest-neighbour/integer scale) and lanczos.png (Lanczos interpolated scale)]

The integer scale image might be sharper but if you're close enough to see the difference, you're likely also close enough to see all the terrible aliasing.

That's a more ideal interpolated scale (Lanczos) than you're getting out of current AMD/Nvidia drivers, of course, but you're not getting integer scaling out of them either. Puzzlingly enough, there's a huge movement asking for integer scaling in AMD/Nvidia drivers and basically no one asking for better interpolated scaling algorithms.
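For anyone who wants to reproduce this kind of comparison, a minimal sketch assuming a reasonably recent Pillow install (the input filename is just a placeholder):

```python
from PIL import Image  # pip install Pillow

src = Image.open("shot_1080p.png")           # e.g. a 1920x1080 game screenshot
target = (src.width * 2, src.height * 2)     # 3840x2160

# Integer-style scale: perfectly sharp, but keeps every aliased edge intact.
src.resize(target, resample=Image.Resampling.NEAREST).save("nearest.png")

# Lanczos scale: softer, closer to a good interpolating scaler.
src.resize(target, resample=Image.Resampling.LANCZOS).save("lanczos.png")
```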

Integer scaling is, as you mention, better for 2D games, but there are already plenty of workarounds for integer scaling those (dgVoodoo, DxWnd, IntegerScaler).

Reply 23 of 54, by dr_st

ZellSF wrote:

The integer scale image might be sharper but if you're close enough to see the difference, you're likely also close enough to see all the terrible aliasing.

It should not be one or the other. You can use any kind of interpolation/AA algorithm on the GPU and still avoid interpolation by the LCD, by choosing a resolution that is either native or native divided by an integer.

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 24 of 54, by Scali

dr_st wrote:

but still want to avoid interpolation by the LCD, by choosing a resolution that is either native or an integer quotient of the native.

I think the only way you're not getting interpolation out of the LCD is if your image is already 'pixel-doubled' before you send it to the LCD.
As in, you can send a 1080p image to a 2160p screen, but then it will interpolate every second pixel, presumably with bilinear interpolation.
If however you send a 540p image, but pixel-double it first to 1080p, then effectively the bilinear interpolation becomes a 'nop', because it interpolates between two pixels that are equal.
But in my opinion it would be better to just handle it entirely on the GPU and send it out as native, to avoid the LCD's scaling algorithm altogether, and remove another unknown from the equation.
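A minimal numpy sketch of the pixel-doubling idea (an illustration only, not anything from a driver or panel):

```python
import numpy as np

def pixel_double(frame, factor=2):
    """Duplicate each pixel of an (H, W, 3) frame into a factor x factor block."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

frame_540p = np.zeros((540, 960, 3), dtype=np.uint8)   # dummy 960x540 frame
frame_1080p = pixel_double(frame_540p)                 # now 1920x1080
print(frame_1080p.shape)                               # (1080, 1920, 3)

# Why the panel's bilinear pass then does no visible damage: linear interpolation
# between two equal samples just returns that sample.
def lerp(a, b, t):
    return (1 - t) * a + t * b

print(lerp(200.0, 200.0, 0.5))   # 200.0 -- effectively a 'nop'
```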

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 25 of 54, by ZellSF


It was an example image to show exactly what happens when you integer scale a low resolution (and 1080p is low resolution). You see the aliasing very clearly. I can see it clearly on a 4k@27" display.

In a more extreme example, you can integer scale 240p to 1200p, but the base resolution isn't enough to get perfectly smooth edges. 1080p isn't a good enough resolution for that either. The pixel structure of a 1080p native panel just helps, letting your mind fill in the blanks, sort of like using scanlines for 240p titles.

A 1080p 3D rendered scene needs interpolation of some kind to be accurate (to the object displayed; not to the original 1080p image) on a 4K monitor. Integer scaling isn't a solution.

Scali wrote:

I think the only way you're not getting interpolation out of the LCD is if your image is already 'pixel-doubled' before you send it to the LCD.
As in, you can send a 1080p image to a 2160p screen, but then it will interpolate every second pixel, presumably with bilinear interpolation.
If however you send a 540p image, but pixel-double it first to 1080p, then effectively the bilinear interpolation becomes a 'nop', because it interpolates between two pixels that are equal.
But in my opinion it would be better to just handle it entirely on the GPU and send it out as native, to avoid the LCD's scaling algorithm altogether, and remove another unknown from the equation.

Yes, GPU drivers pretty much always do an interpolated scale, regardless of whether or not it is an integer scaling ratio. This is better for a majority of content.

Reply 26 of 54, by dr_st

ZellSF wrote:

It was an example image to show exactly what happens when you integer scale a low resolution (and 1080p is low resolution). You see the aliasing very clearly. I can see it clearly on a 4k@27" display.

Having thought about it again, I think I understand you.

Even if we assume that 1080p on a 2160p display of a given size looks exactly like 1080p on a 1080p display of the same size (i.e. 100% sharp), your claim is that it still looks worse than a higher res (between 1080p and 2160p) with interpolation (whether done by GPU or LCD algorithms); and that is simply because larger pixel size = more visible edges and jaggies.

Am I getting your point correctly?

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 27 of 54, by infiniteclouds

dr_st wrote:
infiniteclouds wrote:

It seems many of the games I play from 2016 onward have some of the worst, horrific aliasing imaginable. Sure, older games benefited significantly from AA but nowadays a scene without AA isn't 'jaggies' it is a shimmering, vomit-inducing, shim, shim, shimmery eye massacre.

Could you post some examples?

Dishonored 2 has such horrible aliasing without the sub-par AA methods I mentioned that it looks like the world is melting. Video capture is really more accurate than any screenshot, because unlike simple jaggies, it's the shimmering while moving around that drives me crazy. The new RE2 demo also has pretty awful shimmering if you turn off the AA. I'm not sure what it is exactly, but old-school aliasing/jaggies were never as offensive as this, and they could be remedied, without blurring the image as much, by other forms of AA that aren't compatible with deferred shading. The only solution seems to be higher resolutions or supersample scaling, which seems ridiculous to me, especially given the price tag attached.

Going back to my example, I absolutely feel that the first Dishonored game (2012) with the settings maxed out at 1080p looks better than its 2016 sequel once you have to disable AA.

Reply 28 of 54, by The Serpent Rider


They could have a huge boost in visual realism if they allowed game developers to push down the resolution

Photorealistic 320x200!

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 29 of 54, by Scali

The Serpent Rider wrote:

Photorealistic 320x200!

Heh, reminds me of the BluReu demo on C64... 160x200, 16 colour fixed palette:
https://youtu.be/W0TFsyR4YL8

Contains some real footage, so definitely 'photorealistic'.
It's amazing how much you can get out of so little 😀

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 30 of 54, by ZellSF

dr_st wrote:

Even if we assume that 1080p on a 2160p display of a given size looks exactly like 1080p on a 1080p display of the same size (i.e. 100% sharp)

Um, on an ideal gaming display (as close as you can get without having to move your head to see anything) then 1920x1080 isn't very sharp at all. And I don't believe it's possible to reproduce the pixel structure of a 1080p display on a 2160p display anyway.

dr_st wrote:

your claim is that it still looks worse than a higher res (between 1080p and 2160p) with interpolation (whether done by GPU or LCD algorithms)

My claim is that even at 1080p, interpolated scaling looks better than a perfect sharp scale because the perfectly sharp scale just gives you aliasing.

Perfectly scaling 1080p>2160p is one of those things that sound like a nice idea, but for the most part just isn't. Exception being pixel art based games of course (and even that is subjective).

dr_st wrote:

and that is simply because larger pixel size = more visible edges and jaggies.

That is how resolution works, yes?

dr_st wrote:
infiniteclouds wrote:

It seems many of the games I play from 2016 onward have some of the worst, horrific aliasing imaginable. Sure, older games benefited significantly from AA but nowadays a scene without AA isn't 'jaggies' it is a shimmering, vomit-inducing, shim, shim, shimmery eye massacre.

Could you post some examples?

I have always held the opinion that the ROI for using AA is pretty low, given its high computational cost, since most of the "terrible jaggies" people tend to post in AA-vs-no-AA images were rather mild, of the kind that seemed to only be able to bother someone if they compulsively focused on them (rather than enjoying the game).

Missed this, but try playing Alien Isolation with the built-in anti-aliasing at 1080p. It will definitely look distractingly bad at times. It even looks bad at 2160p, which is why there's thankfully a mod to add temporal anti-aliasing.

Or you can just watch a comparison video the guy who decided to fix the problem made:
https://www.youtube.com/watch?v=G6Aq7Ayoqvo
Just so many things are moving that shouldn't be.

Reply 31 of 54, by dr_st


^^
I'm starting to think that my opinion on the merits of AA would have been different if supporters had been making their case with videos rather than stills. 😀

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 33 of 54, by dr_st


No, I'm referring to every previous discussion where folks tried to post screenshots to demonstrate the horrors of aliasing. I found none of them even remotely convincing compared to the video in your previous post.

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 34 of 54, by keenmaster486


Ultimately this question comes down to: how big physically is your monitor, and how far away are you sitting from it?

If your monitor is big enough and you're sitting close enough to it, you'll be able to tell the difference between ridiculous resolutions like 8k vs 16k or something.

I think we can agree that the theoretical maximum usable resolution is the maximum resolution the human eye can resolve when the screen and sitting position make the screen fill our entire field of vision, all the way from left to right, etc. You could calculate this using trigonometry and the size of our rods and cones. It would be a ridiculously large screen and, honestly, a crap ton of resolution. It'd be very expensive now, but someday we'll reach that limit, and at that point everything else will be marketing.
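A back-of-the-envelope version of that calculation, as a minimal Python sketch (it assumes the common ~1 arcminute / 60 pixels-per-degree rule of thumb for 20/20 vision and a made-up field of view, and ignores the flat-screen trigonometry toward the edges, so the numbers are rough):

```python
PIXELS_PER_DEGREE = 60   # ~1 arcminute per pixel, a common 20/20 acuity rule of thumb

def pixels_needed(fov_h_deg, fov_v_deg):
    """Resolution at which a screen filling this field of view hits the acuity limit."""
    return round(fov_h_deg * PIXELS_PER_DEGREE), round(fov_v_deg * PIXELS_PER_DEGREE)

# A screen covering most of the binocular field of view, say 180 x 120 degrees:
print(pixels_needed(180, 120))   # (10800, 7200) -- ~78 megapixels, vs ~8.3 for 4K
```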

World's foremost 486 enjoyer.

Reply 35 of 54, by vvbee

The Serpent Rider wrote:

They could have a huge boost in visual realism if they allowed game developers to push down the resolution

Photorealistic 320x200!

Well, 320 x 200 is less than 1% of 4k, so you don't need to go that low, but you will be able to tell the lack of realism much better at 4k.
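(320 × 200 = 64,000 pixels vs. 3840 × 2160 = 8,294,400 for 4K, i.e. roughly 0.77%.)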

Reply 36 of 54, by infiniteclouds

ZellSF wrote:
dr_st wrote:

Even if we assume that 1080p on a 2160p display of a given size looks exactly like 1080p on a 1080p display of the same size (i.e. 100% sharp)

Um, on an ideal gaming display (as close as you can get without having to move your head to see anything) then 1920x1080 isn't very sharp at all. And I don't believe it's possible to reproduce the pixel structure of a 1080p display on a 2160p display anyway.

dr_st wrote:

your claim is that it still looks worse than a higher res (between 1080p and 2160p) with interpolation (whether done by GPU or LCD algorithms)

My claim is that even at 1080p, interpolated scaling looks better than a perfect sharp scale because the perfectly sharp scale just gives you aliasing.

Perfectly scaling 1080p>2160p is one of those things that sound like a nice idea, but for the most part just isn't. Exception being pixel art based games of course (and even that is subjective).

dr_st wrote:

and that is simply because larger pixel size = more visible edges and jaggies.

That is how resolution works, yes?

dr_st wrote:
infiniteclouds wrote:

It seems many of the games I play from 2016 onward have some of the worst, horrific aliasing imaginable. Sure, older games benefited significantly from AA but nowadays a scene without AA isn't 'jaggies' it is a shimmering, vomit-inducing, shim, shim, shimmery eye massacre.

Could you post some examples?

I have always held the opinion that the ROI for using AA is pretty low, given its high computational cost, since most of the "terrible jaggies" people tend to post in AA-vs-no-AA images were rather mild, of the kind that seemed to only be able to bother someone if they compulsively focused on them (rather than enjoying the game).

Missed this, but try playing Alien Isolation with the built-in anti-aliasing at 1080p. It will definitely look distractingly bad at times. It even looks bad at 2160p, which is why there's thankfully a mod to add temporal anti-aliasing.

Or you can just watch a comparison video the guy who decided to fix the problem made:
https://www.youtube.com/watch?v=G6Aq7Ayoqvo
Just so many things are moving that shouldn't be.

Oh man, Alien Isolation was awful with this -- such a great game too, it was a shame. This was the game where I first discovered that forcing CSAA/MSAA from the Nvidia control panel would break the game because of its deferred rendering. I am hugely surprised that it was still bad at 4K though.

Reply 38 of 54, by realnc


4K isn't even nearly enough to prevent aliasing. Here's a test:

https://www.testufo.com/aliasing-visibility#f … g=0&thickness=1

Even on smaller 4K displays, the shimmer is still there. I don't know what pixel density you'd need to get rid of it. I don't have a high-PPI phone to test. But if someone here has one (like a high-end iPhone), they can test it.

However, even at 4K, many games will not use shaders at full resolution for performance reasons. They only render geometry at full res, but shader effects are done at lower res. So shader aliasing won't get better even at super high resolutions in many games.

Reply 39 of 54, by infiniteclouds


Is it my imagination, or is this shimmering somewhat new to the last generation or two? I always remember having jaggies without AA, but I don't remember the screen dancing even while I was standing still -- what the hell?