Reply 40 of 54, by badmojo

Rank: l33t
infiniteclouds wrote:

Is it my imagination or is this shimmering somewhat new to the last generation or two? I always remember having jaggies without AA but I don't remember the screen dancing even as I was standing still -- what the hell?

Not just you! Jaggies bother me but not nearly as much as the muddy, shimmering mess we've been offered in recent years.

Life? Don't talk to me about life.

Reply 41 of 54, by leileilol

Rank: l33t++

It's been worse since deferred rendering and screen-space shaders became popular (fast AO with low sample counts and screen-space reflections really bring out the shimmer). All that 'fast' AA tech seems moot against those; you'd have to SSAA it off.

long live PCem

Reply 42 of 54, by Scali

Rank: l33t

Yea, the jaggies are 'somewhat' new...
Thing is, for regular texturing, MSAA was developed as a good way to handle aliasing at polygon edges, while the interior of the polygon was antialiased by bilinear/trilinear/anisotropic filtering.
But then advanced shading methods appeared: bump mapping, displacement mapping, horizon mapping, etc. And guess what? MSAA doesn't handle those, and neither do the conventional texture filtering techniques. Why not? Because the 'bumps' aren't confined to polygon edges, which is all MSAA looks at. And the bumps aren't simple linear functions of the texels, so conventional filtering doesn't predict the aliasing either; it assumes a flat polygon surface.
So per-pixel lighting tricks are very susceptible to aliasing. And yes, deferred rendering techniques make it even more difficult, because the MSAA samples and other geometry info get lost between passes.
SSAA isn't the only way, but it certainly is a good way to solve it.
Other approaches include encoding extra data into your normalmaps and/or other textures, to avoid aliasing during shading itself.
I recall 3DMark05 using an interesting trick where they had 'denormalized' normalmaps. As in: the normals that were stored were not necessarily unit length. The variation in length would act as a scaling factor during shading. When the normalmap was sampled via a bilinear filter, the lengths were 'averaged', resulting in somewhat smoother shading.
Which was fun when ATi tried to claim that 3DMark05 was being unfair to them because it didn't use 3DC texture compression. But 3DC is designed to always compress/decompress unit vectors, so it was entirely incompatible with this antialiasing trick.
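Roughly, the idea looks like this (a sketch only, not 3DMark05's actual shader; the normals, light direction and specular exponent are invented for illustration):

```python
import numpy as np

# Two neighbouring unit normals from a bumpy normal map.
n0 = np.array([0.6, 0.0, 0.8])
n1 = np.array([-0.6, 0.0, 0.8])

# Bilinear filtering averages the stored vectors. Where the bumps
# disagree, the average comes out shorter than unit length, so
# |n| < 1 effectively encodes "this area is rough".
n_filtered = 0.5 * (n0 + n1)
scale = np.linalg.norm(n_filtered)   # 0.8 here instead of 1.0

# Shade with the normalized direction, but let the stored length damp
# the result: the highlight flattens out instead of shimmering.
light = np.array([0.0, 0.0, 1.0])    # made-up light direction
n_dir = n_filtered / scale
specular = scale * max(float(np.dot(n_dir, light)), 0.0) ** 32
```

Renormalizing after the fetch (which 3DC's unit-vector assumption effectively forces) throws that length information away, which is why the trick and the compression format couldn't coexist.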

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 43 of 54, by ZellSF

Rank: l33t
leileilol wrote:

It's been worse since deferred rendering and screen-space shaders became popular (fast AO with low sample counts and screen-space reflections really bring out the shimmer). All that 'fast' AA tech seems moot against those; you'd have to SSAA it off.

Temporal antialiasing works fine and is much cheaper than SSAA.

See AliasIsolation for a pretty good example of this. There's basically no aliasing.

Not that the people who are obsessed with sharpness will think that's an adequate solution.
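At its core TAA is just an exponential blend of sub-pixel-jittered frames into a history buffer. A minimal, self-contained sketch (the hard-edge "renderer" below is a stand-in for illustration, not anything from AliasIsolation):

```python
import numpy as np

rng = np.random.default_rng(0)

def render_jittered_frame(h=32, w=32):
    # Stand-in renderer: a hard diagonal edge sampled at one jittered
    # sub-pixel offset per frame -- the same sampling that causes the
    # frame-to-frame shimmer in the first place.
    y, x = np.mgrid[0:h, 0:w].astype(np.float32)
    jx, jy = rng.uniform(-0.5, 0.5, 2)
    return (x + jx > y + jy).astype(np.float32)

def taa_blend(history, current, alpha=0.1):
    # One TAA step: fold ~10% of the new frame into the running history.
    # Thanks to the jitter this converges toward a supersampled image
    # over time, for the cost of one history buffer rather than the
    # NxN extra samples SSAA renders every frame.
    return (1.0 - alpha) * history + alpha * current

history = render_jittered_frame()
for _ in range(60):
    history = taa_blend(history, render_jittered_frame())
# 'history' now holds smooth coverage values along the edge, where any
# single frame had a jagged hard step.
```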

Reply 45 of 54, by appiah4

Rank: l33t++

I find AMD's GPU Scaling option in the driver a more than satisfactory way of upscaling content that's below the native resolution of a fixed-resolution panel.

That said, I still use a 1080p panel with antialiasing on my RX480 8GB. Given the chance, I would immediately jump ship to a 4K panel though. People who say 4K does not make a difference or higher resolution does not increase image quality make no sense to me. More resolution is never worse, and no form of antialiasing that I have tried can be a substitute for higher resolution.

I absolutely hate most newer forms of AA such as MLAA, SMAA, FXAA and TXAA, by the way. SSAA was great, and MSAA was good enough for its time, but the options we have for the current rendering methods look like terrible smear filters to me.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 46 of 54, by realnc

Rank: Oldbie
ZellSF wrote:

Temporal antialiasing works fine and is much cheaper than SSAA.

See AliasIsolation for a pretty good example of this. There's basically no aliasing.

Not that the people who are obsessed with sharpness will think that's an adequate solution.

It's not just blur. The temporal artifacts can actually be worse than the blur. With TAA, you can get a trail of previous frames. "Ghosting." Some people think it's part of the game's aesthetic when they see this trail. Nope. It's a TAA artifact.

However, I still find this preferable to shimmering. The TAA artifacts are severe in Dragon Quest 11 for example, but I still use it. I just can't stand the shimmering.
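That trail is the history buffer outliving whatever moved in front of it. The usual counter-measure, sketched here from the general technique rather than any particular game's code, is to clamp the history sample to the range of the current frame's neighbourhood before blending:

```python
import numpy as np

def clamp_history(history, current):
    # Anti-ghosting step found in most TAA implementations: clamp each
    # history pixel to the min/max of the current frame's 3x3
    # neighbourhood (grayscale here for brevity). A stale colour left
    # behind by a moving object falls outside that range and is
    # clipped, cutting the ghost trail off almost immediately.
    h, w = current.shape
    lo = np.full_like(current, np.inf)
    hi = np.full_like(current, -np.inf)
    padded = np.pad(current, 1, mode='edge')
    for dy in range(3):
        for dx in range(3):
            win = padded[dy:dy + h, dx:dx + w]
            lo = np.minimum(lo, win)
            hi = np.maximum(hi, win)
    return np.clip(history, lo, hi)
```

How aggressively a game clamps is largely what decides whether you get ghosting (too little) or shimmer creeping back in (too much).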

Reply 47 of 54, by ZellSF

Rank: l33t
appiah4 wrote:

I find AMD's GPU Scaling option in the driver a more than satisfactory way of upscaling content that's below the native resolution of a fixed-resolution panel.

That said, I still use a 1080p panel with antialiasing on my RX480 8GB. Given the chance, I would immediately jump ship to a 4K panel though. People who say 4K does not make a difference or higher resolution does not increase image quality make no sense to me. More resolution is never worse, and no form of antialiasing that I have tried can be a substitute for higher resolution.

I absolutely hate most newer forms of AA such as MLAA, SMAA, FXAA and TXAA, by the way. SSAA was great, and MSAA was good enough for its time, but the options we have for the current rendering methods look like terrible smear filters to me.

Post-processing filters do look much worse on lower resolutions, so I wouldn't be surprised if you think they look like smear filters at 1080p. Obviously they're not perfect at 2160p either, but I would be very surprised if you noticed a good SMAA implementation unless you're sitting way too close to your monitor/TV.

Temporal antialiasing is different though; the motion artifacts those methods introduce are easier to notice, and there's a lot more that can go wrong in the implementation. I still think it should produce fewer artifacts the higher your framerate/resolution is, and it's still a better cost/performance ratio than going with a higher SSAA level.

appiah4 wrote:

I find AMD's GPU Scaling option in the driver a more than satisfactory way of upscaling content that's below the native resolution of a fixed-resolution panel.

Depends very much on your display. I prefer to just use GPU scaling so it's more consistent between my setups though.

Reply 49 of 54, by silikone

Rank: Member
infiniteclouds wrote:

Is it my imagination or is this shimmering somewhat new to the last generation or two? I always remember having jaggies without AA but I don't remember the screen dancing even as I was standing still -- what the hell?

Do you by any chance enable FXAA?
These intraframe post-process AA solutions do nothing but exacerbate the problem by cleaning up jaggies without solving the flickering and pixel crawling.
TAA, as blurry as it is, at least gets rid of shimmering, but the ghosting artifacts you can get often make it not worth it on their own, especially on top of the performance penalty.
4K is the only real solution to the visual garbage problem. I'm certainly not paying the premium for it, as I don't mind a little pure framebuffer aliasing.
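For reference, here's a caricature of what such an intraframe pass does (not the real FXAA algorithm): every decision is made from this frame's luminance alone, so when sub-pixel motion shifts an edge, the smoothed pixels shift with it the next frame and the crawl survives.

```python
import numpy as np

def spatial_aa(frame):
    # Caricature of an intraframe post-process AA pass (not actual
    # FXAA): find luminance steps against the horizontal neighbours
    # and blend across them. Every input comes from this single frame,
    # so the output is only as temporally stable as the input --
    # jaggies soften, pixel crawl remains.
    left = np.roll(frame, 1, axis=1)    # wrap-around at borders ignored
    right = np.roll(frame, -1, axis=1)
    edge = np.abs(left - right) > 0.25  # crude edge detect
    blended = 0.25 * left + 0.5 * frame + 0.25 * right
    return np.where(edge, blended, frame)
```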

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 50 of 54, by ZellSF

Rank: l33t
silikone wrote:
infiniteclouds wrote:

Is it my imagination or is this shimmering somewhat new to the last generation or two? I always remember having jaggies without AA but I don't remember the screen dancing even as I was standing still -- what the hell?

Do you by any chance enable FXAA?
These intraframe post-process AA solutions do nothing but exacerbate the problem by cleaning up jaggies without solving the flickering and pixel crawling.

From my experience that isn't true. Sure, post-process AA doesn't necessarily help with some types of aliasing, but it's pretty rare that it makes them worse.

At least not on 1440p/2160p displays; I haven't tested this much on 1080p displays.

Reply 51 of 54, by ZellSF

Rank: l33t

So I wanted to know if 1080p content could be made to look the way it does on a native 1080p display when shown on a 4K display, and I decided to try the simple thing:

[Attachment: test5.png, 1.81 MiB]

And it works pretty great: almost no perceived aliasing or blurring. Looks almost exactly like my 1080p monitor.

Unfortunately it has two pretty severe problems: like console scanlines it carries a pretty high brightness cost, and there's no practical way to use it for games.

Still, it would be more generally useful than the integer scaling everyone is begging AMD and Nvidia to implement. I wonder if most of them realize what they're asking for.
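For comparison, integer scaling itself is trivial, and the grid variant below is only a rough reconstruction of the experiment above (the actual mask used may well differ):

```python
import numpy as np

def integer_scale(img, factor):
    # Nearest-neighbour upscale by a whole multiple: every source pixel
    # becomes an exact factor x factor block, nothing is interpolated.
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def pixel_grid(img, factor):
    # Hypothetical reconstruction of the 1080p-grid-on-4K test: black
    # out the last row/column of every block, like console scanlines.
    # Those unlit lines are the brightness cost -- at 2x only 1 of
    # every 4 subpixels stays lit.
    out = integer_scale(img.astype(np.float32), factor)
    out[factor - 1::factor, :] = 0.0   # horizontal gap lines
    out[:, factor - 1::factor] = 0.0   # vertical gap lines
    return out
```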

Reply 52 of 54, by VileR

Rank: l33t

Not that I'm the target audience here, because I haven't looked all that deeply into the issues of upscaling 1080p material. But integer scaling is important enough when dealing with even lower resolution material (emulated older games, etc.). Optimally we'd be able to have non-interpolated scaling up to the nearest integer multiple, then interpolate (properly!) to the actual target resolution. And yes, in my case you may assume I realize what I'm asking for.

You can say I'm fairly obsessed with sharpness, but to be honest, interpolated ("blurry") scaling typically looks as horrible as it does only because there's no gamma correction before and after the filter is applied. When that's taken care of, the results are a lot better. If newer GPUs actually do this, I'd be pleasantly surprised.
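In code, gamma-correct interpolation is just decode, blend, re-encode (a sketch; 2.2 is the usual rough approximation of the sRGB curve):

```python
def lerp_gamma_correct(a, b, t, gamma=2.2):
    # Interpolate in linear light: decode the gamma-encoded values,
    # blend, re-encode. Blending the encoded values directly -- the
    # usual GPU-scaler shortcut -- darkens every mixed pixel, which is
    # a big part of why "blurry" scaling looks worse than it should.
    lin = (1 - t) * a ** gamma + t * b ** gamma
    return lin ** (1 / gamma)

# Midpoint between a black and a white pixel:
naive = 0.5 * 0.0 + 0.5 * 1.0                 # 0.5 displays as ~22% brightness
correct = lerp_gamma_correct(0.0, 1.0, 0.5)   # ~0.73 displays as 50% brightness
```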

[ WEB ] - [ BLOG ] - [ TUBE ] - [ CODE ]

Reply 53 of 54, by ZellSF

Rank: l33t
VileRancour wrote:

Not that I'm the target audience here, because I haven't looked all that deeply into the issues of upscaling 1080p material. But integer scaling is important enough when dealing with even lower resolution material (emulated older games, etc.).

Lower resolution content requires different scaling than higher resolution content. Also different content (subjectively) requires different scaling.

VileRancour wrote:

And yes, in my case you may assume I realize what I'm asking for.

Do you really prefer the nearest-neighbour scaled Legend of Grimrock 2 screenshot I posted to the Lanczos one? Or the "scanline" one?

VileRancour wrote:

You can say I'm fairly obsessed with sharpness, but to be honest, interpolated ("blurry") scaling typically looks as horrible as it does only because there's no gamma correction before and after the filter is applied. When that's taken care of, the results are a lot better. If newer GPUs actually do this, I'd be pleasantly surprised.

Not sure, but GPU scaling isn't impressive. It's simple bilinear scaling that's poorly suited for most content. People ask for it to be fixed by demanding integer scaling, which is also poorly suited for most content (and for most combinations of content resolution and display resolution).

Reply 54 of 54, by realnc

Rank: Oldbie
ZellSF wrote:

Unfortunately it has two pretty severe problems: like console scanlines it carries a pretty high brightness cost, and there's no practical way to use it for games.

I run some games in 1080p on a 1440p display, and I use ReShade's "Luma Sharpen" shader. To the naked eye, I can't tell much, if any, difference between native 1080p and 1080p upscaled to 1440p, unless I look at the image so close that my nose is touching the display.

A value of about 0.6 gives me the best results. The value you need might be different when upscaling to 4K though. It also depends on whether you're using GPU scaling or display scaling (some displays already apply some sharpening when upscaling, or use an upscaling method that is less blurry than the bilinear filter used by GPU scaling, so a lower Luma Sharpen value might be better.)

Some games have a built-in sharpening shader, in which case you could use that instead of ReShade.

Anyway, long story short: post-process sharpening works really well for running games at lower than native resolution!
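LumaSharpen boils down to an unsharp mask applied to the luminance channel only. A rough sketch of the idea (not ReShade's actual shader, which has more controls), using the 0.6 strength mentioned above:

```python
import numpy as np

def luma_sharpen(rgb, strength=0.6):
    # Unsharp mask on luma only: take the difference between the
    # luminance and a blurred copy of it, and add that detail back,
    # scaled by 'strength'. Sharpening luma but not chroma avoids
    # ringing in the colours.
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])   # Rec. 709 weights
    blur = luma.copy()
    for axis in (0, 1):   # cheap separable 3-tap box blur
        blur = (np.roll(blur, 1, axis) + blur + np.roll(blur, -1, axis)) / 3
    detail = luma - blur
    return np.clip(rgb + strength * detail[..., None], 0.0, 1.0)
```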