VOGONS


First post, by swaaye

Rank l33t++

This thread is about comparing GeForce and Radeon texture filtering, mostly on DirectX 9 and later cards.

---

Radeon HD cards have a texture LOD bug with the Source engine that causes extreme texture aliasing; textures are rendered way too sharp. This occurs in at least HL2 and Dark Messiah of Might & Magic. You need to drop the mip map quality slider down to Performance in the control panel, which from what I understand makes these cards behave more like Nvidia. I did some web searches and apparently this issue caused some controversy back then. It's not limited to Source, but there is something extra special going on there. I am not sure how the 5000 series behaves.

You will also almost certainly run into OpenGL issues with older games on a Radeon, especially after Catalyst 7.11 (which can only be used with the 3870 and older). 7.12 has an overhauled OpenGL driver that removed some old extensions, and it is also the breaking point for bump mapping in the D3D8 game Star Wars: Republic Commando.

The GTX 580 is an excellent card. Super fast and quiet. I did find that I needed to use older drivers for MSAA to work in HL2, though. It can obviously run quite a wide range of driver releases, and it conveniently works great with the last driver released for the 8800 series, which I also played with a bit. No visual issues with these cards.

Reply 1 of 27, by kolderman

Rank l33t

@swaaye which HD card did you use? I had heard of issues with the 4000 series, but the 5000 series has been stable for me, even with the latest drivers it supports (5850). I tend to play only DX9 games though.

Will the issue be obvious with a normal install of HL2 and standard settings? Do you have a link that discusses the issues?

Also, what was the last 8800 driver version for you? My search nets me 340.52 (2014).

Reply 2 of 27, by swaaye

Rank l33t++
kolderman wrote:

Will the issue be obvious with a normal install of HL2 and standard settings? Do you have a link that discusses the issues?

The problems I was seeing seemed to be a rather straightforward LOD bias quirk. For some reason, with some games, ATI has things cranked way too sharp and you get tons of texture aliasing. I think you will see the problems with any version of Half-Life 2. Also with Episode One. And certainly Dark Messiah.

Here's an image that makes it obvious. Look at the floor in the distance. If you can see that aliasing in a still image, guess how it looks in motion. I saw this with a 2900 XT and a 4890, and probably an X850 and X1950 too, but my brain is fuzzy on that.
lNe0y6e.jpg

Some links about the filtering on the HD cards. Mostly about the 4000-5000 cards.
(5000 series) http://www.gpu-tech.org/content.php/137-A-clo … -HD-5000-series
(5000 series) http://alienbabeltech.com/main/ati-5770-image … ality-analysis/
(4000-5000 series talk) https://forum.beyond3d.com/threads/nvidia-or- … g-broken.49106/
(7870) https://forums.guru3d.com/threads/the-great-l … st-know.398019/

I think there are a few different issues that were identified with the cards. The LOD thing is probably just their driver doing what they want (for some reason). Perhaps ATI identified that sharper LOD is more pleasing in screenshots and if you don't enable AF it looks better than plain bilinear/trilinear in most cases. This is remedied by dropping the mip mapping slider to Performance (IIRC). I am not sure if that is lower quality than what NVIDIA renders by default but it makes the Radeon image much less irritating. There are also some quirks with the mip mapping in general but those would be less obvious than this LOD issue.
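To make the LOD bias idea concrete, here is a minimal sketch of the standard mip level computation. The function name, the bias values, and the texels-per-pixel numbers are my own illustrative assumptions, not ATI's actual driver logic:

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, num_levels=10):
    # Standard LOD rule: lambda = log2(texel footprint per pixel) + bias,
    # clamped to the available mip chain. A negative bias selects a
    # lower (sharper) mip level than the footprint calls for.
    lam = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(lam, 0.0), float(num_levels - 1))

# A distant floor covering roughly 8 texels per pixel:
print(mip_level(8.0))        # bias 0: mip 3, about 1 texel per pixel after filtering
print(mip_level(8.0, -1.0))  # bias -1: mip 2, about 2 texels per pixel, so undersampling and shimmer
```

In this model, dropping the mip map slider to Performance would presumably just push the effective bias back toward (or past) zero.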

There was actually an official statement from ATI regarding this LOD sharpness for the X800 cards. So yeah it seems like a conscious decision to cause this problematic sharp filtering.
https://www.computerbase.de/2004-05/offiziell … exturfilterung/

Reply 3 of 27, by kolderman

Rank l33t

Thanks. I'm installing a version of HL2 from way back in late 2004 now (judging by file timestamps), on a WinXP box with a 5850. Latest drivers. I'll see how it looks. You don't have a comparison screenshot of that scene with a GTX280 by any chance?

Reply 4 of 27, by swaaye

Rank l33t++
kolderman wrote:

Thanks. I'm installing a version of HL2 from way back in late 2004 now (judging by file timestamps), on a WinXP box with a 5850. Latest drivers. I'll see how it looks. You don't have a comparison screenshot of that scene with a GTX280 by any chance?

No I don't have any shots. That one above is actually from the guru3d forum post. But I can say that the GeForce 8800 and 580 look nothing like that shot.

Reply 5 of 27, by kolderman

Rank l33t
swaaye wrote:

Here's an image that makes it obvious. Look at the floor in the distance. If you can see that aliasing in a still image, guess how it looks in motion. I saw this with a 2900 XT and a 4890, and probably an X850 and X1950 too, but my brain is fuzzy on that.

@swaaye I just took some screenshots from plain 2004 HL2, specs as mentioned before (5850 latest drivers, dx9, winXP). I don't see the aliasing, it just blurs into the distance. I had both AI texture quality and mipmap setting set to highest quality. Changing both to lowest (performance) did not appear to result in any noticeable difference. In-game settings were bog standard except for changing res to 1280x1024 and 4x AA.

What do you think?

Attachments

Reply 6 of 27, by swaaye

Rank l33t++
kolderman wrote:

@swaaye I just took some screenshots from plain 2004 HL2, specs as mentioned before (5850 latest drivers, dx9, winXP). I don't see the aliasing, it just blurs into the distance. I had both AI texture quality and mipmap setting set to highest quality. Changing both to lowest (performance) did not appear to result in any noticeable difference. In-game settings were bog standard except for changing res to 1280x1024 and 4x AA.

What do you think?

Try turning on anisotropic filtering in the game and see what happens.

kolderman wrote:

The 5870 draws a lot less, but what is interesting to me is that the 5850 uses much less power again (almost 40 W less) than the 5870 while not being much slower, certainly not an issue for XP-era games, which is why I settled on the 5770/5850 for that era. There may be issues with DX8 games on the HD 5000 series, but I only play DX9 titles there.

Yeah, the 580 is a power hog, but the stock cooler is actually less irritating than what you'll typically find on ATI cards; the 580 is surprisingly quiet. Its fan also runs at a fairly consistent speed instead of ramping up and down a lot. Your 58x0 card is certainly one of ATI's better efforts though.

Reply 7 of 27, by kolderman

Rank l33t
swaaye wrote:
kolderman wrote:

@swaaye I just took some screenshots from plain 2004 HL2, specs as mentioned before (5850 latest drivers, dx9, winXP). I don't see the aliasing, it just blurs into the distance. I had both AI texture quality and mipmap setting set to highest quality. Changing both to lowest (performance) did not appear to result in any noticeable difference. In-game settings were bog standard except for changing res to 1280x1024 and 4x AA.

What do you think?

Try turning on anisotropic filtering in the game.

OK, I'll try that. Here it is with AA turned off, to show it wasn't being blurred out that way. You can clearly see the AA difference in the cable across the floor, but the tiles look, if anything, more blurred with AA off.

Attachments

Reply 8 of 27, by swaaye

Rank l33t++

Yeah that looks like clean trilinear filtering.

I think the textures in the screenshot I posted are higher resolution. Must be better textures included with the modern versions of HL2.

Reply 9 of 27, by kolderman

Rank l33t

So I forced 16x AF on in both the driver and the game, and they were about equal. There is some very faint aliasing, though not quite as much as in your image.

Attachments

Reply 11 of 27, by swaaye

Rank l33t++

I decided to record the GTX 580 and 4890 doing the same little walk forward in HL2 EP1. Unfortunately the PCIe x16 slot on my DFI NF4 board has broken in some way (it stopped POSTing with a card in that slot), so I had to build a new setup. Now I am using an i5-2500K and can capture at 1080p 60 fps.

I also re-encoded them with FFmpeg at a very high quality H.264 setting instead of uploading the raw FRAPS files, because the result is about 1/15th the original size. Quality is basically unchanged.

Attachment: 4890 Cat 13.1.mp4_snapshot_00.00_[2019.09.21_13.38.24].jpg (482.91 KiB, fair use/fair dealing exception)

1080p60 recordings with identical settings. 4X MSAA, 16X AF, Very High texture quality
(MediaFire-hosted MP4 files, 50-60 MB)
4890_Cat_13.1.mp4
GTX_580_391.35.mp4

The GTX 580 has considerably less texture shimmering all over the scene.

Reply 13 of 27, by swaaye

Rank l33t++

I found it quite interesting to run into this. I'd been seeing people complaining about shimmering textures on their Radeons for years but I didn't really notice it until playing these Source engine games on various cards in a short period of time. You can even find me doubting the complaints in that old Beyond3D thread I posted earlier. 🤣

Here are two more videos of that HL2 EP1 scene.
8800 Ultra
Radeon X1950

The X1950 looks perhaps slightly better than the 4890. There is less texture swimming on the ground at medium distance; perhaps an AF mip map transition filtering quirk of the RV7x0 chips. The X1950 also slows down to about 45 fps from a solid 60 fps when FRAPS starts recording, for some reason (hence the choppy video). The 8800 Ultra looks a lot like the GTX 580's output.

Attached below is the save file for the spot I used for the videos.

Attachments

Reply 14 of 27, by Scali

Rank l33t

In short, it's a complex combination of drivers and hardware.
There will always be limitations on filtering inherent in the hardware design.
But what makes it more difficult is that filtering is nice 'low-hanging fruit' in terms of saving bandwidth and thereby increasing performance.
So the filter hardware tends to be somewhat configurable/programmable. How configurable/programmable it is exactly, only the IHV knows.
The end user may get a slider for 'texture quality' and perhaps some other options, but these are rarely documented much, if at all.
There's no way to know whether setting the controls to maximum quality actually puts the hardware in its best possible quality mode. On top of that, there's no way to know whether these settings are applied in all games, or whether application-specific overrides are in place.

Especially in those days, say the SM2.0/3.0/4.0 days, quality was up for grabs.
Sometimes even a driver update could yield different texturing results.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 15 of 27, by leileilol

Rank l33t++

I studied the GeForce FX's filtering at one point. It's the same variable-steps-between-texel-color-difference approach the PowerVR chips use, but with a higher maximum step count. The blockiness is most visible on lightmaps with smooth light changes.

Attachments: gffx-texel16overblended.png (4.32 KiB), gffx-texel.png (11.87 KiB), geforc.png (53.48 KiB). File license: fair use/fair dealing exception.

I had originally intended to make fragment shaders to replicate the filtering but couldn't get the right math down. Maybe a 512x256 texture LUT could do this (a 256x256 per coord, going from 0 to 255 difference to 0 to 0 difference)
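As a toy model of that idea, here is one way the quantization might work: snap the bilinear blend fraction to a step count that depends on the texel difference. The step function is entirely my guess, not measured FX or PowerVR behavior, but it reproduces the banding-on-smooth-gradients symptom:

```python
def quantized_lerp(a, b, frac, max_steps=16):
    # Toy model: the blend fraction is snapped to a number of discrete
    # steps that grows with the color difference between the two texels,
    # up to some hardware maximum. Small differences (smooth lightmaps)
    # get very few steps, which shows up as visible banding.
    diff = abs(a - b)
    steps = max(1, min(max_steps, diff))
    q = round(frac * steps) / steps
    return round(a + (b - a) * q)

# Large difference: plenty of steps, close to true bilinear.
print(quantized_lerp(0, 255, 0.5))  # 128
# Tiny difference: only 2 usable steps, so a smooth ramp turns blocky.
print([quantized_lerp(100, 102, f / 10) for f in range(11)])
```

A real implementation would apply this per color channel and twice per bilinear fetch (once per axis), but the banding mechanism is the same.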

Also, for comparing LODs, one could try to assemble a thread full of Q3 devmap q3dm1 r_colorMipLevels 1 shots 😉 Here's a GeForce2...

Attachment: geforce2-mip-blinlinear.png (177.54 KiB, fair use/fair dealing exception)

long live PCem

Reply 17 of 27, by kolderman

Rank l33t
kolderman wrote:

Found a gtx 560 at a local buy/sell shop. This will be my afternoon project.

And here are the results, using the latest drivers for the 8800 Ultra (from 2014, which also support the 560). 16x AF, with and without 4x AA.

I am not convinced the GF is better. It seems to have a bit of aliasing around the cable, and gets blurrier at distance than the 5850.

What do you think?

Attachments

Reply 18 of 27, by swaaye

Rank l33t++

The reason it is less detailed at distance than the 5850 is that the Radeons run a LOD bias that makes the mip maps sharper. This is also part of the cause of the extreme moiré patterns with AF enabled. Read some of the Beyond3D thread I posted.

I suppose it's subjective as to which is better. The Radeons will be sharper, but I can't stand the moiré patterns when AF is enabled. It is possible to change the LOD bias on both Radeon and GeForce cards; the mip map quality slider does that on ATI, IIRC.

However, there are some problems with the filtering in general on the HD 5000 series and older: mip map boundaries are sometimes visible, particularly with AF enabled.

ATI claimed that the HD 6000 series improved filtering to some degree. Unfortunately my old 6950 died early this year.

Attachments

• AF.jpg (249.25 KiB, fair use/fair dealing exception)

Reply 19 of 27, by kolderman

Rank l33t

I wonder if ATI made some driver improvements years after those articles were written. My next plan is to try Republic Commando on both and see if the DX8 support is still broken. Does the GOG release of that game make it use DX9 or newer, or will it still have the problem?

I am also running the D3D AF tester to try to reproduce the findings of those articles, but on the newest drivers for the 5850.