VOGONS


Reply 40 of 47, by agent_x007

Rank: Oldbie

@Ozzuneoj "Standard" setting mode only gives me 4 options (on FX card) :
AA mode (Off/2x/2xQ/4x/[8xS])
AF mode (Off/x2/x4/x8)
Image settings (High Performance/Performance/Quality/High Quality)
Vsync (Off/On)
By default they are all "application controlled", aside from Image settings, which is set to the "Quality" option.
^This is also how I tested both cards.

The Advanced options menu has:
AA mode (Off/2x/2xQ/4x/[8xS])
AF mode (Off/x2/x4/x8)
Image settings (High Performance/Performance/Quality/High Quality)
Color profile : (N/A) [greyed out in my case]
Vsync (Off/On)
Force mipmaps (None/Bilinear/Trilinear)
Conformant texture clamp (Off/On')
Extension limit (Off'/On)
Trilinear optimization (Off/On')
Anisotropic map filter optimization (Off/On')
Anisotropic sample optimization (Off'/On)
Negative LOD Bias (Clamp/Allow')
Hardware acceleration (Single monitor)

Last edited by agent_x007 on 2026-02-06, 20:01. Edited 13 times in total.

Reply 41 of 47, by shevalier

Rank: Oldbie

predator_085 wrote on Yesterday, 18:24:

The Ultra series seems to be a neat topic but they are too expensive for me

If you look at the Radeon vs. FX comparisons, the Ultra series is really the "regular" line-up that was supposed to compete with ATI's graphics cards.
At the time, we all had the stripped-down FX cards (not Ultras), which gave the impression that the FX series was useless.

Aopen MX3S, PIII-S Tualatin 1133, Radeon 9800Pro@XT BIOS, Audigy 4 SB0610
JetWay K8T8AS, Athlon DH-E6 3000+, Radeon HD2600Pro AGP, Audigy 2 Value SB0400
Gigabyte Ga-k8n51gmf, Turion64 ML-30@2.2GHz , Radeon X800GTO PL16, Diamond monster sound MX300

Reply 42 of 47, by agent_x007

Rank: Oldbie

Ozzuneoj wrote on Yesterday, 17:53:

Regardless of the setting, I'd be curious to see how the filtering quality compares between the FX 5600 Ultra and a Geforce4 Ti given the huge performance difference. There was so much goofy stuff going on back then with the filtering optimizations... I remember some cards clearly not applying AF at certain angles to save performance. I think that may have been the Radeon 8500 series... can't remember exactly.

So, it seems like maybe the FX and Ti series are the same in high quality mode. Not sure how they compare at different settings.

EDIT: Here is the D3D AF Tester shown in some of the screenshots in that thread...
https://www.3dcenter.org/download/d3d-af-tester
I'll also attach it here for posterity.

I checked; here are the results:

Ti 4800 SE :

The attachment AF tester.PNG is no longer available
The attachment AF tester FC.PNG is no longer available

FX 5600 Ultra :

The attachment AF tester FC.PNG is no longer available
The attachment AF tester.PNG is no longer available

I made two screenshots because I wasn't sure which one would be more visible. I didn't change any advanced settings, so this should represent the image quality at the performance I got.

My GF4 Ti 4800 SE does NOT have these options available (they are present for the FX series card) on the driver I used:
1) Trilinear optimization (Off/On)
2) Anisotropic map filter optimization (Off/On)
3) Anisotropic sample optimization (Off/On)
4) Negative LOD Bias (Clamp/Allow)
^I highlighted the settings that are shown after ticking the "Advanced" option box on the FX 5600 Ultra.

Reply 43 of 47, by Ozzuneoj

Rank: l33t

agent_x007 wrote on Yesterday, 19:47:
I checked, here are results : […]

Wow, that was quick!

Thank you for the screenshots. It definitely looks like the Geforce4 has more blending between the stages, with the FX having harder lines. This may be due to some of the optimizations that are selected, or it may just be inherent to the way the FX series does AF. I have no idea if this would be visible in actual games though.
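
For reference, the "harder lines" look is what a narrowed trilinear blend band (so-called "brilinear" filtering) would produce. A toy Python sketch of the idea, purely illustrative (real driver behaviour is closed source, and the band parameter is invented):

def mip_blend_weight(lod, band=1.0):
    # Blend factor between mip level floor(lod) and floor(lod) + 1.
    # band=1.0 is full trilinear (blending across the whole LOD range);
    # a smaller band approximates a "brilinear" optimization: pure
    # bilinear over most of the range, with only a narrow blend window
    # around each mip transition, which shows up as harder lines.
    frac = lod - int(lod)               # fractional part of the LOD
    lo = (1.0 - band) / 2.0             # start of the blend window
    hi = 1.0 - lo                       # end of the blend window
    if frac <= lo:
        return 0.0                      # stay on the lower mip
    if frac >= hi:
        return 1.0                      # snap to the upper mip
    return (frac - lo) / (hi - lo)      # blend inside the window

# Full trilinear vs. a narrowed band:
for lod in (2.1, 2.3, 2.5, 2.7, 2.9):
    print(lod, mip_blend_weight(lod, 1.0), mip_blend_weight(lod, 0.3))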

This could at least partially explain how the FX series is able to run so much faster at higher AF levels despite the lower texel fill rate.

Now for some blitting from the back buffer.

Reply 44 of 47, by agent_x007

Rank: Oldbie

After disabling the AF options that only the FX has, it looks like this:

The attachment AF tester opts OFF.PNG is no longer available
The attachment AF tester opts OFF FC.PNG is no longer available

^Tweaked settings :
1) Trilinear optimization (Off/On)
2) Anisotropic map filter optimization (Off/On)
3) Anisotropic sample optimization (Off/On)
4) Negative LOD Bias (Clamp/Allow) (<= this one is to match GF4)

Reply 45 of 47, by Ozzuneoj

Rank: l33t

agent_x007 wrote on Yesterday, 21:36:
After disabling AF options which only FX uses it looks like this : […]

Oh, nice!

It's interesting that the top image (with the red background) shows a very different pattern from the Geforce 4 Ti regardless of the optimizations, but the lower picture looks basically the same between the two once the optimizations are turned off on the FX.
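
As an aside, the pattern in that red-background (AF degree) view mostly reflects how the hardware estimates the anisotropy of the texture footprint. A toy sketch of the two general approaches (illustrative math only, not either vendor's actual implementation):

import math

# "Exact" anisotropy: ratio of the singular values of the texture-
# coordinate Jacobian [[du/dx, dv/dx], [du/dy, dv/dy]].
def aniso_exact(dudx, dvdx, dudy, dvdy):
    s = dudx**2 + dvdx**2 + dudy**2 + dvdy**2
    t = math.hypot(dudx**2 + dvdx**2 - dudy**2 - dvdy**2,
                   2 * (dudx * dudy + dvdx * dvdy))
    smax = math.sqrt((s + t) / 2)
    smin = math.sqrt(max((s - t) / 2, 1e-12))
    return smax / smin

# Cheap approximation: compare only the two screen-axis footprint
# lengths. This under-estimates anisotropy for footprints that lie
# diagonally in screen space, which is the kind of shortcut that
# produces angle-dependent patterns in testers like D3D AF Tester.
def aniso_cheap(dudx, dvdx, dudy, dvdy):
    lx = math.hypot(dudx, dvdx)
    ly = math.hypot(dudy, dvdy)
    return max(lx, ly) / max(min(lx, ly), 1e-6)

For an axis-aligned floor both give the same answer; for a sheared, diagonal footprint (say dudx=1, dvdx=0, dudy=0.9, dvdy=0.1) the cheap version reports ~1.1x while the exact one reports ~18x.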

This begs the question of course... how much does performance change with the optimizations off? That seems to be the closest representation of image quality between the FX and the Geforce 4 (based on the test images with the black and white background).

Of course, if there is no noticeable difference in image quality during gameplay with those optimizations enabled or disabled, then the benchmark numbers with optimizations on are a perfectly valid comparison as well. 🙂

Now for some blitting from the back buffer.

Reply 46 of 47, by agent_x007

Rank: Oldbie

Well, 3DMark 01SE doesn't give a... about the optimizations (scores are the same).

The attachment 3DMark 01SE AF x8 no optims.PNG is no longer available

11620 (with opts) vs. 11701 (without opts).

Codecreatures though :

The attachment Codecreatures AF x8 no optim.PNG is no longer available

2665 (with opts) vs. 2067 (without opts), i.e. without them you get 78% of the optimized performance 😁

As always, everything depends on the engine/program...

Reply 47 of 47, by Ozzuneoj

Rank: l33t

agent_x007 wrote on Yesterday, 22:52:
Well, 3DMark 01SE doesn't give a... about optimization (scores are the same). […]

More interesting results! Thank you!

The Codecreatures benchmark seems to make sense to me. The 5600 Ultra has a much higher pixel fill rate (since your GF4 is currently clocked at 250 MHz core) and quite a bit more memory bandwidth, but the lower texel fill rate would bring the anisotropic filtering performance back down... so they aren't that far apart at that point.
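
To put rough numbers on that reasoning (the 4x1 pipeline/TMU configuration for the 5600 Ultra's NV31 and 4x2 for the Ti 4800 SE's NV28 are the commonly cited ones; the clocks are assumptions, 250 MHz being the downclocked GF4 mentioned above):

def fill_rates(core_mhz, pipes, tmus_per_pipe):
    pixel = core_mhz * pipes                  # MPixels/s
    texel = core_mhz * pipes * tmus_per_pipe  # MTexels/s
    return pixel, texel

print(fill_rates(250, pipes=4, tmus_per_pipe=2))  # Ti 4800 SE @ 250 MHz -> (1000, 2000)
print(fill_rates(400, pipes=4, tmus_per_pipe=1))  # FX 5600 Ultra -> (1600, 1600)

So the FX leads on raw pixel fill but trails on texel fill, which is exactly the trade-off that heavy anisotropic filtering exposes.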

There can definitely be a lot of variation between games/programs as to how they respond to various graphical settings and tweaks.

Still... for 3DMark to show no performance drop at all when disabling the optimizations is pretty suspicious. To be honest, I would not be surprised if in this case the driver is optimizing specifically for 3DMark. It might be worth checking whether the driver has an existing profile for 3DMark 2001SE that turns those filtering optimizations on. Even if it doesn't have a profile, there was so much garbage going on at the time, with companies trying to get an edge in 3DMark to "win" the benchmarks, that I wouldn't put it past them to have hard-coded the optimizations when running 3DMark, regardless of the visible settings.
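
The simplest form of such a profile is just matching on the executable name. A minimal sketch of the general technique (entirely hypothetical: the profile contents are made up and the real driver internals were never published):

import os, sys

APP_PROFILES = {
    # lowercased executable name -> settings the driver would force
    "3dmark2001se.exe": {"trilinear_optimization": True,
                         "aniso_sample_optimization": True},
}

def forced_settings():
    exe = os.path.basename(sys.argv[0]).lower()
    return APP_PROFILES.get(exe, {})  # empty -> honour user settings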

Hard to forget stuff like this with the 43.51 and 44.03 drivers in 2003:
https://www.overclockers.com/forums/threads/f … 4/#post-1797376


What Are The Identified Cheats?

Futuremark’s audit revealed cheats in NVIDIA Detonator FX 44.03 and 43.51 WHQL drivers. Earlier GeForceFX drivers include only some of the cheats listed below.

1. The loading screen of the 3DMark03 test is detected by the driver. This is used by the driver to disregard the back buffer clear command that 3DMark03 gives. This incorrectly reduces the workload. However, if the loading screen is rendered in a different manner, the driver seems to fail to detect 3DMark03, and performs the back buffer clear command as instructed.

2. A vertex shader used in game test 2 (P_Pointsprite.vsh) is detected by the driver. In this case the driver uses instructions contained in the driver to determine when to obey the back buffer clear command and when not to. If the back buffer would not be cleared at all in game test 2, the stars in the view of outer space in some cameras would appear smeared as have been reported in the articles mentioned earlier. Back buffer clearing is turned off and on again so that the back buffer is cleared only when the default benchmark cameras show outer space. In free camera mode one can keep the camera outside the spaceship through the entire test, and see how the sky smearing is turned on and off.

3. A vertex shader used in game test 4 (M_HDRsky.vsh) is detected. In this case the driver adds two static clipping planes to reduce the workload. The clipping planes are placed so that the sky is cut out just beyond what is visible in the default camera angles. Again, using the free camera one can look at the sky to see it abruptly cut off. Screenshot of this view was also reported in the ExtremeTech and Beyond3D articles. This cheat was introduced in the 43.51 drivers as far as we know.

4. In game test 4, the water pixel shader (M_Water.psh) is detected. The driver uses this detection to artificially achieve a large performance boost - more than doubling the early frame rate on some systems. In our inspection we noticed a difference in the rendering when compared either to the DirectX reference rasterizer or to those of other hardware. It appears the water shader is being totally discarded and replaced with an alternative more efficient shader implemented in the drivers themselves. The drivers produce a similar looking rendering, but not an identical one.

5. In game test 4 there is detection of a pixel shader (m_HDRSky.psh). Again it appears the shader is being totally discarded and replaced with an alternative more efficient shader in a similar fashion to the water pixel shader above. The rendering looks similar, but it is not identical.

6. A vertex shader (G_MetalCubeLit.vsh) is detected in game test 1. Preventing this detection proved to reduce the frame rate with these drivers, but we have not yet determined the cause.

7. A vertex shader in game test 3 (G_PaintBaked.vsh) is detected, and preventing this detection drops the scores with these drivers. This cheat causes the back buffer clearing to be disregarded; we are not yet aware of any other cheats.

8. The vertex and pixel shaders used in the 3DMark03 feature tests are also detected by the driver. When we prevented this detection, the performance dropped by more than a factor of two in the 2.0 pixel shader test.
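
The recurring mechanism in that list is shader fingerprinting: hash whatever shader the application submits and look it up in a table baked into the driver. A conceptual sketch (the digests and actions are invented; none of this is the actual Detonator code):

import hashlib

KNOWN_SHADERS = {
    # digest of a submitted shader -> special-case behaviour
    "0f3a...": "replace_with_cheaper_water_shader",
    "9c1b...": "skip_back_buffer_clear",
}

def special_case_for(shader_source):
    digest = hashlib.sha1(shader_source.encode()).hexdigest()
    return KNOWN_SHADERS.get(digest)  # None -> render as submitted

This also matches the audit's wording about "preventing this detection": change the shader so the digest no longer matches, the lookup misses, and the driver falls back to rendering what it was actually given.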

Now for some blitting from the back buffer.