VOGONS


Reply 40 of 55, by agent_x007

Rank: Oldbie

@Ozzuneoj "Standard" setting mode only gives me 4 options (on FX card) :
AA mode (Off/2x/2xQ/4x/[8xS])
AF mode (Off/x2/x4/x8)
Image settings (High Performance/Performance/Quality/High Quality)
Vsync (Off/On)
By default they are all "application controlled", aside from Image settings, which is set to the "Quality" option.
^This is also how I tested both cards.

The Advanced options menu has:
AA mode (Off/2x/2xQ/4x/[8xS])
AF mode (Off/x2/x4/x8)
Image settings (High Performance/Performance/Quality/High Quality)
Color profile : (N/A) [greyed out in my case]
Vsync (Off/On)
Force mipmaps (None/Bilinear/Trilinear)
Conformant texture clamp (Off/On')
Extension limit (Off'/On)
Trilinear optimization (Off/On')
Anisotropic map filter optimization (Off/On')
Anisotropic sample optimization (Off'/On)
Negative LOD Bias (Clamp/Allow')
Hardware acceleration (Single monitor)

Last edited by agent_x007 on 2026-02-06, 20:01. Edited 13 times in total.

Reply 41 of 55, by shevalier

Rank: Oldbie
predator_085 wrote on 2026-02-06, 18:24:

The Ultra series seems to be a neat topic, but they are too expensive for me

If you look at the comparison between Radeon and FX, the Ultra series is very similar to regular cards that were supposed to compete with ATI graphics cards.
At that time, we all had stripped-down FX graphics cards (not Ultra), which gave the impression that the FX series was useless.

Aopen MX3S, PIII-S Tualatin 1133, Radeon 9800Pro@XT BIOS, Audigy 4 SB0610
JetWay K8T8AS, Athlon DH-E6 3000+, Radeon HD2600Pro AGP, Audigy 2 Value SB0400
Gigabyte Ga-k8n51gmf, Turion64 ML-30@2.2GHz , Radeon X800GTO PL16, Diamond monster sound MX300

Reply 42 of 55, by agent_x007

Rank: Oldbie
Ozzuneoj wrote on 2026-02-06, 17:53:

Regardless of the setting, I'd be curious to see how the filtering quality compares between the FX 5600 Ultra and a Geforce4 Ti given the huge performance difference. There was so much goofy stuff going on back then with the filtering optimizations... I remember some cards clearly not applying AF at certain angles to save performance. I think that may have been the Radeon 8500 series... can't remember exactly.

So, it seems like maybe the FX and Ti series are the same in high quality mode. Not sure how they compare at different settings.

EDIT: Here is the D3D AF Tester shown in some of the screenshots in that thread...
https://www.3dcenter.org/download/d3d-af-tester
I'll also attach it here for posterity.

I checked; here are the results:

Ti 4800 SE :

The attachment AF tester.PNG is no longer available
The attachment AF tester FC.PNG is no longer available

FX 5600 Ultra :

The attachment AF tester FC.PNG is no longer available
The attachment AF tester.PNG is no longer available

I made two screenshots because I wasn't sure which one would be more visible. I didn't change any advanced settings, so this should represent the image quality at the performance I got.

My GF4 Ti 4800 SE does NOT have these options available to it (they are present for the FX series card) on the driver I used:
1) Trilinear optimization (Off/On)
2) Anisotropic map filter optimization (Off/On)
3) Anisotropic sample optimization (Off/On)
4) Negative LOD Bias (Clamp/Allow)
^I highlighted the settings that are shown after ticking the "Advanced" option box on the FX 5600 Ultra.

Reply 43 of 55, by Ozzuneoj

Rank: l33t
agent_x007 wrote on 2026-02-06, 19:47:
I checked; here are the results: […]

Wow, that was quick!

Thank you for the screenshots. It definitely looks like the Geforce4 has more blending between the stages, with the FX having harder lines. This may be due to some of the optimizations that are selected, or it may just be inherent to the way the FX series does AF. I have no idea if this would be visible in actual games though.

This could at least partially explain how the FX series is able to run so much faster at higher AF levels despite the lower texel fill rate.

Now for some blitting from the back buffer.

Reply 44 of 55, by agent_x007

Rank: Oldbie

After disabling the AF options which only the FX uses, it looks like this:

The attachment AF tester opts OFF.PNG is no longer available
The attachment AF tester opts OFF FC.PNG is no longer available

^Tweaked settings :
1) Trilinear optimization (Off/On)
2) Anisotropic map filter optimization (Off/On)
3) Anisotropic sample optimization (Off/On)
4) Negative LOD Bias (Clamp/Allow) (<= this one is to match GF4)

Reply 45 of 55, by Ozzuneoj

Rank: l33t
agent_x007 wrote on 2026-02-06, 21:36:
After disabling the AF options which only the FX uses, it looks like this: […]

Oh, nice!

It's interesting that the top image (with the red background) shows a very different pattern from the Geforce 4 Ti regardless of the optimizations, but the lower picture looks basically the same between the two once the optimizations are turned off on the FX.

This raises the question, of course... how much does performance change with the optimizations off? Those settings seem to give the closest match in image quality between the FX and the Geforce 4 (based on the test images with the black and white background).

Of course, if there is no noticeable difference in image quality during gameplay with those optimizations enabled or disabled, then the benchmark numbers with optimizations on are a perfectly valid comparison as well. 🙂

Now for some blitting from the back buffer.

Reply 46 of 55, by agent_x007

Rank: Oldbie

Well, 3DMark 01SE doesn't give a... about optimization (scores are the same).

The attachment 3DMark 01SE AF x8 no optims.PNG is no longer available

11620 (with opts) vs. 11701 (without opts).

Codecreatures, though:

The attachment Codecreatures AF x8 no optim.PNG is no longer available

2665 (with opts) vs. 2067 (without opts), which is 78% of the optimized performance 😁

As always, everything depends on the engine/program...

Reply 47 of 55, by Ozzuneoj

Rank: l33t
agent_x007 wrote on 2026-02-06, 22:52:
Well, 3DMark 01SE doesn't give a... about optimization (scores are the same). […]

More interesting results! Thank you!

The Codecreatures benchmark seems to make sense to me. The 5600 Ultra has a much higher pixel fill rate (since your GF4 is currently clocked at 250 MHz core) and it has quite a bit higher memory bandwidth, but the lower texel fill rate would bring the anisotropic filtering performance back down... so they aren't too far apart at that point.

There can definitely be a lot of variation between games\programs as to how they respond to various graphical settings and tweaks.

Still... for 3DMark to show no performance drop at all when the optimizations are disabled is pretty suspicious. To be honest, I would not be surprised if in this case the driver is optimizing specifically for 3DMark. It might be worth checking whether the driver has an existing profile for 3DMark 2001 SE that turns those filtering optimizations on. Even if it doesn't have a profile, there was so much garbage going on at the time, with companies trying to get an edge in 3DMark to "win" the benchmarks, that I wouldn't put it past them to have hard-coded the optimizations when running 3DMark, regardless of the visible settings.

Hard to forget stuff like this with the 43.51 and 44.03 drivers in 2003:
https://www.overclockers.com/forums/threads/f … 4/#post-1797376


What Are The Identified Cheats?

Futuremark’s audit revealed cheats in NVIDIA Detonator FX 44.03 and 43.51 WHQL drivers. Earlier GeForceFX drivers include only some of the cheats listed below.

1. The loading screen of the 3DMark03 test is detected by the driver. This is used by the driver to disregard the back buffer clear command that 3DMark03 gives. This incorrectly reduces the workload. However, if the loading screen is rendered in a different manner, the driver seems to fail to detect 3DMark03, and performs the back buffer clear command as instructed.

2. A vertex shader used in game test 2 (P_Pointsprite.vsh) is detected by the driver. In this case the driver uses instructions contained in the driver to determine when to obey the back buffer clear command and when not to. If the back buffer would not be cleared at all in game test 2, the stars in the view of outer space in some cameras would appear smeared as have been reported in the articles mentioned earlier. Back buffer clearing is turned off and on again so that the back buffer is cleared only when the default benchmark cameras show outer space. In free camera mode one can keep the camera outside the spaceship through the entire test, and see how the sky smearing is turned on and off.

3. A vertex shader used in game test 4 (M_HDRsky.vsh) is detected. In this case the driver adds two static clipping planes to reduce the workload. The clipping planes are placed so that the sky is cut out just beyond what is visible in the default camera angles. Again, using the free camera one can look at the sky to see it abruptly cut off. Screenshot of this view was also reported in the ExtremeTech and Beyond3D articles. This cheat was introduced in the 43.51 drivers as far as we know.

4. In game test 4, the water pixel shader (M_Water.psh) is detected. The driver uses this detection to artificially achieve a large performance boost - more than doubling the early frame rate on some systems. In our inspection we noticed a difference in the rendering when compared either to the DirectX reference rasterizer or to those of other hardware. It appears the water shader is being totally discarded and replaced with an alternative more efficient shader implemented in the drivers themselves. The drivers produce a similar looking rendering, but not an identical one.

5. In game test 4 there is detection of a pixel shader (m_HDRSky.psh). Again it appears the shader is being totally discarded and replaced with an alternative more efficient shader in a similar fashion to the water pixel shader above. The rendering looks similar, but it is not identical.

6. A vertex shader (G_MetalCubeLit.vsh) is detected in game test 1. Preventing this detection proved to reduce the frame rate with these drivers, but we have not yet determined the cause.

7. A vertex shader in game test 3 (G_PaintBaked.vsh) is detected, and preventing this detection drops the scores with these drivers. This cheat causes the back buffer clearing to be disregarded; we are not yet aware of any other cheats.

8. The vertex and pixel shaders used in the 3DMark03 feature tests are also detected by the driver. When we prevented this detection, the performance dropped by more than a factor of two in the 2.0 pixel shader test.
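Reading points 2 through 5, the mechanism is basically shader fingerprinting: the driver recognizes a specific shader the benchmark submits and silently swaps in a cheaper hand-written replacement. Just to make that concrete, here is a minimal illustrative sketch in Python - this is not actual driver code, and every name, hash and replacement string in it is made up.

```python
# Purely illustrative sketch of the "detect a known shader, substitute a cheaper one"
# trick described in the Futuremark audit above. This is NOT driver code; the table
# contents, hashes and replacement shader text are invented for the example.
import hashlib

# Hypothetical lookup table: fingerprint of an application's shader -> replacement.
KNOWN_SHADERS = {
    "d3adbeef0000000000000000000000000000beef": "ps_1_1  ; hand-tuned approximation of the original shader",
}

def fingerprint(shader_source: str) -> str:
    """Identify a shader by hashing its source text (a real driver might match bytecode instead)."""
    return hashlib.sha1(shader_source.encode("utf-8")).hexdigest()

def compile_shader(source: str) -> str:
    """Stand-in for the real shader compiler; just tags the text it was given."""
    return f"<compiled: {source[:40]}>"

def create_pixel_shader(shader_source: str) -> str:
    """Stand-in for the driver's shader-creation path."""
    replacement = KNOWN_SHADERS.get(fingerprint(shader_source))
    if replacement is not None:
        # The application asked for shader A but silently gets cheaper shader B.
        return compile_shader(replacement)
    # Unknown shader: compile exactly what the application asked for.
    return compile_shader(shader_source)
```

This also shows why Futuremark's countermeasure worked: change the shader text (or the loading screen the driver keys off), and the fingerprint no longer matches, the lookup misses, and the driver falls back to rendering honestly - with the corresponding drop in score.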

Now for some blitting from the back buffer.

Reply 48 of 55, by predator_085

Rank: Member
shevalier wrote on 2026-02-06, 18:37:
predator_085 wrote on 2026-02-06, 18:24:

The Ultra series seems to be a neat topic, but they are too expensive for me

If you look at the comparison between Radeon and FX, the Ultra series is very similar to regular cards that were supposed to compete with ATI graphics cards.
At that time, we all had stripped-down FX graphics cards (not Ultra), which gave the impression that the FX series was useless.

Thanks for the info. Yes, the history of the FX series is really interesting, especially the difference between the Ultra and non-Ultra FX cards.

I might be interested in getting a high-end GeForce FX series card or an Ultra card in case one pops up as a bargain, but for the time being the GeForce 4 4200 should fit the category of "very good Win98SE graphics card".

I am somewhat tempted to look for the absolute best Win98SE gaming graphics card, but there is no need, since the GF4 already fits that category very well.

Reply 49 of 55, by Ozzuneoj

Rank: l33t
predator_085 wrote on Yesterday, 16:35:
Thanks for the info. Yes, the history of the FX series is really interesting, especially the difference between the Ultra and non-Ultra […]

Just for the sake of clarity, there isn't a huge technical difference between the FX series Ultra and non-Ultra cards. It is mainly that the Ultras were equipped with much higher-speed memory, and generally had higher core clocks as well. If you look at the cards you'll see that the Ultras also tend to look much more robust, with larger PCBs and more components - probably to handle the higher power draw of the higher clock speeds.

If you look at the FX series portion of the nvidia GPU wiki, you'll see the differences between them:
https://en.wikipedia.org/wiki/List_of_N ... xx)_series
Notice that the FX 5600 and 5600 Ultra both use an NV31 chip and the core clocks aren't that much different, but the memory speed is much higher on the Ultra. The 2nd revision has even higher clocks.

Also, it is fairly common to find non-Ultra cards with lower clock speeds than the specs listed online (or, in the case of the FX 5200/5500, even 64-bit memory instead of 128-bit). They could reduce memory or core clocks to save costs and still call it the same card. On the other hand, I don't think it was common for Ultras to end up with slower memory or core clocks, so that also makes them a more reliably decent card.
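To put rough numbers on the memory-speed side of that: peak bandwidth is just bus width / 8 × effective memory clock. Here is a quick back-of-envelope sketch; the 5600 Ultra "v1" figure of 700 MHz effective comes from agent_x007's post further down, while the non-Ultra 5600 and 64-bit FX 5200 clocks are the commonly listed reference specs, so treat them as approximate.

```python
# Back-of-envelope peak memory bandwidth: bus_width_bits / 8 * effective_clock.
# Clocks below are approximate reference specs; as noted above, many boards shipped slower.
def peak_bandwidth_gb_s(bus_width_bits: int, effective_mem_clock_mhz: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_mem_clock_mhz / 1000.0

cards = [
    ("FX 5600 (128-bit, ~550 MHz effective)",          128, 550),
    ("FX 5600 Ultra v1 (128-bit, 700 MHz effective)",  128, 700),
    ("FX 5200 64-bit variant (~400 MHz effective)",     64, 400),
]

for name, bus, clock in cards:
    print(f"{name}: {peak_bandwidth_gb_s(bus, clock):.1f} GB/s")
# Roughly 8.8, 11.2 and 3.2 GB/s respectively, which is why the 64-bit budget
# boards and the slower non-Ultra variants fall so far behind in bandwidth-heavy tests.
```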

Last edited by Ozzuneoj on 2026-02-09, 19:03. Edited 1 time in total.

Now for some blitting from the back buffer.

Reply 50 of 55, by bartonxp

Rank: Newbie
predator_085 wrote on Yesterday, 16:35:

I am somewhat tempted to look for the absolute best Win98SE gaming graphics card, but there is no need, since the GF4 already fits that category very well.

I think the best Win98 cards were the ATI X850XT Platinum and the GeForce 6800 Ultra. These both had official driver support for Win98, and if I recall correctly these were the last of the super cards to have support.

Reply 51 of 55, by predator_085

Rank: Member
bartonxp wrote on Yesterday, 18:52:
predator_085 wrote on Yesterday, 16:35:

I am somewhat tempted to look for the absolute best Win98SE gaming graphics card, but there is no need, since the GF4 already fits that category very well.

I think the best Win98 cards were the ATI X850XT Platinum and the GeForce 6800 Ultra. These both had official driver support for Win98, and if I recall correctly these were the last of the super cards to have support.

Thanks for the info. That's neat. I never thought Win98SE was supported for that long. The ATI X850XT and GeForce 6800 Ultra are great cards, but too much for my Tualatin Celeron 1.3 GHz.

I feel that with my Celeron I should not go beyond a GF5 or the Radeon 9600 series.

And even with these cards, my CPU would already be the bottleneck.

This is why the GeForce 4 or GeForce 5 feels like the sweet spot. They are already very powerful for Windows XP, but they allow you to turn on all the eye candy at the expense of frame rate.

OK, got it. Thanks for the clarification.

Ozzuneoj wrote on Yesterday, 18:47:
Just for the sake of clarity, there isn't a huge technical difference between the FX series Ultra and non-Ultra cards. It is mai […]

Reply 52 of 55, by agent_x007

Rank: Oldbie
bartonxp wrote on Yesterday, 18:52:

I think the best Win98 cards were the ATI X850XT Platinum and the GeForce 6800 Ultra. These both had official driver support for Win98, and if I recall correctly these were the last of the super cards to have support.

There is no "best" GPU for Win98, just compromises between compatibility and performance.
No one should answer what GPU is best if he doesn't know games the one asking will be playing.

GF5 and earlier have all legacy functions build-in (like Table Fog, Palleted textures), while 9600 and later GeForce cards do not.
Depending then if you want to play 98 games that require/utilize those extra features - you should either pick older GF card (4 Ti/FX) if you want highest image quality/compatibility OR Radeon 9600+/GF6 if you want highest performance (for max. resolution/AA(AF) eye candy without extra legacy features).

Another thing to note is driver support, which is kind of crucial for really old games/DOS and in which GF6 and Radeon 9600 cards simply lack any kind of leeway (only late 98 drivers support them) - which can be problematic if any graphical glitches occur (due to driver bugs).

PS. To give you an idea of how big a gap can exist between Ultra and non-Ultra cards:
FX 5200 (full, 128-bit version) :

The attachment 3DMark 01SE.PNG is no longer available

FX 5200 Ultra :

The attachment 3DMark 01SE.PNG is no longer available

FX 5600 (slow version, but still 128-bit bus) :

The attachment 3DMark 01SE.PNG is no longer available

FX 5600 Ultra (350/700 a.k.a. "v1") :

The attachment 3DMark 01SE.PNG is no longer available

Reply 53 of 55, by douglar

Rank: l33t
agent_x007 wrote on Yesterday, 20:23:
There is no "best" GPU for Win98, just compromises between compatibility and performance. No one should answer what GPU is best […]

Seems like the 3DMark 2001 scores for the FX series correlate pretty closely with the memory bandwidth.

Reply 54 of 55, by agent_x007

Rank: Oldbie

Math says that, if everything else is the same, the clock ratios are going to determine how big the difference is.
It only gets skewed by how much the core is bandwidth-starved on slower memory (if the core is weak, performance may scale better with core clock than with VRAM clock).

Example:
1) GPU +10% and VRAM +10% = end performance +10%
2) GPU +5% and VRAM +10% = end performance +x% (where x is a value between 5 and 10, depending on how bandwidth-starved the GPU itself is during the test).
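For anyone who wants to play with that reasoning, here is a tiny toy model of it - my own simplification, not agent_x007's formula: scale the core-limited and memory-limited parts separately and weight them by how bandwidth-bound the workload is. The weight is entirely illustrative; it just expresses the "value between 5 and 10" point numerically.

```python
# Toy model of the clock-scaling argument above: the overall gain from a core
# overclock and a VRAM overclock lands between the two individual gains,
# weighted by how bandwidth-bound the workload is. The weight is illustrative only.
def estimated_gain(core_gain: float, vram_gain: float, bandwidth_bound: float) -> float:
    """Estimated overall performance gain (0.10 == +10%).

    bandwidth_bound = 0.0 -> purely core-limited, 1.0 -> purely memory-limited.
    """
    return (1.0 - bandwidth_bound) * core_gain + bandwidth_bound * vram_gain

# Case 1: +10% core and +10% VRAM -> +10% regardless of where the bottleneck sits.
print(estimated_gain(0.10, 0.10, bandwidth_bound=0.3))   # 0.10

# Case 2: +5% core and +10% VRAM -> somewhere between +5% and +10%,
# closer to +10% the more bandwidth-starved the GPU is.
for w in (0.0, 0.5, 1.0):
    print(f"bandwidth_bound={w}: +{estimated_gain(0.05, 0.10, w):.1%}")
# bandwidth_bound=0.0 -> +5.0%, 0.5 -> +7.5%, 1.0 -> +10.0%
```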

Reply 55 of 55, by bartonxp

Rank: Newbie
agent_x007 wrote on Yesterday, 21:47:
Math says that, if everything else is the same, the clock ratios are going to determine how big the difference is. It only gets skewed […]

I think the best Win98 cards were the ATI X850XT Platinum and the GeForce 6800 Ultra.

And it's funny you call this math. What a waste of time.