VOGONS


Reply 20 of 47, by The Serpent Rider

User metadata
Rank l33t++

They were never positioned as a GeForce 4 Ti replacement; just look at the pricing. Also, the MX 4000 is a cost-reduced "refresh" of the GeForce 4 MX with a 64-bit memory bus and the crappiest memory possible to install, so it is in no way capable of beating any FX 5200 with a 128-bit memory bus.

The 5200 & 5500 cards are the reason that the GeForce4 MX 4000 was still manufactured until 2006.

No, that's not the case. The GeForce 4 MX 4000 had a lower price than any GeForce FX card. I suspect there was demand for very cheap video cards in Asia that were capable of decoding DVDs.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 21 of 47, by predator_085

User metadata
Rank Member
Ozzuneoj wrote on 2026-02-04, 13:54:
Yeah, for your system a higher end FX series would not really make sense, especially with the high prices they go for these days […]
Show full quote
predator_085 wrote on 2026-02-04, 06:56:
Thanks for your reply. This is interesting. The p4 is way more powerful than tualatin cerleron and the fx 5500 is still scorin […]
Show full quote
RetroPCCupboard wrote on 2026-02-03, 20:00:

In my tests (on a 3GHz Pentium 4), even a GeForce 2 Ti scored higher than the FX 5500 in 3DMark 2000. The GF4 MX 460 scored even higher than that.

Thanks for your reply. This is interesting. The P4 is way more powerful than a Tualatin Celeron and the FX 5500 is still scoring lower than an older card.

@all I have done some further research about the Ultra card and the higher FX series, and while I consider them very interesting, I have to admit they are over the budget I want to spend.

I will keep the gf4 as long as it works.

Would it do any good to max out the possible RAM for Win98SE? As of now I am using 256MB of PC133 RAM.

Yeah, for your system a higher end FX series would not really make sense, especially with the high prices they go for these days. That's a good choice. 😀

256MB of RAM is plenty for Windows 98SE, generally, but it really depends what games you are running on it. If you are using the system to run games from 2001-2003 then upgrading to 512MB would definitely be recommended, though realistically at that point the CPU and memory bandwidth will start to be a limiting factor in heavier games. If you're only playing games from 2000 and earlier then going to more than 256MB is unlikely to make much of a difference... but you could just get the memory anyway since it is a fairly cheap upgrade. Just don't bother going over 512MB, since Windows 98SE tends to have issues at that point.

EDIT: Yeah, what douglar said. 😀

Thanks for the reply and the advice. The main timeframe of use for my system is from 1997 to early 2001. From late 2001 onwards I want to gift myself a pure WinXP system, either a fast Athlon XP or a Pentium 4 system. For my Win98SE gaming on the Asus TUSL, 256MB has served me well. I have no real reason to upgrade to 512MB; I was just curious whether it might be worth considering.

douglar wrote:

predator_085 wrote on 2026-02-04, 06:56:

Would it do any good to max out the possible RAM for Win98SE? As of now I am using 256MB of PC133 RAM.

No, that is unlikely to help anything in Windows 98SE unless you have a specific workload that 1) requires more than 256MB of RAM and 2) runs on Windows 98.

If you do add more RAM, it's probably best not to go any higher than 512MB. There are reports of stability issues in some situations if you go over 512MB, and Safe Mode might stop working.

Thanks for the advice as well. Yes, I can remember reading quite often that 512MB is the maximum amount of RAM suitable for Win98SE. Like I mentioned above, I am not dead set on the RAM upgrade; I was just curious whether it would be worth considering if I want to max out my mainboard.

Last edited by predator_085 on 2026-02-04, 18:28. Edited 1 time in total.

Reply 22 of 47, by Ozzuneoj

User metadata
Rank l33t
predator_085 wrote on 2026-02-04, 18:03:

Thanks for the reply and the advice. The main timeframe of use for my system is from 1997 to early 2001. From late 2001 onwards I want to gift myself a pure WinXP system, either a fast Athlon XP or a Pentium 4 system. For my Win98SE gaming on the Asus TUSL, 256MB has served me well. I have no real reason to upgrade to 512MB; I was just curious whether it might be worth considering.

If you're making a dedicated Windows XP system, I would go for something a bit faster if possible. A fast Athlon XP would be a big improvement over your Celeron 1.3 + SDRAM, but in XP's 13 year lifespan, these would still represent the lower end of XP-compatible systems. An Athlon 64 or Athlon 64 X2 would offer a bit more performance if you want to stick with AGP cards. Plan on spending a bunch of money to get an AGP GPU that is substantially faster than your Ti 4200 though. High end AGP cards were really only made for 3-4 more years after the Geforce4 Ti series aside from a few outliers, and they are all going to be quite pricey while also at times having some compatibility issues due to using PCI-E to AGP bridge chips.

If you're okay going to a PCI-Express GPU then the sky is the limit... you could get a fairly boring but massively powerful XP system by just picking up a dirt cheap Dell or Lenovo workstation equipped with a 2nd or 3rd Gen Intel Core i5 or i7 and tossing in whatever video card you can get your hands on that fits in the system and has XP drivers. Or, if you want to sacrifice some CPU power for a bit more "retro" aesthetic you could put together a fancy Core 2 Duo\Quad system. Either will be orders of magnitude faster than an Athlon XP or Pentium 4 and game compatibility shouldn't be much different since you already have a system to cover pre-2001 games.

There is something to be said for running every XP game at the absolute maximum settings, completely smoothly and doing it with less noise, heat and power consumption than systems a fraction of the speed. 😀

Now for some blitting from the back buffer.

Reply 23 of 47, by douglar

User metadata
Rank l33t
The Serpent Rider wrote on 2026-02-04, 16:16:

They were never positioned as a GeForce 4 Ti replacement; just look at the pricing. Also, the MX 4000 is a cost-reduced "refresh" of the GeForce 4 MX with a 64-bit memory bus and the crappiest memory possible to install, so it is in no way capable of beating any FX 5200 with a 128-bit memory bus.

The 5200 & 5500 cards are the reason that the GeForce4 MX 4000 was still manufactured until 2006.

No, that's not the case. The GeForce 4 MX 4000 had a lower price than any GeForce FX card. I suspect there was demand for very cheap video cards in Asia that were capable of decoding DVDs.

The 4200 was introduced at the next higher price point when it came out, yes. Still, the 4200 vs 5200 naming convention carries certain unspoken connotations and begs for comparison.

I should have stated what clock speed I was using when I compared the MX4000 with the FX5200, and I should have mentioned that I was using the more commonly available 64-bit FX5200. Sorry about the confusion.

The attachment By System.jpg is no longer available

The FX5200 does start to pull ahead when the clock speed goes over 800MHz.

Edit - After looking at those numbers, I gotta think that my PNY card might actually have a 128-bit bus or something. I'll retest at some point. The 3DMark 2000 scores look higher than expected.

Reply 24 of 47, by RandomStranger

User metadata
Rank Oldbie

Usually the FX5600 series cards are much cheaper than the Ti4200, so if you were looking for one without having the other, the FX5600 would be the more economical choice. But if you already have one, there aren't many reasons to go for the other, and the reasons you do have are mostly edge cases.

Going higher, the FX series picks up in price fast, and anything above the 5700 costs a premium. The 5700 itself is a good graphics card for Windows 98, but the FX series cards came in two batches (?) and the 5700 is from the second one. It demands more recent drivers on Windows 98 and has a lot more issues because of it.


Reply 25 of 47, by bartonxp

User metadata
Rank Newbie

Without adding more witchery to the discussion, it's my opinion that if you were to pursue an upgrade, then an FX 5950, 5900 or 5800 would be the only cards worthy of the change. The 5600U would be good too, but I fear it wouldn't be enough of an improvement to satisfy you long term. The FX is considered to be one of the best series for Win98 gaming and prior, albeit heavily opinion driven, so only with one of the aforementioned high-end cards would you be led to happiness in the long term; otherwise, the 4200 is good enough, IMHO.

Reply 26 of 47, by predator_085

User metadata
Rank Member
Ozzuneoj wrote on 2026-02-04, 18:17:
If you're making a dedicated Windows XP system, I would go for something a bit faster if possible. A fast Athlon XP would be a b […]
Show full quote
predator_085 wrote on 2026-02-04, 18:03:

Thanks for the reply and the advice. The main timeframe of use for my system is from 1997 to early 2001. From late 2001 onwards I want to gift myself a pure WinXP system, either a fast Athlon XP or a Pentium 4 system. For my Win98SE gaming on the Asus TUSL, 256MB has served me well. I have no real reason to upgrade to 512MB; I was just curious whether it might be worth considering.

If you're making a dedicated Windows XP system, I would go for something a bit faster if possible. A fast Athlon XP would be a big improvement over your Celeron 1.3 + SDRAM, but in XP's 13 year lifespan, these would still represent the lower end of XP-compatible systems. An Athlon 64 or Athlon 64 X2 would offer a bit more performance if you want to stick with AGP cards. Plan on spending a bunch of money to get an AGP GPU that is substantially faster than your Ti 4200 though. High end AGP cards were really only made for 3-4 more years after the Geforce4 Ti series aside from a few outliers, and they are all going to be quite pricey while also at times having some compatibility issues due to using PCI-E to AGP bridge chips.

If you're okay going to a PCI-Express GPU then the sky is the limit... you could get a fairly boring but massively powerful XP system by just picking up a dirt cheap Dell or Lenovo workstation equipped with a 2nd or 3rd Gen Intel Core i5 or i7 and tossing in whatever video card you can get your hands on that fits in the system and has XP drivers. Or, if you want to sacrifice some CPU power for a bit more "retro" aesthetic you could put together a fancy Core 2 Duo\Quad system. Either will be orders of magnitude faster than an Athlon XP or Pentium 4 and game compatibility shouldn't be much different since you already have a system to cover pre-2001 games.

There is something to be said for running every XP game at the absolute maximum settings, completely smoothly and doing it with less noise, heat and power consumption than systems a fraction of the speed. 😀

Thanks for the advice. Yeah, there are many great possibilities for a great XP machine. I can go with the motto "the faster the better", like a fast Core 2 Duo or even better, since for anything pre-2001 I have my Socket 370 system.

bartonxp wrote on 2026-02-04, 20:43:

Without adding more witchery to the discussion, it's my opinion that if you were to pursue an upgrade, then an FX 5950, 5900 or 5800 would be the only cards worthy of the change. The 5600U would be good too, but I fear it wouldn't be enough of an improvement to satisfy you long term. The FX is considered to be one of the best series for Win98 gaming and prior, albeit heavily opinion driven, so only with one of the aforementioned high-end cards would you be led to happiness in the long term; otherwise, the 4200 is good enough, IMHO.

Yeah, this makes sense, but like I said the really good FX cards are beyond my budget. Without the prospect of any noticeable difference I would rather stick with the Ti 4200, which is a great card; I cannot say anything bad about it.

Reply 27 of 47, by Garrett W

User metadata
Rank Oldbie

Actually, FX cards have a few tricks up their sleeve. FX 5600 can outperform the Ti 4200 when using Anisotropic Filtering for example, as the FX series takes a much lower hit in performance when using AA or AF and the performance is often very acceptable.

Reply 28 of 47, by Ydee

User metadata
Rank Oldbie
douglar wrote on 2026-02-04, 16:07:
They were positioned as a successor to the Geforce 4200 and promised new features, but in practice they were: […]
Show full quote
Ydee wrote on 2026-02-04, 15:50:
douglar wrote on 2026-02-04, 15:37:

The 5200 & 5500 cards are the reason that the GeForce4 MX 4000 was still manufactured until 2006.

What made these GPUs (FX 5200/5500) such a desperate mistake?

They were positioned as a successor to the Geforce 4200 and promised new features, but in practice they were:

  1. Too slow to run most games that used DirectX 9 features in DirectX 9 mode, so the new features were inaccessible
  2. Significantly slower than the GeForce 4200 in DirectX 8 games
  3. Frequently slower than the GeForce 4 MX 4000 in DirectX 7 games

However, for retro builds with AGP x2 systems today, the FX5200 cards often offer a good price/performance ratio if you just want DirectX 7 and maybe a little DirectX 8.

There is also the plus that the 5200 series still supported 2D GUI acceleration, Win98, and native table fog, had an MPEG-2 hardware decoder, and was pretty durable.

Thank you for the answer. I know that for DX9 the FX 5200-5500 is a lost case, but for older games I assumed it would be roughly on par with the GF4 Ti 4200, if not slightly better.
I didn't know that NV34 only got half the TMUs compared to NV25 (GF4 Ti).

From this it seems to me that the FX 5200-5500 was positioned as the successor to the GF4 MX series, although the 5200/5500 numbering invites comparison with the Ti 4200 series.

Reply 29 of 47, by agent_x007

User metadata
Rank Oldbie
Garrett W wrote on 2026-02-05, 08:39:

Actually, FX cards have a few tricks up their sleeve. FX 5600 can outperform the Ti 4200 when using Anisotropic Filtering for example, as the FX series takes a much lower hit in performance when using AA or AF and the performance is often very acceptable.

Let's test this 😁

Anisotropic Filtering (@ x8) :
3DMark 01SE (AF forced through driver) :
"4200 Ti 128MB" (emulated from 4800 SE - clocks : 250/550) :

The attachment 3DMark 01SE AF x8.PNG is no longer available

5600 Ultra v2 (clocks : 400/800) :

The attachment 3DMark 01SE AF x8.PNG is no longer available

Summary : 6535 (GF4 Ti) vs. 11620 (FX 5600U v2)
For default 3DMark 01SE : the 5600U v2 scores ~14.6k points, while the emulated 4200 Ti gets ~15.2k.
The FX 5600U drops to ~0.79 times its default performance (i.e. it retains ~79% of its default-settings speed),
while the Ti 4200 128MB (emulated) drops to ~0.43 times its stock performance (it retains ~43%).
In practice : a 30-60% higher-clocked card (depending on whether you compare the GPU core or the VRAM) is 77% faster.
So a ~20% lower-clocked 5600 non-U should still be around 50% faster with AF x8 than the 4200... assuming the 5600 can actually still get playable framerates with AF 8x (80% of no-AF performance is still a drop).

5600 non-U might get 20-30% higher performance after enabling AF 8x mode vs. 4200.

Let's try another DX8 benchmark too :
Codecreatures AF x8 (forced through driver) :
"4200 Ti 128MB" (emulated from 4800 SE - clocks : 250/550) :

The attachment Codecreatures AF x8.PNG is no longer available

5600 Ultra v2 (clocks : 400/800) :

The attachment Codecreatures AF x8.PNG is no longer available

Summary : 1935 (GF4 Ti) vs. 2665 (FX 5600U v2)
Default scores : the 5600U v2 gets ~4169 points, while the emulated 4200 Ti 128MB scores 4238 points.
5600U v2 = 0.64 of its default performance (retains 64%); Ti 4200 128MB = 0.46 of its default performance (retains 46%).
So in practice a regular 5600 (if clocked 20% lower than the Ultra version : 320/640) is only going to get ~5% better performance than the Ti 4200 under AF x8 (this difference shrinks further if the VRAM is clocked more than 20% below the Ultra on a 5600 non-U).
The 550 memory clock on my 4200 gives it a bit of an edge over a default 4200 - however, if you already own a 4200, a 10% OC on both VRAM and core shouldn't be a big issue (which will make such a card a bit faster than what I showed here).

Is NV31 more effective in AF ?
Yes.
Does it make non-U 5600 cards worth it vs. the 4200 ?
Only if the no-AF performance of the 5600 non-U is no less than ~70% of a standard Ti 4200 (note : you'd be forced to run AF all the time, since otherwise the 4200 is the better choice - it offers 5600U-like performance after all).
If the 4200 is over 50% faster without AF than a non-U 5600 series card,
it's safe to assume the 4200 will give you similar performance after enabling AF x8 mode.
Warning : ^ALL of the above depends on use case/game engine
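
As a sanity check on the arithmetic above: the retention figures are just the ratio of the AF score to the default score. A quick sketch (the `retention` helper is mine; all scores are copied from this post):

```python
# Fraction of default-settings performance retained with AF x8 enabled.
def retention(af_score: float, default_score: float) -> float:
    return af_score / default_score

# 3DMark 01SE, AF x8 (scores quoted above)
ti_3dmark = retention(6535, 15200)    # ~0.43 -> retains ~43%
fx_3dmark = retention(11620, 14600)   # ~0.80 -> retains ~79-80%
gap_3dmark = 11620 / 6535 - 1         # ~0.78 -> 5600U is ~77-78% faster under AF

# Codecreatures, AF x8
ti_cc = retention(1935, 4238)         # ~0.46
fx_cc = retention(2665, 4169)         # ~0.64
```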

Last edited by agent_x007 on 2026-02-06, 00:34. Edited 3 times in total.

Reply 30 of 47, by agent_x007

User metadata
Rank Oldbie

Also, the same thing for AA (this time set through the application setting) :
FX 5600U v2 400/800 :
3DMark 01 SE :

The attachment 3DMark 01SE AA x4.PNG is no longer available

Codecreatures :

The attachment Codecreatures AA x4.PNG is no longer available

GF4 Ti 4200 128MB (emulated 250/550) :
3DMark 01 SE :

The attachment 3DMark 01SE AA x4.PNG is no longer available

Codecreatures :

The attachment Codecreatures AA x4.PNG is no longer available

Performance retained vs. no AA:

5600U v2 (400/800) :
3DMark 01 SE : 0.57 (57%)
Codecreatures : 0.73 (73%)

4200 Ti 128MB (emulated 250/550) :
3DMark 01 SE : 0.4 (~40%)
Codecreatures : 0.66 (~66%)

Since both programs get similar scaling (17% and 7% between different generations), let's be generous and call NV31 20% more efficient in pure AA (since I can't know how games behave).

Based on the above results, a 5600 non-U clocked lower than 80% of the Ultra I tested (i.e. slower than 320/640 on either the GPU or the VRAM clock) WILL be slower than the 4200 I tested under the 4x AA setting.
As far as I know, a 640 effective VRAM clock is pretty high for non-Ultra FX 5600 cards, so I think it's pretty safe to assume there are no 5600s that can (without an OC) be faster than a regular 4200 under max AA mode.

Note : 640 effective VRAM clock requires at least 3ns speed bin of memory chips used on 5600 non-U card (or the same class of VRAM as GF4 Ti 4600/4800 non-SE cards).
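
The AA comparison boils down to the same retention arithmetic; here is a quick sketch of the "17% and 7%" generation gap mentioned above (retention figures copied from this post):

```python
# Performance retained with 4x AA vs. no AA, per the figures above.
retained = {
    "3DMark 01SE":   {"FX 5600U v2": 0.57, "Ti 4200": 0.40},
    "Codecreatures": {"FX 5600U v2": 0.73, "Ti 4200": 0.66},
}

# Absolute retention gap per benchmark (the "17% and 7%" in the post).
gaps = {bench: round(r["FX 5600U v2"] - r["Ti 4200"], 2)
        for bench, r in retained.items()}
```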

Last edited by agent_x007 on 2026-02-06, 00:29. Edited 1 time in total.

Reply 31 of 47, by agent_x007

User metadata
Rank Oldbie

To be thorough, here are stock results (no comment) :
Ti 4200 128MB (emulated, 250/550) :
3DMark 01SE (no AA/no AF) :

The attachment 3DMark 01SE.PNG is no longer available

Codecreatures (no AA/no AF) :

The attachment Codecreatures.PNG is no longer available

FX 5600 Ultra v2 (400/800) :
3DMark 01SE (no AA/no AF) :

The attachment 3DMark 01SE.PNG is no longer available

Codecreatures (no AA/no AF) :

The attachment Codecreatures.PNG is no longer available

Reply 32 of 47, by Ozzuneoj

User metadata
Rank l33t
agent_x007 wrote on 2026-02-05, 23:53:
Let's test this :D […]
Show full quote
Garrett W wrote on 2026-02-05, 08:39:

Actually, FX cards have a few tricks up their sleeve. FX 5600 can outperform the Ti 4200 when using Anisotropic Filtering for example, as the FX series takes a much lower hit in performance when using AA or AF and the performance is often very acceptable.

Let's test this 😁

Anisotropic Filtering (@ x8) :
3DMark 01SE (AF forced through driver) :
"4200 Ti 128MB" (emulated from 4800 SE - clocks : 250/550) :

The attachment 3DMark 01SE AF x8.PNG is no longer available

5600 Ultra v2 (clocks : 400/800) :

The attachment 3DMark 01SE AF x8.PNG is no longer available

Summary : 6535 (GF4 Ti) vs. 11620 (FX 5600U v2)
For default 3DMark 01SE : the 5600U v2 scores ~14.6k points, while the emulated 4200 Ti gets ~15.2k.
The FX 5600U drops to ~0.79 times its default performance (i.e. it retains ~79% of its default-settings speed),
while the Ti 4200 128MB (emulated) drops to ~0.43 times its stock performance (it retains ~43%).
In practice : a 30-60% higher-clocked card (depending on whether you compare the GPU core or the VRAM) is 77% faster.
So a ~20% lower-clocked 5600 non-U should still be around 50% faster with AF x8 than the 4200... assuming the 5600 can actually still get playable framerates with AF 8x (80% of no-AF performance is still a drop).

5600 non-U might get 20-30% higher performance after enabling AF 8x mode vs. 4200.

Let's try another DX8 benchmark too :
Codecreatures AF x8 (forced through driver) :
"4200 Ti 128MB" (emulated from 4800 SE - clocks : 250/550) :

The attachment Codecreatures AF x8.PNG is no longer available

5600 Ultra v2 (clocks : 400/800) :

The attachment Codecreatures AF x8.PNG is no longer available

Summary : 1935 (GF4 Ti) vs. 2665 (FX 5600U v2)
Default scores : the 5600U v2 gets ~4169 points, while the emulated 4200 Ti 128MB scores 4238 points.
5600U v2 = 0.64 of its default performance (retains 64%); Ti 4200 128MB = 0.46 of its default performance (retains 46%).
So in practice a regular 5600 (if clocked 20% lower than the Ultra version : 320/640) is only going to get ~5% better performance than the Ti 4200 under AF x8 (this difference shrinks further if the VRAM is clocked more than 20% below the Ultra on a 5600 non-U).
The 550 memory clock on my 4200 gives it a bit of an edge over a default 4200 - however, if you already own a 4200, a 10% OC on both VRAM and core shouldn't be a big issue (which will make such a card a bit faster than what I showed here).

Is NV31 more effective in AF ?
Yes.
Does it make non-U 5600 cards worth it vs. the 4200 ?
Only if the no-AF performance of the 5600 non-U is no less than ~70% of a standard Ti 4200 (note : you'd be forced to run AF all the time, since otherwise the 4200 is the better choice - it offers 5600U-like performance after all).
If the 4200 is over 50% faster without AF than a non-U 5600 series card,
it's safe to assume the 4200 will give you similar performance after enabling AF x8 mode.
Warning : ^ALL of the above depends on use case/game engine

That is some good data to have, thank you for taking the time to test this. It is worth noting, though, that there can be a pretty huge difference from game to game, and synthetic benchmark performance at the time was all over the place due to driver optimizations. It would be interesting to see comparisons between the cards in some games from the period.

I seem to remember Battlefield 1942 being significantly faster on the Ti 4400 compared to a 5600 vanilla, and I'm almost certain I was using anisotropic filtering at that time. I was probably not using anti-aliasing though. I vaguely recall some games that used older engines or used OpenGL being more favorable to the GF4 Ti series as well. This was almost 25 years ago though, so... yeah. I remember being very disappointed in the 5600 when they sent that to me, but the specifics are surely foggy after all these years. 🙂

Lastly, do you know what your anisotropic filtering quality\performance setting was set to? According to this article there can be a dramatic difference in performance between the "balanced" and "aggressive" settings. I don't remember if an anisotropic filtering quality or performance setting existed for the Geforce 4 Ti, and I don't know which levels had similar image quality between the cards.

Now for some blitting from the back buffer.

Reply 33 of 47, by Garrett W

User metadata
Rank Oldbie

nice one agent_x007!

Reply 34 of 47, by agent_x007

User metadata
Rank Oldbie
Ozzuneoj wrote on Yesterday, 01:02:

Lastly, do you know what your anisotropic filtering quality\performance setting was set to? According to this article there can be a dramatic difference in performance between the "balanced" and "aggressive" settings. I don't remember if an anisotropic filtering quality or performance setting existed for the Geforce 4 Ti, and I don't know which levels had similar image quality between the cards.

Well, there is no such setting in the NV control panel of the driver I'm using (71.84).
The only AF-related settings in my driver are these :

The attachment NV31 af lod.PNG is no longer available

However, based on numbers I got - I'm pretty sure this was "balanced" mode.

Reply 35 of 47, by swaaye

User metadata
Rank l33t++

Turn the filtering and sample settings off and on and make us a visual and performance comparison. 😁 They are part of why AF doesn't kill performance on newer cards. There are also trilinear filtering optimizations.

Negative LOD bias is another possible optimization. I think "allow" lets games increase texture detail whereas "clamp" does not.

Last edited by swaaye on 2026-02-06, 17:57. Edited 2 times in total.

Reply 36 of 47, by Ozzuneoj

User metadata
Rank l33t
agent_x007 wrote on Yesterday, 17:13:
Well, there is no such setting in NV control panel in driver I'm using (71.84). Only settings related to AF in my driver is this […]
Show full quote
Ozzuneoj wrote on Yesterday, 01:02:

Lastly, do you know what your anisotropic filtering quality\performance setting was set to? According to this article there can be a dramatic difference in performance between the "balanced" and "aggressive" settings. I don't remember if an anisotropic filtering quality or performance setting existed for the Geforce 4 Ti, and I don't know which levels had similar image quality between the cards.

Well, there is no such setting in the NV control panel of the driver I'm using (71.84).
The only AF-related settings in my driver are these :

The attachment NV31 af lod.PNG is no longer available

However, based on numbers I got - I'm pretty sure this was "balanced" mode.

Okay, thanks for checking. The review I linked to was using the 42.63 beta driver, which is a lot older than the one you're using. I feel like the slider should be there somewhere though since a similar performance\quality slider still exists in modern drivers. This is the screenshot from the link:

The attachment gfx-b_driver2.gif is no longer available

Maybe you have to uncheck the advanced options box to see the slider. It may just control the two anisotropic filtering optimization settings shown in your screenshot... not sure.

Regardless of the setting, I'd be curious to see how the filtering quality compares between the FX 5600 Ultra and a Geforce4 Ti given the huge performance difference. There was so much goofy stuff going on back then with the filtering optimizations... I remember some cards clearly not applying AF at certain angles to save performance. I think that may have been the Radeon 8500 series... can't remember exactly.

Oh... I found a thread about it after I typed that. Heh...
Texture filtering quality comparison

So, it seems like maybe the FX and Ti series are the same in high quality mode. Not sure how they compare at different settings.

EDIT: Here is the D3D AF Tester shown in some of the screenshots in that thread...
https://www.3dcenter.org/download/d3d-af-tester
I'll also attach it here for posterity.

And, to be clear, I'm not posting this here demanding that you take more of your time to do these tests. If you have time and are interested, that would be awesome but absolutely not required or expected 🙂. I will probably do some of my own testing when I get some time because the comparison between the FX 5600 and Ti 4400 in particular was a big deal for me back then, and I believe I have examples of both cards on hand (though not my originals, sadly).

Last edited by Ozzuneoj on 2026-02-06, 18:02. Edited 1 time in total.

Now for some blitting from the back buffer.

Reply 37 of 47, by swaaye

User metadata
Rank l33t++

Yeah, Radeon 7000-9250 have extreme angle dependency with AF. Like run down a corridor and notice the walls are blurry but the floor is sharp. And they also don't support trilinear filtering with AF. Radeon 9500/9700 massively overhauled texturing quality.

I think ATI probably inspired NVidia to aggressively optimize their filtering too.

Reply 38 of 47, by shevalier

User metadata
Rank Oldbie

In my personal experience, the FX series cards that do not run the memory synchronously with the core (core clock = memory clock) are a waste of money.
Those that do run synchronously (all the Ultras) show much more interesting test results than the rest.
In other words, the results of the Ultras cannot be projected onto the other cards in this series.

That is, by the logic of the names, Ultra = the regular implementation, and everything else = SE (cut edition).
Why nVidia chose these particular names for its products at that time is anyone's guess.

Aopen MX3S, PIII-S Tualatin 1133, Radeon 9800Pro@XT BIOS, Audigy 4 SB0610
JetWay K8T8AS, Athlon DH-E6 3000+, Radeon HD2600Pro AGP, Audigy 2 Value SB0400
Gigabyte Ga-k8n51gmf, Turion64 ML-30@2.2GHz , Radeon X800GTO PL16, Diamond monster sound MX300

Reply 39 of 47, by predator_085

User metadata
Rank Member

Thanks a lot for posting the further benchmarks, @agent_x007. Interesting stuff.

The Ultra series seems to be a neat topic, but those cards are too expensive for me.