VOGONS


upgrade from gf4 to gf fx - rational or a waste of money and time?


Reply 80 of 83, by shevalier

User metadata
Rank Oldbie
havli wrote on 2026-02-11, 18:13:
shevalier wrote on 2026-02-11, 11:46:

As I mentioned above, not only have I never owned a 59x0 graphics card, but I have never even held one in my hands.
I showed the results for the 5200, and the 5700 behaves exactly the same way.
Your results may be related both to the difference in memory bus width (256-bit vs. 128-bit) and to the fact that your memory frequencies are high enough to simply be sufficient.

Well, I am looking at your FX 5500 results now... and I still can't see why a 1:1 ratio would be that significant.
All I see here is that the FX 5500 (and possibly the 5600/5700) is significantly bottlenecked by memory frequency. So when you increase memory speed, performance rises too. But whether you run GPU/MEM at exactly 300/300 MHz or at 300/290, there will be very little difference - less than 3% most likely. Or 300/310 for that matter - that should be faster than 300/300 by a small amount. In short: faster memory = better for these low/mid FX cards... but there are no golden ratios, I dare say.

Perhaps.
Or perhaps not.
We need to ask agent_x007 to conduct some experiments; he has an indecent number of FX Ultra graphics cards. 😀
On my ‘office-level’ FX5500, I can come to conclusions that will make everyone cry.

Aopen MX3S, PIII-S Tualatin 1133, Radeon 9800Pro@XT BIOS, Audigy 4 SB0610
JetWay K8T8AS, Athlon DH-E6 3000+, Radeon HD2600Pro AGP, Audigy 2 Value SB0400
Gigabyte Ga-k8n51gmf, Turion64 ML-30@2.2GHz , Radeon X800GTO PL16, Diamond monster sound MX300

Reply 81 of 83, by vintageonthemoon

User metadata
Rank Newbie
predator_085 wrote on 2026-02-12, 07:22:
vintageonthemoon wrote on 2026-02-11, 20:36:
predator_085 wrote on 2026-02-03, 11:18:

All in all I am happy with my Asus TUSL2-C mainboard with a Tualatin Celeron 1.3 GHz CPU running a GeForce 4 4200. I am quite happy with the results. But I am considering maxing out the system by getting a Tualatin Pentium III and maybe a better card.

Which brings me to the question of whether upgrading from GF4 to GF FX would do anything for my mainboard/chip in the Win98SE gaming realm, or whether the extra power is even useful under Win98SE. I could get a GF FX 5500 at a decent price.

It really depends on what you're looking for. For a Win98 build, both GF4 and (later) FX cards are excellent; the only downside of the FX cards is performance - shader and texture throughput is reduced compared to the GF4 Ti 4xxx. The FX 5200 128-bit version is not terrible, it's on par with GeForce 2 GTS performance; the 5500 is just an overclocked 5200, and the 5600 Ultra is closer to basic GeForce 4 Ti 4200 levels. Honestly, it's nicer to have a mid-range card at stock clock speed than a more powerful card that tends to die quicker due to more power draw and extra heat. With 20+ year old cards, age can be a big problem: VRAM chips can get corrupted (bit rot), capacitors age and fail, and fans/heatsinks need to be properly cleaned or replaced. I have already replaced the heatsink/fan on 3 cards and done some recapping.

I also have a P3 Tualatin rig, with a P3 1.26 GHz and 512 MB of SDRAM, that I'm very happy with, and a GeForce 4 Ti 4200 128-bit is perfect for that rig: very good support for older titles and DOS. I use DirectX 8.1 because some programs (like Daemon Tools) don't run very well under DX7, and it runs very smoothly with the ForceWare 31.40 drivers; the 45.23 drivers are a little finicky under Win98.
The only real use for the FX cards, in my opinion, is DX9 support (despite how poorly it was handled) and nGlide/dgVoodoo support: you can play games with Glide support and save yourself a lot of headache and money on expensive 3dfx Voodoo cards.

Also, the ATI Radeon 9xxx cards are excellent and much cheaper compared to the GeForce Ti cards. I have a 9200, a 9500 and a 9600 XT (the 9600 XT is my backup in case my GF4 card fails). Aside from lacking legacy features like table fog and 8-bit paletted textures, they're very good for Win98 and also have DX9 support. Some games actually look and play better on Radeon than on GeForce; the image quality on those Radeon cards is much sharper and cleaner than on the GeForce.

Thanks for your detailed reply and the warning about the potential drawbacks of more powerful cards.

What I want is rather simple: I want to play Win98SE games at high settings in 800x600 or 1024x768.

In that regard the Ti 4200 is good, but I wanted to find out if I could max out my system even more.

My main field of use is DirectX 8. For DX9 games my Tualatin Celeron is not ideal.

I also have no use for dgVoodoo. I have a real Voodoo rig: a Coppermine P3 with a Voodoo 3 2000 AGP.

Well, it is not that bad.

bartonxp wrote on 2026-02-12, 01:27:

I doubt he/she/they are coming back to their thread. It's been VOGONized!!!!

Here I am. But you have a point - the thread moved miles away from the original topic. But never mind, the question was answered on the first page already. As so often in the retro field, the answer is "it depends". The FX cards might have some use, but if I want a super rig for Win98SE I need to move to more modern rigs.

Cool. I do know some versions of the GF4 Ti 4200 are overclock-friendly, like the Gainward and MSI ones, but due to their age I'd rather keep them at stock speed - since most of the Win98 games I play run over 50-60 fps anyway, I don't need crazy speeds. I have a Gainward-branded one. And about the settings and DX8: I 100% agree, I use those settings too.

Reply 82 of 83, by agent_x007

User metadata
Rank Oldbie
shevalier wrote on 2026-02-12, 10:35:

Perhaps.
Or perhaps not.
We need to ask agent_x007 to conduct some experiments; he has an indecent number of FX Ultra graphics cards. 😀
On my ‘office-level’ FX5500, I can come to conclusions that will make everyone cry.

I agree with havli: there is no "extra boost" from just being 1:1.
You simply get extra performance from additional bandwidth (adjusted to the data-starvation level of the GPU, and limited by the performance of your platform).

I'm not sure how I can test this to satisfy your curiosity, though.
Just run the memory at 200 MHz or 125 MHz, and then again at 250 MHz with the same GPU clock.
If the score gain at 250 MHz is bigger in % than the % difference between the memory clocks (and beyond what measuring error allows), I'd say you've confirmed an "extra boost for being 1:1".
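As a side note, that pass/fail criterion can be sketched in a few lines of Python. All clocks and scores below are hypothetical placeholders, not real benchmark results:

```python
def scaling_check(mem_lo, mem_hi, score_lo, score_hi, error_pct=2.0):
    """True if the score gain exceeds the memory-clock gain beyond a
    measurement-error margin (here an assumed 2%), i.e. evidence of an
    effect beyond plain bandwidth scaling."""
    clock_gain = (mem_hi - mem_lo) / mem_lo * 100.0
    score_gain = (score_hi - score_lo) / score_lo * 100.0
    return score_gain > clock_gain + error_pct

# Hypothetical run: 200 -> 250 MHz is a 25% clock bump. A score bump of
# 25% or less means plain bandwidth scaling, not a 1:1 "golden ratio".
print(scaling_check(200, 250, 4000, 5000))  # prints False
```

If the function ever returned True on repeatable runs, that would be the "extra boost" worth investigating further.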

Reply 83 of 83, by appiah4

User metadata
Rank l33t++
agent_x007 wrote on 2026-02-12, 19:10:
shevalier wrote on 2026-02-12, 10:35:

Perhaps.
Or perhaps not.
We need to ask agent_x007 to conduct some experiments; he has an indecent number of FX Ultra graphics cards. 😀
On my ‘office-level’ FX5500, I can come to conclusions that will make everyone cry.

I agree with havli: there is no "extra boost" from just being 1:1.
You simply get extra performance from additional bandwidth (adjusted to the data-starvation level of the GPU, and limited by the performance of your platform).

I'm not sure how I can test this to satisfy your curiosity, though.
Just run the memory at 200 MHz or 125 MHz, and then again at 250 MHz with the same GPU clock.
If the score gain at 250 MHz is bigger in % than the % difference between the memory clocks (and beyond what measuring error allows), I'd say you've confirmed an "extra boost for being 1:1".

You still wouldn't be sure that the extra kick isn't from being bandwidth-starved in the first place. Ideally you would have to test performance at several steps below and above 1:1, then build two regression models: one where RAM speed is the only variable, and another where RAM speed and the RAM:GPU speed ratio are both variables. If you can't show a statistically significant difference between the two models, the 1:1 boost is placebo.
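For what it's worth, that nested-model comparison is a standard partial F-test. A minimal Python sketch (assuming numpy and scipy are available; the clock/score data is entirely synthetic, made up purely to illustrate the procedure):

```python
import numpy as np
from scipy import stats

# Synthetic benchmark scores vs. memory clock, GPU clock fixed at 300 MHz.
# The "true" model here is pure bandwidth scaling with no 1:1 bonus.
gpu = 300.0
mem = np.array([250.0, 270.0, 290.0, 300.0, 310.0, 330.0, 350.0])
noise = np.array([5.0, -3.0, 8.0, 2.0, -6.0, 4.0, -7.0])
score = 10.0 * mem + noise

def rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

ones = np.ones_like(mem)
at_ratio = (mem == gpu).astype(float)  # indicator for the exact 1:1 point

rss_small = rss(np.column_stack([ones, mem]), score)          # RAM speed only
rss_big = rss(np.column_stack([ones, mem, at_ratio]), score)  # + 1:1 term

# Partial F-test: does adding the 1:1 term significantly reduce residuals?
df1, df2 = 1, mem.size - 3
F = (rss_small - rss_big) / df1 / (rss_big / df2)
p = 1.0 - stats.f.cdf(F, df1, df2)
print(f"F = {F:.2f}, p = {p:.3f}")  # a large p means no evidence of a 1:1 boost
```

With real data you would collect several repeats per clock step, and only a small p-value (say, below 0.05) on the added ratio term would support a genuine 1:1 effect.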