VOGONS


First post, by Mondodimotori

Hello people, I have a small question.
I'm looking to get an FX5600 for a new Win9x system I'm building around Socket A (I'll make a thread about it when I have all the parts).
I've decided to go for this card because it can run on the 45.23 drivers and is currently slightly cheaper than a Ti4200 (of which I already have one, but it's on its way out...).

I've been eyeing several listings, and one thing I can't wrap my mind around are the ones with screenshots of HWiNFO and/or GPU-Z (sometimes even 3DMark2001 scores around 8800 or 9000), used to show both that the card works and its actual specs.
And here's the thing: lots of these pictures report weird clock values. All of them report the card to be a stock 5600, not the XT version nor the Ultra (the Ultras are hella expensive), but the clocks are off. Sometimes just on the GPU side, sometimes on both the GPU and the memory. And by off I don't mean pumped up (like some manufacturers might do), I mean lower than the nvidia reference, which should be 325 MHz on the core and 250 MHz on the RAM (500 MHz DDR effective). Like 250 on both the GPU and memory, or even barely 250 on the GPU and 200 on the memory.

People that own a similar card: is this normal? Like, did some vendors downclock both their cores and memory? Or are these fake cards dressed up, like 5600 XTs with the BIOS changed to report them as 5600s?

Resources on these cards from back in the day are pretty lackluster, and I can't wrap my head around all the vendors that manufactured them, sometimes with different cooler designs and even different PCB designs from the same vendor.

PS: if you were to choose, would you get one with electrolytic capacitors or polymer ones?

Reply 1 of 9, by Thandor

Usually GPU-Z and the nVidia drivers (run Coolbits for the overclocking tab) report the correct information with these cards. I think the original GeForce FX 5600s are fairly close to each other in terms of clocks. My XFX GeForce FX 5600 runs at 325/275, but several brands indeed use 325/250 as you mention. Of course there are exceptions, like the Albatron GeForce FX5600EP that utilises a ghastly slow 64-bit memory bus.

Once you obtain one, try benchmarking it yourself and compare results. Framerates don't lie 😉 (ahem, if I remember correctly the 44.03 driver that came on the driver CD of my XFX card contained "optimisations" that increased the framerates of 3DMark2001's Nature demo... good times).

When you dive into the GeForce FX5600XT it's going to be a mess. These cards came later, were positioned as budget parts (the regular 5600 was mainstream at launch) and got ugly as far as differences in clock frequencies and bus bandwidths go! The 64-bit versions in particular are terrible.
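
To put rough numbers on why the 64-bit versions hurt so much - a back-of-the-envelope sketch, using the 250 MHz (500 effective) memory clock mentioned above for the 128-bit card and assuming 200 MHz (400 effective) for a 64-bit example:

# rough peak memory bandwidth = bus width in bytes * effective DDR transfer rate
def bandwidth_gbs(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9  # bytes/s converted to GB/s

print(bandwidth_gbs(128, 500))  # 128-bit card at 500 MHz effective: 8.0 GB/s
print(bandwidth_gbs(64, 400))   # assumed 64-bit example at 400 MHz effective: 3.2 GB/s

Halving the bus width alone already halves the peak bandwidth, before any lower memory clock comes into play.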

[edit] I've found an old post of mine mentioning I got about 9000-9100 in 3DMark2001 with the default 325/275 clocks. Overclocking to 366/325 almost gets you into the 11000 ballpark. I guess I was running an Athlon XP at an overclocked 2.4GHz or 2.53GHz at the time, so the CPU wasn't really a bottleneck.

thandor.net - hardware
And the rest of us would be carousing the aisles, stuffing baloney.

Reply 2 of 9, by Mondodimotori

Thandor wrote on 2025-10-11, 16:34:

Usually GPU-Z and the nVidia drivers (run Coolbits for the overclocking tab) report the correct information with these cards. I think the original GeForce FX 5600s are fairly close to each other in terms of clocks. My XFX GeForce FX 5600 runs at 325/275, but several brands indeed use 325/250 as you mention. Of course there are exceptions, like the Albatron GeForce FX5600EP that utilises a ghastly slow 64-bit memory bus.

Once you obtain one, try benchmarking it yourself and compare results. Framerates don't lie 😉 (ahem, if I remember correctly the 44.03 driver that came on the driver CD of my XFX card contained "optimisations" that increased the framerates of 3DMark2001's Nature demo... good times).

When you dive into the GeForce FX5600XT it's going to be a mess. These cards came later, were positioned as budget parts (the regular 5600 was mainstream at launch) and got ugly as far as differences in clock frequencies and bus bandwidths go! The 64-bit versions in particular are terrible.

Oh no no, it's not mine, I still need to get one. I was talking about pictures from listings on eBay and the like. I can post pictures of those...

This is a Gainward card:

The attachment s-l16500.jpg is no longer available
The attachment s-l1600.jpg is no longer available

This is from an Asus V9560

The attachment s-l16d00.jpg is no longer available


Another Asus V9560, same model, same cooler, same PCB, on another listing reports 250.7 MHz for both the RAM and core clock.

This is an MSI one

The attachment s-l16sd00.png is no longer available
The attachment s-l1sds600.jpg is no longer available

I mostly look only at these kinds of listings, because I can confirm it's a 128-bit card.

Reply 3 of 9, by Mondodimotori

Thandor wrote on 2025-10-11, 16:34:

[edit] I've found an old post of mine mentioning I got about 9000-9100 in 3DMark2001 with the default 325/275 clocks. Overclocking to 366/325 almost gets you into the 11000 ballpark. I guess I was running an Athlon XP at an overclocked 2.4GHz or 2.53GHz at the time, so the CPU wasn't really a bottleneck.

Sorry, I didn't notice you had edited your post, so I still provided pictures from the listings.
What was bothering me was that they claim to be stock FX5600s, yet they report inaccurate readings in the screenshots.
Maybe this card has variable clock rates on both the GPU and memory, meaning it only peaks at 325/250 during rendering workloads?

I'm actually not interested in the XT version, both because it's slower and because it won't run on driver 45.23, meaning I would lose compatibility with some older games I wanted to play on a 9x system.

Reply 4 of 9, by agent_x007

Depending on the VRAM chips used, getting 250 MHz (500 effective) out of it may not be possible (that would be a 25% memory OC).
If the card has -5 memory, you can safely say it's not a 5600 but a 5600 XT - like the Gainward card.

It may not have "XT" in the name because the core is clocked higher, BUT you will never reach the usual 5600's performance on it.
In short: NV didn't put enough safeguards on the specs to prevent this.
FX 5600 with 250/200 (a "true XT"):

The attachment 3DMark 01SE.PNG is no longer available

FX 5600 with 325/200:

The attachment 3DMark 01SE.PNG is no longer available

Reply 5 of 9, by Mondodimotori

agent_x007 wrote on 2025-10-12, 12:27:

Depending on the VRAM chips used, getting 250 MHz (500 effective) out of it may not be possible (that would be a 25% memory OC).
If the card has -5 memory, you can safely say it's not a 5600 but a 5600 XT - like the Gainward card.

What do you mean by "-5 memory"?
Also, yeah, this is what I feared: some cards go by the name of 5600, but they are 5600 XTs in disguise. And since I plan to use it with the 45.23 driver, I have to avoid the XT models.
I know I could avoid the whole ordeal by just getting a Ti4200, but those are soaring in price, and the one I already have is dying.

Reply 6 of 9, by Thandor

Mondodimotori wrote on 2025-10-11, 22:01:
Thandor wrote on 2025-10-11, 16:34:

[edit] I've found an old post of mine mentioning I got about 9000-9100 in 3DMark2001 with the default 325/275 clocks. Overclocking to 366/325 almost gets you into the 11000 ballpark. I guess I was running an Athlon XP at an overclocked 2.4GHz or 2.53GHz at the time, so the CPU wasn't really a bottleneck.

Sorry, I didn't notice you had edited your post, so I still provided pictures from the listings.
What was bothering me was that they claim to be stock FX5600s, yet they report inaccurate readings in the screenshots.
Maybe this card has variable clock rates on both the GPU and memory, meaning it only peaks at 325/250 during rendering workloads?

I'm actually not interested in the XT version, both because it's slower and because it won't run on driver 45.23, meaning I would lose compatibility with some older games I wanted to play on a 9x system.

The ASUS card you mentioned seems fine. The GeForce FX runs at a lower clock in 2D mode. Perhaps sometimes the 2D clocks are shown.

Mondodimotori wrote on 2025-10-12, 15:15:
agent_x007 wrote on 2025-10-12, 12:27:

Depending on the VRAM chips used, getting 250 MHz (500 effective) out of it may not be possible (that would be a 25% memory OC).
If the card has -5 memory, you can safely say it's not a 5600 but a 5600 XT - like the Gainward card.

What do you mean by "-5 memory"?
Also, yeah, this is what I feared: some cards go by the name of 5600, but they are 5600 XTs in disguise. And since I plan to use it with the 45.23 driver, I have to avoid the XT models.
I know I could avoid the whole ordeal by just getting a Ti4200, but those are soaring in price, and the one I already have is dying.

The RAM chips are rated for a certain clock frequency. 5 ns means the chip is rated for 200 MHz (1000/5). If you see cards with -5NS or -TC50 chips, they are rated for a 200 MHz maximum clock (anything above means overclocking). If you have a card with 3.6 ns chips you can run 277 MHz (and thus 275 MHz safely).
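
If it helps to see that rule of thumb written out, here's a quick sketch (the 5 ns and 3.6 ns ratings are just the generic examples from above, not readings from any particular listing):

# memory access time in ns -> rated clock in MHz
def rated_mhz(ns):
    return 1000 / ns

print(rated_mhz(5.0))   # -5 / -TC50 chips: 200 MHz rated
print(rated_mhz(3.6))   # -3.6 chips: ~277 MHz rated

# running -5 chips at the 5600's reference 250 MHz would already be a 25% overclock
print((250 / rated_mhz(5.0) - 1) * 100)  # -> 25.0

Which also lines up with the 25% figure agent_x007 mentioned earlier in the thread.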

thandor.net - hardware
And the rest of us would be carousing the aisles, stuffing baloney.

Reply 7 of 9, by Mondodimotori

Thandor wrote on 2025-10-12, 17:13:

The ASUS card you mentioned seems fine. The GeForce FX runs at a lower clock in 2D mode. Perhaps sometimes the 2D clocks are shown.

That's also what I thought, since the memory clock is in line with nvidia's stock clocks and only the GPU clock is lower.

Thandor wrote on 2025-10-12, 17:13:

The RAM chips are rated for a certain clock frequency. 5 ns means the chip is rated for 200 MHz (1000/5). If you see cards with -5NS or -TC50 chips, they are rated for a 200 MHz maximum clock (anything above means overclocking). If you have a card with 3.6 ns chips you can run 277 MHz (and thus 275 MHz safely).

OK, so I just need to make out the chip markings in the pictures and look up their specs on the internet, got it.

I think I may get the Asus card at this point, since it also has polymer capacitors; those should be more reliable over the long run than electrolytic ones.

Reply 8 of 9, by Thandor

Mondodimotori wrote on 2025-10-12, 17:25:
Thandor wrote on 2025-10-12, 17:13:

The ASUS card you mentioned seems fine. The GeForce FX runs at a lower clock in 2D mode. Perhaps sometimes the 2D clocks are shown.

That's also what I thought, since the memory clock is in line with nvidia's stock clocks and only the GPU clock is lower.

I think the 270 MHz is the 2D clock frequency. It should increase to 325 MHz when running 3D applications.

thandor.net - hardware
And the rest of us would be carousing the aisles, stuffing baloney.

Reply 9 of 9, by Mondodimotori

Thandor wrote on 2025-10-12, 17:48:

I think the 270 MHz is the 2D clock frequency. It should increase to 325 MHz when running 3D applications.

Yup, I've read somewhere in some period reviews that this card has variable clocks. I was just looking for some more confirmation, since those reviews are 20+ years old.
I guess I'll try and get one of those ASUS cards with polymer capacitors.