VOGONS


GeForce FX Thread


Reply 160 of 259, by Pippy P. Poopypants

Rank: Member

I own a card based on almost every GeForce chip, except the FX (the missing link in my collection). Our old CAD machines back in high school had FX 5200s (before we upgraded to them, we were using Permedia 3s and Rage 128s). They served their purpose okay for running 3ds Max, but whenever we played games during our lunch breaks they were slow as all hell, and as usual, HL2 defaults to the DirectX 8.1 rendering mode for these things. Funny how I almost bought one of these because I thought it would be an "upgrade" for my Ti 4200, until I did some reading around. They were cheap for a reason, after all.

GUIs and reviews of other random stuff


Reply 161 of 259, by swaaye

Rank: l33t++

The FX 59x0 cards are the only ones even worth considering, but even they have a hard time against a lowly Radeon 9600 in some games.

Let's not even ponder the inefficiency there 🤣

Reply 162 of 259, by Tetrium

Rank: l33t++
swaaye wrote:

The FX 59x0 cards are the only ones even worth considering, but even they have a hard time against a lowly Radeon 9600 in some games.

Let's not even ponder the inefficiency there 🤣

Does that include the 5600's?
Wouldn't they be roughly equal to the midrange GF4's?

Those cards aren't the ones I'm willing to pay big bucks for anyway, but if I can get one for under €5, I'll consider buying it! 🤣 (except for the Radeon 9600's, those are nice though I already own 2 or 3 😀 )

Also, I've never really been a fan of the high-end cards myself. Sure, their performance is relatively good, but I'd rather stick to a midrange card that's 2 gens newer 😀

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 163 of 259, by swaaye

Rank: l33t++

The mid-range FX cards are so disappointing... useless for any DX9. Only the 5900 can handle DX9 remotely well. But yeah, the others do OK in DX8 and GL.

And yeah, of course the 6 series and later are clearly superior.

Reply 164 of 259, by Pippy P. Poopypants

Rank: Member
Tetrium wrote:
swaaye wrote:

Let's not even ponder the inefficiency there 🤣

Does that include the 5600's?
Wouldn't they be roughly equal to the midrange GF4's?

If this were back in 2003/04, I'd take a GF4 (Ti) over those in a heartbeat. They may have been competitive with the GF4 in DX8/OpenGL games, but for the FX's original intended purpose (i.e. DX9 games), they sucked.

One thing to mention, though: ATI had to trim the Radeon 9600 down to a 64-bit bus (Radeon 9600 SE) before it dropped to the performance level of a 128-bit FX 5200 🤣
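For a rough sense of scale, here's a minimal sketch of the memory bandwidth involved; the effective memory clocks (~400 MT/s for both a typical 9600 SE and a 128-bit FX 5200) are assumptions from memory, so adjust them for the actual boards:

```python
# Back-of-the-envelope memory bandwidth = bus width (bytes) * effective transfer rate.
# The ~400 MT/s effective DDR clocks below are assumed typical values, not measured.
def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    return bus_width_bits / 8 * effective_mt_s / 1000  # GB/s

cards = {
    "Radeon 9600 SE (64-bit)":   (64, 400),
    "GeForce FX 5200 (128-bit)": (128, 400),
}
for name, (bus, clock) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bus, clock):.1f} GB/s")
# Roughly 3.2 GB/s vs 6.4 GB/s -- the halved bus is what drags the 9600 SE
# down to the level of a (much weaker per clock) 128-bit FX 5200.
```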

Reply 165 of 259, by Tetrium

Rank: l33t++
Pippy P. Poopypants wrote:
Tetrium wrote:

Does that include the 5600's?
Wouldn't they be roughly equal to the midrange GF4's?

If this were back in 2003/04, I'd take a GF4 (Ti) over those in a heartbeat. They may have been competitive with the GF4 in DX8/OpenGL games, but for the FX's original intended purpose (i.e. DX9 games), they sucked.

One thing to mention, though: ATI had to trim the Radeon 9600 down to a 64-bit bus (Radeon 9600 SE) before it dropped to the performance level of a 128-bit FX 5200 🤣

Now, the bold part is what really matters to me.
Personally I don't really look at what it was intended to be, but more "what can it do right now?". If it's equally good to the GF4, then to me they are equal (when it comes to their practical use when I put one in a rig of mine).

Though, iirc I don't own a single FX card at this time... at least I think I don't 😜

Edit: I agree about the FX 5200's though; they seem to have very slim practical use these days.
What's special about the 5200's is that they sucked then, and still suck today!
So different from those S3 ViRGE graphics decelerators back in the day... they used to suck back then, but these days they are of more use because of their good compatibility with older rigs (think 486).

Back in the day, the Voodoo 5 was reviewed as being too little, too late; the GF2's were simply the better deal... but look at it from today's point of view, and it's clear which of the two is getting more attention!
Even the GF4 MX's (which are really scaled-down GF2's) are of more use, because of better backward compatibility compared to the 5200's (correct me if I'm wrong).

And some cards used to be great when they were new, and are still great today.

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 166 of 259, by sgt76

Rank: Oldbie

The GF4 is a godly card! My 4600Ti can handle Far Cry, HL2, and NFS: MW at 1024x768 (DX8 of course) nearly as fast as my 6800GS (in DX9). You lose some features, but it's not like the games look like shit in DX8 or anything.

It was the G92 of its day.

Reply 167 of 259, by swaaye

Rank: l33t++

I guess when it comes to DX8/GL and GF4 vs 5, the question is which has the better texture filtering (aniso) and anti-aliasing. My guess is that GF5 should be a bit better, assuming you use drivers that aren't too cheat-laden.

I'm not sure which driver revisions are best. Probably not any from during the R350 vs NV35 times though! Maybe later or earlier drivers. I know from experience that newer drivers fix DX9 bugs because I've seen it firsthand in Guild Wars.

In fact, GW is one game where you can see the FX 5900 rock a 9600 because of all its bandwidth and fillrate. The 5900 can run 4x AA better than the 9600 as long as you aren't loading it down with too many shader effects. Unfortunately, ATI's AA is superior due to gamma correction and the ability to run up to 6x. 😀

Reply 168 of 259, by Tetrium

Rank: l33t++

Lol, I can tell from experience that GW runs perfectly fine on a Radeon 9600; even dual-accounting was possible 😜

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 169 of 259, by unmei220

Rank: Member

Just a quick question for those more familiar with the FX series...
I have a Gainward 5900XT (Ultra 1100XP) and also an ASUS 5700 (V9570), both with 128 MB. The 5900XT runs at 450 MHz core, 780 MHz memory. The 5700 runs at 425 MHz core, 500 MHz memory.
I tested both in the same machine: a Pentium 4 Northwood at 2400 MHz, running in an MSI 661FM-L (SiS 661FX chipset) with 512 MB RAM, under WinXP. Clean install.
The "problem" is, both cards perform about the same. Under 3DMark01, the 5900XT scores 10743 points and the 5700 scores 9445. According to reviews from the time, the 5900XT should be getting 17k+ points and the 5700 should be getting 15k+. Both were tested using the 53.03 drivers.
Is the P4 bottlenecking the cards, or are those numbers normal?

More details (all results are fps):
Car chase low:
5900XT - 147.4
5700 - 157.4

Car chase high:
5900XT - 46.6
5700 - 47.9

Dragothic low:
5900XT - 210
5700 - 152.8

Dragothic high:
5900XT - 116
5700 - 94

Lobby low:
5900XT - 138.7
5700 - 136

Lobby high:
5900XT - 58.3
5700 - 60

Nature:
5900XT - 68.2
5700 - 47.4
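A quick sketch over those numbers (nothing assumed beyond the results above) makes the pattern easier to see: most scenes are basically flat between the two cards, and only Dragothic and Nature give the 5900XT a 20-45% lead, which is what you'd expect if the P4 is the limit in the lighter scenes:

```python
# Ratio of 5900XT to 5700 fps per 3DMark01 test, using the results posted above.
results = {
    "Car chase low":  (147.4, 157.4),
    "Car chase high": (46.6, 47.9),
    "Dragothic low":  (210.0, 152.8),
    "Dragothic high": (116.0, 94.0),
    "Lobby low":      (138.7, 136.0),
    "Lobby high":     (58.3, 60.0),
    "Nature":         (68.2, 47.4),
}
for test, (xt, fx5700) in results.items():
    print(f"{test:15s} 5900XT/5700 = {xt / fx5700:.2f}x")
# Around 1.0x in most scenes, 1.2-1.4x only in Dragothic and Nature --
# consistent with the P4 2.4 GHz capping both cards in the other tests.
```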

Reply 171 of 259, by unmei220

Rank: Member

Thanks for the reply. Any suggestions for another benchmark that isn't too CPU-sensitive? That's the fastest CPU I have right now that works in an AGP mobo. I just want to make sure the 5900XT works OK. It doesn't seem right to me for it to be so close in performance to the 5700. There should be a bigger difference between them, right?

Reply 172 of 259, by swaaye

Rank: l33t++

3DMark03 is less CPU-dependent, particularly with an FX card. 🤣

The two cards are probably going to perform fairly close in practice. The 5900XT has considerably more bandwidth but its fillrate is similar. If you enable 4X MSAA the 5900 may distance itself. Bandwidth doesn't help with the FX series' shader performance though.
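A minimal sketch of that fillrate-vs-bandwidth point, using the clocks unmei220 posted; the 4-pipe configuration and the 256-bit vs 128-bit bus widths are assumptions from memory for NV35/NV36, so treat the output as ballpark only:

```python
# Rough theoretical pixel fillrate and memory bandwidth for the two cards above.
# Assumed: 4 pixel pipes on both; 256-bit bus on the 5900XT, 128-bit on the 5700.
def fillrate_mpix_s(core_mhz, pipes=4):
    return core_mhz * pipes                     # Mpixels/s

def bandwidth_gb_s(mem_mt_s, bus_bits):
    return mem_mt_s * bus_bits / 8 / 1000       # GB/s

for name, core, mem, bus in [("5900XT", 450, 780, 256), ("5700", 425, 500, 128)]:
    print(f"{name}: ~{fillrate_mpix_s(core)} Mpix/s, ~{bandwidth_gb_s(mem, bus):.1f} GB/s")
# ~1800 vs ~1700 Mpix/s (nearly identical fill), but ~25 vs ~8 GB/s of bandwidth --
# which is why the gap should only open up once MSAA starts eating bandwidth.
```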

Didn't people use to flash the XT with 5900 Ultra BIOSes? That might be pushing it, but a plain 5900 BIOS seems likely to work. The only difference between the XT and the regular 5900 appears to be clock speed.

Reply 173 of 259, by sliderider

Rank: l33t++
swaaye wrote:

3DMark03 is less CPU-dependent, particularly with an FX card. 🤣

The two cards are probably going to perform fairly close in practice. The 5900XT has considerably more bandwidth but its fillrate is similar. If you enable 4X MSAA the 5900 may distance itself. Bandwidth doesn't help with the FX series' shader performance though.

Didn't people use to flash the XT with 5900 Ultra BIOSes? That might be pushing it, but a plain 5900 BIOS seems likely to work. The only difference between the XT and the regular 5900 appears to be clock speed.

http://www.nforcershq.com/geforce-fx5900-to-f … ltra-flash-mod/

Reply 174 of 259, by unmei220

Rank: Member

I had that in mind, but decided against it so as not to ruin my card, because I don't know which BIOS would be suitable for it. I think the OC the card already has puts it close to 5950 speeds...

Reply 176 of 259, by sliderider

Rank: l33t++
sgt76 wrote:

The GF4 is a godly card! My 4600Ti can handle Far Cry, HL2, and NFS: MW at 1024x768 (DX8 of course) nearly as fast as my 6800GS (in DX9). You lose some features, but it's not like the games look like shit in DX8 or anything.

It was the G92 of its day.

I still remember the days when the GeForce 4 was released and the 4600/4800 Ti models were the cards you simply HAD to have for serious gaming, until the Radeon 9700 came along and kicked them to the curb. The 4 Ti's were still better for DX8 games than the low- to mid-range FX cards, up to about the FX 5700 LE.

Reply 177 of 259, by jmrydholm

Rank: Member

I have a low-end FX 5200 in my XP tower, and an FX 5900 Ultra sandwiched between two massive heatsinks with heatpipes. The original fan died, so I installed those back in 2005 and overclocked the card. I have the 5900 on standby, as I need room in the XP tower for two Voodoo 2's running in SLI mode (the heatsink is just too darn big). I use the 5200 for basic 2D stuff.

"The height of strategy, is to attack your opponent’s strategy” -Sun Tzu
“Make your fighting stance, your everyday stance and make your everyday stance, your fighting stance.” - Musashi
SET BLASTER = A220 I5 D1 T3 P330 E620 OMG WTF BBQ

Reply 178 of 259, by swaaye

Rank: l33t++
sliderider wrote:

until the Radeon 9700 came along and kicked them to the curb.

That's how it worked in D3D games, but ATI was well behind in OpenGL until around R600.

Just about anything that uses OpenGL runs better on NV for various reasons. It's partly developer laziness and partly ATI laziness. Their OpenGL driver was a mess until sometime during the X1000 era when it was rewritten from scratch.

Doom 3, KOTOR, NWN, etc. are all better played on a GF4 than even a Radeon 9700. The Radeon 1 through 8500 were awful in OpenGL games, and not so hot in D3D for that matter either 🤣

Apparently the Radeon 8500 is less capable than a GF3 in some cases in OpenGL, and this is why KOTOR has problems. You can see something's amiss by how it only supports OpenGL 1.3, whereas GF3/4 are OpenGL 1.4.
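If anyone wants to check what their own card and driver actually report, here's a minimal sketch using PyOpenGL with GLUT (assuming both are installed; the window is only created because glGetString needs a live GL context):

```python
# Query the OpenGL version/renderer string the driver exposes (needs PyOpenGL + GLUT).
from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER, GL_VERSION
from OpenGL.GLUT import glutInit, glutInitDisplayMode, glutCreateWindow, GLUT_RGB

glutInit()
glutInitDisplayMode(GLUT_RGB)
glutCreateWindow(b"GL version check")   # glGetString returns nothing without a context

print("Vendor:  ", glGetString(GL_VENDOR).decode())
print("Renderer:", glGetString(GL_RENDERER).decode())
print("Version: ", glGetString(GL_VERSION).decode())
# A GF3/GF4 with late drivers should show 1.4.x here; an 8500 stops at 1.3.
```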

Reply 179 of 259, by swaaye

User metadata
Rank l33t++
Rank
l33t++

I busted out the 6600GT that I got for free the other day and did some comparisons between it and my 5900 Ultra. 😉

I only tried out Doom 3 and KOTOR, but the two cards really felt rather comparable on my Athlon 64 2.2 GHz. The 6600GT has a major bandwidth deficit though (less than half in this card's case), and this should show up with MSAA.

Obviously if you play anything that uses PS 2.0 to any extent then the 6600GT will just dust the 5900. I giggle at the memories of the 5900 Ultra trying to run Oblivion at 640x480 and only managing 10-15 fps.