VOGONS


Geforce FX Thread


Reply 20 of 259, by bushwack

Rank: Oldbie
Tetrium wrote:

Oh yes, I kinda remember this 😜

And anyone remember this?
http://www.youtube.com/watch?v=WOVjZqC1AE4

No, it's a wonder, but it all seems very true. 🤣

I had a GeForce FX 5900 XT 128MB back when they were released. Good bang for the buck, but it was near the end of the FX line. It was relatively quiet and proved a good bit more powerful than the Radeon 8500 it replaced.

Reply 22 of 259, by F2bnp

Rank: l33t

Hahahaha, I just remembered when I first tried to run Oblivion on that thing. The system requirements supposedly supported the card, but damn, even on the then lowest possible settings, Oblivion was simply unplayable. We're talking about 5 fps here! Thank God for OldBlivion; it served me well until Bethesda decided to release a patch that unlocked a new graphics detail mode, Ultra Low, which allowed the game to run nicely, if really, really ugly. I got a 7600GS not much later, so that was the end of that.
Still, the FX series wasn't pathetic, it just wasn't anywhere near as competitive as the ATi Radeon 9xxx. I seem to remember that they rocked for everything except DX9, so everything that used Shader Model 2 was really crap on the GeForce FX.

Reply 23 of 259, by leileilol

Rank: l33t++

I thought the GeForce FX sucked at non-SM2/DX9 stuff too. I recall the GeForce2 GTS stomping it in 3DMark and Quake 3...

In some extremely slow situations, things only start to get playable at 320x240 X_X. I had Prey going at 60fps+ at that res together with a cvar that made it use the older unsupported "NV20 path". This was on a 5200, NON-ULTRA. 🤣
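
For anyone wanting to try the same trick: in idTech 4 games the render backend can usually be forced from the console or autoexec.cfg. I'm going from memory here and exact cvar support varies by game, but the Doom 3 form is something like:

    seta r_renderer "nv20"    // use the NV20 fragment path instead of ARB2
    vid_restart               // restart the renderer so the change takes effect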

long live PCem

Reply 24 of 259, by swaaye

Rank: l33t++

The 5800 Ultra was better than the 9700 Pro in most OpenGL games (Quake 3 based, of course). And there are some games, like KOTOR and NWN, that were essentially built for NVIDIA OpenGL and ran relatively poorly on ATI hardware.

DirectX 8 games ran pretty well, but I think the 9700 Pro usually pulled ahead there. DirectX 7 stuff would all have run fast as hell anyway.

Good reviews
http://www.anandtech.com/show/1062/6
http://techreport.com/articles.x/4966/7

Doom3 runs surprisingly well on an FX 5900 too. Really, it runs exceptionally well. NVIDIA had an edge in a few things with Doom3, including the shadow rendering and fillrate optimization techniques. ATI's Hierarchical Z breaks in Doom3. This is probably the best-running advanced game for an FX card.

Numbers:
http://techreport.com/articles.x/7153/3

Last edited by swaaye on 2011-02-10, 00:21. Edited 1 time in total.

Reply 26 of 259, by RogueTrip2012

Rank: Oldbie

Wow, no love for the 5700 Ultra 128MB. I had an EVGA one that I eventually sold to my boss, who still uses it (he's not a gamer). Also sold him the motherboard and Prescott too! 🤣

I remember that I built that with a Pentium 4 3.0 w/HT Prescott and a stick of 512MB. It sucked. Like others, I had to knock HL2 down to DX8 rendering for decent performance. Need For Speed Underground had hitching issues even after I upgraded from 512MB to 1GB. Doom 3 performance was underwhelming, and Quake 4 was barely playable at 800x600, and that was mainly due to the P4!

This makes me want to go play Underground on my Phenom II X4 940BE now, 🤣. I always wondered if the hitching/stutter would go away. I'm sure even my Pentium 4 3.2GHz with a 7800GT PCI-E would kick its butt.

> W98SE . P3 1.4S . 512MB . Q.FX3K . SB Live! . 64GB SSD
>WXP/W8.1 . AMD 960T . 8GB . GTX285 . SB X-Fi . 128GB SSD
> Win XI . i7 12700k . 32GB . GTX1070TI . 512GB NVME

Reply 27 of 259, by swaaye

Rank: l33t++

I see that with the 5900 sometimes too. It seems to have a somewhat unstable framerate. It will get choppy and then start to hitch, as you call it. It's as if it gets extra overwhelmed for a moment and then recovers. Oblivion really brings this out.

Reply 28 of 259, by Putas

Rank: Oldbie

It was not as disastrous as the sensationalists make it look. Some of the failure can supposedly be blamed on the manufacturer UMC, which couldn't get its 130 nm process as good as promised. The original 5800 was at first laughed at because of its noisy cooling, but that could be, and was, easily corrected. The PS 2.0 performance is awful partly because NVIDIA went with 32-bit precision instead of DX9's 24 bits. But there were only a handful of games in its lifespan that stressed the shaders enough to really get obliterated by the Radeons. And you always had the option to use a lower PS profile. With older PS profiles and without AA, the FX line was usually faster than ATi's.
The tweaked chips of the 5700 and 5900 had the half-precision units replaced with full-precision ones, doubling the full-precision power per clock. Cheap XT variants came out and became very popular for their overclockability.
Doom runs great because the FX chips have double the Z-rate of the Radeons. Also, id dropped any ATi optimizations after the alpha leak, when the R300 was alone in the field. They were implemented later, and the R3x0 got significant boosts. The HL2 story of dropping partial-precision support is notorious.
So I say you could still have gamed well on FX cards. Anyway, the graphics drama of 2004 may never be matched.
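
To make the precision point concrete: DX9 PS 2.0 asks for at least FP24, while NV3x runs FP32 slowly but FP16 ("partial precision") at roughly double the rate, and the quality cost of FP16 is rounding error piling up across dependent operations in a long shader. Here's a toy sketch of the idea in C; the rounding model is simplified (mantissa rounding only, no exponent limits or denormals) and the numbers are made up, not from any real shader:

    #include <stdio.h>
    #include <math.h>

    /* Round x to 'bits' explicit mantissa bits (plus the implicit 1):
       FP16 = 10, DX9's minimum FP24 = 16, FP32 = 23. */
    static double quantize(double x, int bits)
    {
        if (x == 0.0) return 0.0;
        int e;
        double m = frexp(x, &e);            /* x = m * 2^e, 0.5 <= |m| < 1 */
        double s = ldexp(1.0, bits + 1);    /* 2^(bits+1) */
        return ldexp(round(m * s) / s, e);
    }

    int main(void)
    {
        const int  bits[] = { 10, 16, 23 };
        const char *name[] = { "FP16", "FP24", "FP32" };

        double ref = 0.1;                   /* exact chain in double */
        for (int i = 0; i < 8; i++)
            ref = ref * 1.37 + 0.013;

        for (int f = 0; f < 3; f++) {
            double v = 0.1;
            for (int i = 0; i < 8; i++)     /* round after every op, */
                v = quantize(v * 1.37 + 0.013, bits[f]);
            printf("%s: %.9f (error vs double: %.2e)\n",
                   name[f], v, fabs(v - ref));
        }
        return 0;
    }

The FP16 error comes out orders of magnitude larger than FP24/FP32, which is why demoting everything to partial precision was fast on the FX but could show up as banding.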

Reply 29 of 259, by keropi

Rank: l33t++

Had a 256MB 5600 card back then... worked fine for me 😜. My next upgrade was an 8800GTX, so as you can see I kept it for some time 🤣

🎵 🎧 PCMIDI MPU, OrpheusII, Action Rewind, Megacard and 🎶GoldLib soundcard website

Reply 30 of 259, by F2bnp

Rank: l33t
Putas wrote:
It was not as disastrous as the sensationalists make it look. Some of the failure can supposedly be blamed on the manufacturer […]

Point is, ATi looked better and ran faster at the same settings.
But these things always happen; ATi has also released at least one series of graphics cards that wasn't up to it. It's actually a cycle: one company holds the top, and after a year or so they drop off and the other company takes the top, and then vice versa.

Reply 31 of 259, by Mau1wurf1977

Rank: l33t++

Wasn't there an issue with the memory? ATI using DDR and NVIDIA using DDR2, or something like that?

Or was it regarding the width of the memory bus? I can't remember the details...

And of course we have to mention the fan. AKA the DUSTBUSTER!

Reply 33 of 259, by Mau1wurf1977

Rank: l33t++
F2bnp wrote:

I think the Ati 4850 is louder actually

Not even close! I had a 4850 and later a 5770, and those cards are silent compared to the GeForce FX.

NVIDIA even made a comedic video about the Dustbuster; very entertaining. There are also videos on YouTube giving you an idea of just how loud that cooler was.

Reply 34 of 259, by Tetrium

Rank: l33t++
Mau1wurf1977 wrote:

There are also videos on YouTube giving you an idea of just how loud that cooler was.

Yes...just terrible!

When the video started I was like "Omg that cooler is LOUD!"...and THEN the GF FX cooler kicked in 🤣!

Reply 35 of 259, by swaaye

Rank: l33t++
Putas wrote:

With older PS profiles and without AA, the FX line was usually faster than ATi's.

Doom runs great because the FX chips have double the Z-rate of the Radeons. Also, id dropped any ATi optimizations after the alpha leak, when the R300 was alone in the field. They were implemented later, and the R3x0 got significant boosts.

The 5800U tended to either match or lose to the 9700 Pro in everything except OpenGL games.
http://techreport.com/articles.x/4966/8

Doom3 is probably the game best tailored to the FX 5900 and 5700. I don't know how well the 5800 and 5600 do, though, because they are considerably different architecturally. ATI actually had some handicaps in that game because it broke their hierarchical Z optimizer. ATI's R300/R400 do have double-rate Z as well, but you must have 2X MSAA enabled.

I'm not sure how much complex pixel shading the game does, because no matter what you do for optimizations, an FX card cannot match the R300 when it comes to floating-point pixel shading.
http://alt.3dcenter.org/artikel/cinefx/index_e.php

Also, initial tests of Doom3 on R300/R400 will turn up slower results than later ones, because ATI made a number of optimizations to their drivers over time.

Reply 36 of 259, by Iris030380

Rank: Member

I think they got too hard a time. The FX 5800 and the FX 5800 Ultra were noisy beasts, but they FLEW in DirectX 8 and OpenGL games. The FX 5600 Ultra was on a par with a GF4 Ti 4600 in DX8, with the added bonus of being a cheap mid-range card, and it could stagger through 3DMark 2003 half respectably. The 5200 was a bad joke as far as games went, but it was a low-end card... OK, I can't defend it, the 5200 was SHITE.

Of course, in the end the later revisions of the 5950 Ultra were amazing: the fastest of its generation, but way too late in arriving. What killed the FX series (apart from the 5200) was ATI's cards, specifically the early 9500 and the 9700 Pro. They punished NVIDIA without mercy. But let's be honest: NVIDIA had it pretty good up until then, from the GF2 Ultra through the GF3 and GF4 Ti series, with ATI limping along crying all the time.

Having said all that, I have a 5950 Ultra and wouldn't swap it for a 9800 XT even if you threw in a bag of Skittles, 2 conkers and a KitKat.

I just avoid DirectX 9 games like the plague.

Reply 37 of 259, by Mau1wurf1977

Rank: l33t++
Iris030380 wrote:

I just avoid DirectX 9 games like the plague.

To be fair, many DX9 games struggled on the Radeon 9800 as well, e.g. Doom3 or Far Cry.

Those games really needed a 6800GT or X800.

I miss those days, when the next generation had double the shaders, not like now, when the improvements are quite minimal.

Reply 38 of 259, by leileilol

Rank: l33t++

Far Cry didn't struggle on the R9800 for me. The 6800 hadn't even come out yet when Far Cry was released 😀

Doom3's an OpenGL game, and at a minimum it actually requires DirectX 6... it doesn't use GLSL, which is why the FX can run it without too much of a fault. Introduce OpenGL 2.0 to the FX series and they just go beyond turtle mode.
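
To illustrate the distinction: Doom3's ARB2 path hands the card low-level ARB_fragment_program assembly rather than GLSL. A trivial fragment program of that flavor looks something like this (not from Doom3, just the general shape):

    !!ARBfp1.0
    # modulate the bound texture by the interpolated vertex color
    TEMP texel;
    TEX texel, fragment.texcoord[0], texture[0], 2D;
    MUL result.color, texel, fragment.color;
    END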

long live PCem

Reply 39 of 259, by swaaye

Rank: l33t++

I tried Oblivion, HL2 DX9, and FEAR on my overclocked 5900. HL2 in DX9 takes something like a 60% framerate hit over DX8. FEAR and Oblivion are hardly playable at 640x480 with medium details. And that card was meant to compete with the 9800 Pro/XT, which completely wipes the floor with it. We're talking several times faster here.

Even Far Cry is pretty nasty on the 5900, and I think the game uses a mixed PS 1.x/2.0 mode with those cards.

Last edited by swaaye on 2011-02-12, 01:53. Edited 1 time in total.