swaaye wrote:
The OpenGL.org forum had some people who really liked the 5200 as a cheap D3D9 development experimentation platform. An interesting angle on that chip.
For OS X too - the FX 5200 supports CoreImage and in some cases this means significant performance improvements over the GF4/Radeon 9000 that predated it. Other DX9-era cards were certainly also well regarded for this, but the FX 5200 was generally the cheapest and simplest option to provide CoreImage support for a Mac.
swaaye wrote:I recall the 5900XT being perceived as a pretty solid value at the time, probably the best for the whole FX lineup. It's just a 5900 Ultra with lower clock speeds.
Half the RAM (and at somewhat lower speeds) too - and many cards will make the jump from 390-400MHz to 450MHz quite gracefully (mine will do around 550 😲 ), but taking the RAM from 700 to 850MHz is usually not feasible. The 5900XT was also one of the last FX cards to be released; I remember reading theories back in '03-'04 that the XT was potentially just a way to eat up unsold NV35 dies (especially since many of the cards could do ~450MHz). There's a "vanilla" 5900 that sits in between those two, which was a ~$250-$300 (release SRP) part as well, but I don't think they were very popular (judging by how uncommon they are on eBay these days, and by the fact that I don't remember hearing much about them or seeing them in reviews all that often). If I remember right, its clocks are much closer to the 5900 Ultra's, at something like 400/800.
The games of the time were still mostly a mix of D3D7/8, and the FX cards are fine with those. It wasn't overly apparent for a while that the FX cards were really terrible at PS2.0. Some developers did tailor their games for the FX cards too - Far Cry and Doom 3 run pretty well.
I would add that the whole "Half-Life 2 Controversy" and Radeon 9800 vs GeForce FX 5900 blood feud seems to have gotten fiercer and more contentious as time has gone on - I don't honestly remember so much drama about this "back in the day" nor do I remember Half-Life 2 being such a singular focus for many people. FPS gaming is not the entire scope of gaming, nor is Half-Life 2 the entire scope of FPS. Until GeForce FX is mentioned, and then it's just a non-stop barrage of how badly NV30 (actually usually NV35 - I have never seen a published online review of NV30 itself in Half-Life 2) does in Half-Life 2 and why GeForce FX is the worst thing since the draft and human sacrifice.
If you liked Bioware OpenGL games, you didn't really want to be on ATI back then. That's another thing to consider when it comes to perceptions here. Getting KOTOR or NWN working perfectly on ATI was a trick.
NWN1 requires/uses palettized textures (I say slash because I've read some reports that the Diamond edition removed this requirement, but I'm not certain of that), which R300 doesn't support. I'm not sure about KOTOR. Catalyst 3.x were largely not great drivers to live with for other reasons too - fog table is broken/not supported, multi-monitor support is fairly limited, and IIRC there are issues with MPEG-decode acceleration (it isn't as plug-and-play as PureVideo, from what I remember), etc. Later releases, like Catalyst 8.x, certainly work a lot better - but those weren't available for many years, and for Windows 9x some fixes never came (e.g. fog table). From a more "Vogons-centric" perspective, the GeForce FX (and GeForce 4) have an edge over the Radeon 9x00 series due to drivers and hardware compatibility (e.g. they're all universal AGP cards), and the DX9 performance question is largely a non-issue, because like people in 2005 we too can grab a GeForce 7800 GTX (or similar), or move up to something faster altogether (e.g. the last time I played Half-Life 2 it was on a GeForce GTX 660, at 1080p with HDR and maxed-out settings, and it ran around 500 FPS).
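As an aside on the palettized-texture point: the way OpenGL games of that era decided whether the feature was available was by looking for GL_EXT_paletted_texture in the driver's extension string - which is exactly where R300's drivers stopped advertising it. A minimal sketch of that kind of check (the helper name and sample strings are mine; a real program would pass in the result of glGetString(GL_EXTENSIONS)):

```c
#include <stdbool.h>
#include <string.h>

/* Check whether a space-separated GL extension string contains the given
 * extension name as a whole token (a plain strstr() can false-positive on
 * names that are prefixes of other extensions). */
static bool has_gl_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    while ((p = strstr(p, name)) != NULL) {
        /* token must be bounded by string start/space and space/string end */
        if ((p == extensions || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return true;
        p += len;
    }
    return false;
}
```

On a GeForce FX this check would come back true for "GL_EXT_paletted_texture"; on an R300 with Catalyst drivers it wouldn't, and a game that assumed the extension anyway (rather than falling back to expanding the palettes itself) was where the trouble started.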
swaaye wrote:
Far Cry has lots of tech in it and was patched a bunch of times too. I think it has NV30 and R4x0 optimizations. SM3.0 HDR was added. But with NV3x I think it mostly runs PS1.4. The console log reads it out.
There is an SM2.0b (R4xx) path that implements HDR and some other "SM3.0 features" for the X800/X850. I know there's a review floating around out there with image quality comparisons between the X850 and the 6800/7800 on the 2.0b and 3.0 paths - from what I recall the differences, at least to the naked eye, are very minor. I'm not sure about a DX9.0a path (that'd be NV3x's "beyond DX9" features plus optimizations). AFAIK Far Cry is one of the few games that actually bothered to implement a working 2.0b HDR path (working as in, AFAIK, it performs pretty well, at least on the X800/850 - I'm not sure whether it can be forced to work on something like a 9800XT (other X800 features have been, at least), or how it'd run on something like the X600). And while all of that was neat, the constant non-cumulative patching for Far Cry was decidedly not neat... 🤣 😵