Pippy P. Poopypants wrote:
Tetrium wrote:
Does that include the 5600's?
Wouldn't they be roughly equal to the midrange GF4's?
If this were back in 2003/04, I'd take a GF4 (Ti) over those in a heartbeat. They may have been competitive with the GF4 in DX8/OpenGL games, but for the FX line's original intended purpose (i.e. DX9 games), they sucked.
One thing to mention though: ATI had to trim their Radeon 9600 down to a 64-bit bus (Radeon 9600SE) just to get performance equivalent to a 128-bit FX 5200 🤣
Now, the bold part is what really matters to me.
Personally I don't really look at what it was intended to be, but more at "what can it do right now?". If it's just as good as the GF4, then to me they're equal (when it comes to their practical use when I put one in a rig of mine).
Though, IIRC I don't own a single FX card at this time...at least I think I don't 😜
Edit: I agree about the FX 5200's though, they seem to have very slim practical use these days.
What's special about the 5200's is that they sucked then, and still suck today!
So different from those S3 ViRGE graphics decelerators back in the day...they used to suck back then, but these days they're more useful because of their good compatibility with older rigs (think 486).
Back in the day, the Voodoo 5 was reviewed as too little, too late; the GF2's were simply the better deal...but look at it from today's point of view, and it's clear which of the two is getting more attention!
Even the GF4 MX's (which are really scaled-down GF2's) are of more use, because of better backward compatibility compared to the 5200's (correct me if I'm wrong).
And some cards used to be great when they were new, and are still great today.