Reply 60 of 70, by mockingbird
The numbers speak for themselves.
Core2 E5800 @ 3.2GHz
2GB DDR @ 400MHz, dual channel
Driver 45.23 for nVidia
Driver 10.2 for ATI
-Quake 3 version 1.32c
-All settings maxed out
-For 16-bit tests, color depth and texture quality were changed to 16-bit
-Each test was run 3 times
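For anyone wanting to repeat the 16-bit runs, the settings can be applied from the Quake 3 console. A sketch only -- the cvar names are the standard id Tech 3 ones, and I'm assuming the stock "four.dm_68" timedemo that ships with the 1.32 point release:

```
// drop color, texture, and depth buffers to 16-bit
seta r_colorbits "16"
seta r_texturebits "16"
seta r_depthbits "16"
// restart the renderer so the changes take effect
vid_restart
// enable timedemo mode and play back the stock demo
timedemo 1
demo four
```

The average FPS is printed to the console when the demo finishes; repeat the `demo four` line for each of the three runs.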
Radeon 9700 (Dell OEM)
GeForce FX5800 (Quadro 1000 modified with Rivatuner bootstrap method and clocked to 400/800 with Coolbits)
Scale these numbers down to the early 2000s, with sub-gigahertz machines -- why the heck should someone have paid the premium for the ATI product just for DX9 support? The whole selling point was HL2, whose source code was stolen and leaked, and which shipped way past its announced launch date anyhow. You could get a LOT more longevity out of your system with an FX card.
The Radeon 9700 has a weak VRM compared to the 5800. The 5800 has good-quality, expensive polymer caps that are still fine today. The 9700 had Nichicon HC-series caps (decent, but not adequate for a modern VGA VRM), which I had already replaced years ago before storing the card.
Even if the 9700 were on par performance-wise with the 5800, look at the massive performance gain achieved by dropping to 16-bit. And the 9700 had weird graphical glitches in Quake 3 at 16-bit, though the benchmark did complete without issue.
Now one of you mavens please explain to me again how the FX generation of cards was a disappointment, and how great ATI was with their "revolutionary" 9700 series.