Reply 60 of 134, by mockingbird
RandomStranger wrote on 2021-09-03, 05:54:
@mockingbird
https://youtu.be/uq-v1TTUyhM
@o'Doyle
https://www.youtube.com/watch?v=XVO3NJCPIoY
The numbers speak for themselves.
System:
Core2 E5800 @ 3.2GHz
2GB DDR @ 400MHz, dual channel
Driver 45.23 for nVidia
Driver 10.2 for ATI
Benchmark:
-Quake 3 version 1.32c
-DEMO001
-1280x1024 resolution
-All settings maxed
-For 16-bit tests, color depth and texture quality were changed to 16-bit
-Each test run 3 times
Radeon 9700 (Dell OEM)
32-bit
170.7
170.8
170.7
16-bit
173.1
173.3
173.3
GeForce FX5800 (Quadro FX 1000 soft-modded with the RivaTuner bootstrap method and clocked to 400/800 with Coolbits)
32-bit
245.4
245.4
245.4
16-bit
290.6
294.1
293.9
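Putting those runs side by side, here's a quick back-of-the-envelope calculation (plain Python; the only inputs are the numbers posted above, it just averages the three runs per card and compares them):

```python
# Average the three timedemo runs posted above and compare the two cards.
r9700_32  = [170.7, 170.8, 170.7]
r9700_16  = [173.1, 173.3, 173.3]
fx5800_32 = [245.4, 245.4, 245.4]
fx5800_16 = [290.6, 294.1, 293.9]

def avg(runs):
    return sum(runs) / len(runs)

def gain(new, base):
    """Percentage gain of 'new' over 'base', using run averages."""
    return (avg(new) / avg(base) - 1) * 100

print(f"9700   32-bit: {avg(r9700_32):6.1f} fps")
print(f"9700   16-bit: {avg(r9700_16):6.1f} fps ({gain(r9700_16, r9700_32):+.1f}% from dropping to 16-bit)")
print(f"FX5800 32-bit: {avg(fx5800_32):6.1f} fps ({gain(fx5800_32, r9700_32):+.1f}% over the 9700)")
print(f"FX5800 16-bit: {avg(fx5800_16):6.1f} fps ({gain(fx5800_16, fx5800_32):+.1f}% from dropping to 16-bit)")
```

That works out to the FX 5800 being roughly 44% ahead of the 9700 at 32-bit, and it picks up another ~19% by dropping to 16-bit, while the 9700 gains barely 1.5%.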
Scale these numbers down to the early 2000s, with sub-gigahertz machines -- why the heck should someone have paid the premium for the ATI product just for DX9 support? The whole selling point was HL2, whose source code was leaked and which shipped well past its original launch date anyway. You could get a LOT more longevity out of your system with an FX card.
The Radeon 9700 has a weak VRM compared to the 5800. The 5800 uses expensive, high-quality polymer caps that are still fine today. The 9700 came with Nichicon HC-series caps (decent, but not adequate for a modern VGA VRM), which I had already replaced years ago before storing the card.
Even if the 9700 were on par performance-wise with the 5800, look at the massive performance gain the FX achieves by going to 16-bit. The 9700 also showed weird graphical glitches in Quake 3 at 16-bit, though the benchmark did complete without issue.
Now one of you mavens please explain to me again how the FX generation of cards was a disappointment, and how great ATI was with their "revolutionary" 9700 series.