I will compare my new DirectX 9.0 box to the following article:
Tom's Hardware 3DMark06 article 'Best Of The Best: High-End Graphics Card Roundup', May 2009
http://www.tomshardware.com/reviews/radeon-ge … up,2297-16.html
Tom's stock Nvidia GTX 285
I did not see which driver version they were using.
1280x1024, Default Quality
3DMark06 v1.1.0 SM2.0 Score 8397
3DMark06 v1.1.0 HDR/SM3.0 Score 8964
3DMark06 v1.1.0 3DMark Score 20638
CPU Score between 6317 and 6680; they were using an i7-920 overclocked to 3.8-GHz.
My System stock Nvidia GTX 285
GeForce 196.21 drivers.
1280x1024, Default Quality
3DMark06 v1.2.0 SM2.0 Score 7068
3DMark06 v1.2.0 HDR/SM3.0 Score 7988
3DMark06 v1.2.0 3DMark Score 15014
CPU Score 2799, Celeron G1610 @ 2.6-GHz (stock speed)
Thoughts
Tom used 3DMark06 v1.1.0 and I used v1.2.0 from 3DMark's website; hopefully the two versions produce almost identical scores.
You can clearly see all I am missing is some CPU power to push this thing to the max.
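Just to put numbers on that, here is a quick back-of-the-envelope comparison in Python using the scores above (Tom's CPU score taken at the low end of his 6317-6680 range):

```python
# Percentage gap between Tom's scores and mine, from the numbers posted above.
toms = {"SM2.0": 8397, "HDR/SM3.0": 8964, "3DMark": 20638, "CPU": 6317}
mine = {"SM2.0": 7068, "HDR/SM3.0": 7988, "3DMark": 15014, "CPU": 2799}

for test in toms:
    gap = (toms[test] - mine[test]) / mine[test] * 100
    print(f"{test}: Tom's score is {gap:.1f}% higher")
```

The GPU-bound subscores (SM2.0, HDR/SM3.0) are only ~12-19% apart, but the CPU score gap is over 125%, and since 3DMark06 folds the CPU score into the overall number, that alone explains most of the ~37% gap in the total.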
Going from a Celeron G1610 @ 2.6-GHz to an i5-3570K @ 3.4-GHz will give me an instant ~30% increase in CPU clock speed, and that doesn't even take into account the extra cache, the 3.8-GHz Turbo, or the two extra cores. The i5-3570K seems like the one to get if I am going to spend any money on a different CPU: it smokes the other LGA 1155 CPUs and doesn't cost much more on the used market. It also seems there are some games that can make use of a quad core, so if I ever decide to do something else with this box I will not be looking to upgrade the CPU yet again.
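For anyone curious where the ~30% figure comes from, it's just the base-clock ratio:

```python
# Rough clock-speed math only; this ignores the IPC gains, bigger cache,
# 3.8-GHz Turbo, and two extra cores, all of which favor the i5 further.
celeron_ghz = 2.6  # Celeron G1610 base clock
i5_ghz = 3.4       # i5-3570K base clock

increase = (i5_ghz - celeron_ghz) / celeron_ghz * 100
print(f"Base clock increase: {increase:.0f}%")  # prints "Base clock increase: 31%"
```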
I am going to install Nvidia GeForce 296.10 drivers, reboot, run another test and post those scores as well just to see if there is a difference.
Gateway 2000 Case and 200-Watt PSU
Intel SE440BX-2 Motherboard
Intel Pentium III 450 CPU
Micron 384MB SDRAM (3x128)
Compaq Voodoo3 3500 TV Graphics Card
Turtle Beach Santa Cruz Sound Card
Western Digital 7200-RPM, 8MB-Cache, 160GB Hard Drive
Windows 98 SE