VOGONS


Best period-correct DX9 VGA for a 940 system


Reply 20 of 27, by Mamba

User metadata
Rank Oldbie
Ozzuneoj wrote:

Personally, if I were to build a system like this, it would be meant specifically to highlight the end of the DX9 era and the final days of AMD's supremacy over Intel (in other words, pre-Core 2 Duo).

In this case, I would stick with Pre-G80 graphics and the best CPU the board could handle (probably an Opteron 285 or 290?).

If you wanted something extremely weird and powerful, but not necessarily the "best" cards from that era, you could get a pair of GeForce 7950GX2s (meaning 4-way SLI with two cards). Technically, the 8800GTX was the best card from 2006, but it's a DirectX 10 card and signified the start of the next era of GPUs, whereas the 7900GTX and 7950GX2 signified the end of the DX9 era. The 79xx series were fantastic cards; they were just kind of lost to time since the 8800GTX came out later the same year (I actually had a 7900GTX, then a 7950GX2, THEN an 8800GTX, all in 2006... lots of reselling). The 7950GX2 was extremely potent in games that could use it, but SLI was pretty iffy in those days... IMO, that makes it perfect for a "period" build, since the point is obviously not to play the games at the highest settings possible, or you'd just play them on a newer system.

When I build systems, they have a very specific era in mind. You could build a really, really uninteresting "2007" system that is 100% period correct but has no special significance, or you could build one that contains the best AMD had to offer before Intel reclaimed the crown for the next 10 years and has the most powerful pre-DX10 GPUs available.

That was exactly what I had in mind when I found the Foxconn 940 motherboard.
And the 7950GX2 was definitely an option, but it draws a lot of power and runs really hot and loud.
So I'll stick with G92, which can be silenced easily.

I am having trouble finding a cheap GTS 250, while I can get two GTS 450s for €50 for the pair... 😵

Reply 21 of 27, by swaaye

User metadata
Rank l33t++

ATI has noticeably better texture filtering and anti-aliasing when we're talking pre-G80 NVIDIA. I'd look at the X1950XTX in that case. Technically, R580 is also a far superior Shader Model 3 chip compared to G7x. It makes a difference in some more advanced games.

I'm not sure what to expect out of CrossFire though. I've never used it or SLI, but CrossFire probably has more problems. The R4xx and R5xx boards actually use FPGA hardware and external cabling for CrossFire.

Reply 22 of 27, by PhilsComputerLab

User metadata
Rank l33t++

I remember seeing some benchmarks of newer games run on the GF7 and ATI X1900 series. The ATI card did much better with the newer games. I found that very interesting. I also remember reading about the higher IQ (texture filtering).


Reply 23 of 27, by Unknown_K

User metadata
Rank Oldbie

If you are going to bother with Socket 940 and SLI, then get a dual-CPU motherboard.

Collector of old computers, hardware, and software

Reply 24 of 27, by candle_86

User metadata
Rank l33t
PhilsComputerLab wrote:

I remember seeing some benchmarks of newer games run on the GF7 and ATI X1900 series. The ATI card did much better with the newer games. I found that very interesting. I also remember reading about the higher IQ (texture filtering).

Well, yes, the R580 would be faster in newer titles: 48 pixel shaders vs. 24. See the rough numbers below.
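
To put a rough number on that (a back-of-envelope Python sketch, not benchmark data; counting one op per shader unit per clock is a big simplification of the actual ALUs, so read the result as a ratio only; 650 MHz is the stock core clock of both the X1900XTX and the 7900GTX):

```python
# Rough theoretical pixel shader throughput: units x clock.
# "One op per unit per clock" is a simplification -- real ALUs differ,
# so only the ratio between the two chips is meaningful here.
def shader_rate(units, clock_mhz):
    return units * clock_mhz * 1_000_000

r580 = shader_rate(48, 650)  # X1900XTX: 48 pixel shader units @ 650 MHz
g71 = shader_rate(24, 650)   # 7900GTX:  24 pixel pipelines    @ 650 MHz

print(f"R580 vs G71 theoretical shader ratio: {r580 / g71:.1f}x")  # 2.0x
```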

Reply 25 of 27, by swaaye

User metadata
Rank l33t++

Actually NV40/G7x's problem isn't raw throughput. It has abysmal performance with more advanced features of SM3. Dynamic branching for example. The architecture is really designed for SM2 and basic early testbed use of SM3. If you want to run SM3 fast on NVIDIA, go G80 or newer.
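
To illustrate what poor dynamic branching costs (a minimal sketch matching the usual description of SM2-era hardware, with hypothetical stand-in functions rather than real shader code): without efficient per-pixel flow control, a branch is effectively "flattened", so both sides get evaluated for every pixel and one result is selected afterwards.

```python
# Sketch of branch "flattening": roughly what SM2-era hardware does with
# a per-pixel if/else. Both paths run for every pixel, then one result is
# selected, so the branch saves no work at all.
def cheap_path(px):
    return px * 0.5  # stand-in for a cheap shading path

def expensive_path(px):
    # Stand-in for a heavy path (e.g. many texture taps / math ops).
    result = px
    for _ in range(64):
        result = result * 0.99 + 0.01
    return result

def flattened_branch(px, in_shadow):
    a = expensive_path(px)  # always paid for
    b = cheap_path(px)      # always paid for
    return a if in_shadow else b

def dynamic_branch(px, in_shadow):
    # What SM3 promises, and what G80-class hardware finally runs fast:
    # only the taken path costs anything (for coherent groups of pixels).
    return expensive_path(px) if in_shadow else cheap_path(px)
```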

I'm not sure this really impacts the machine being built here though. The main issues I have with G7x are its texture filtering and anti-aliasing. In some games you will see shimmering / aliasing in the texture filtering. NVIDIA "optimized" pretty aggressively for more speed. The anti-aliasing is quite inferior because it's still not gamma corrected. ATI started doing that with R300. G80 overhauled all of this too.
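
For anyone wondering what "gamma corrected" AA means here, a minimal sketch of the resolve step (using the simple 2.2-power approximation of the sRGB curve rather than the exact piecewise function): averaging the stored sRGB sample values directly biases edge pixels dark, while a gamma-correct resolve converts to linear light first.

```python
# AA resolve: averaging samples in gamma (sRGB) space vs. linear space.
GAMMA = 2.2  # simple approximation of the sRGB transfer curve

def srgb_to_linear(v):
    return v ** GAMMA

def linear_to_srgb(v):
    return v ** (1.0 / GAMMA)

# A 4x AA edge pixel: half black samples, half white samples.
samples = [0.0, 0.0, 1.0, 1.0]

naive = sum(samples) / len(samples)  # average in sRGB space: 0.50
correct = linear_to_srgb(
    sum(srgb_to_linear(s) for s in samples) / len(samples)
)  # resolve in linear light: ~0.73, visibly brighter at the edge

print(f"gamma-space: {naive:.2f}, linear-space: {correct:.2f}")
```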

Reply 26 of 27, by SPBHM

User metadata
Rank Oldbie
Mamba wrote:

And still I am not convinced that two GTS 450s are better than two G92s in DX9 games.

A few DX9 games versus the 9800 GTX and GTS 250:

http://images.anandtech.com/graphs/gts450_091 … 05112/24688.png

http://tpucdn.com/reviews/Zotac/GeForce_GTS_4 … 4_1680_1050.gif

http://tpucdn.com/reviews/Zotac/GeForce_GTS_4 … a_1680_1050.gif

http://tpucdn.com/reviews/Zotac/GeForce_GTS_4 … 3_1680_1050.gif

Very close, but a little bit faster. Even if that's too limited to conclude the GTS 450 is faster in DX9, I think it's enough to conclude there is no massive difference. When you try more demanding DX10+ games, the GTS 450 shows a more significant advantage most of the time. If you look at the specs, G92 might look stronger on some points, but the 450 is clearly the winner when it comes to shaders, and with a few years of architecture improvements it seems to get the job done against G92.

swaaye wrote:

Actually NV40/G7x's problem isn't raw throughput. It has abysmal performance with more advanced features of SM3. Dynamic branching for example. The architecture is really designed for SM2 and basic early testbed use of SM3. If you want to run SM3 fast on NVIDIA, go G80 or newer.

I'm not sure this really impacts the machine being built here though. The main issues I have with G7x are its texture filtering and anti-aliasing. In some games you will see shimmering / aliasing in the texture filtering. NVIDIA "optimized" pretty aggressively for more speed. The anti-aliasing is quite inferior because it's still not gamma corrected. ATI started doing that with R300. G80 overhauled all of this too.

Yes, the 7 series had more performance difficulties with later SM3 games compared to the X1k (especially X19xx) series, and it also lacked support for AA in some games with HDR enabled, while the Radeons had it (I remember Oblivion making the problem obvious)...

Reply 27 of 27, by Mamba

User metadata
Rank Oldbie

It will be an XP64 system, so DX10 is a no-go.
So I will search for two G92 cards.