VOGONS


First post, by Private_Ops

Rank: Member

Back in the day (for me anyway), I had an Opteron 144, 1GB of RAM, Audigy 2ZS, and an AGP 6600GT. I played games such as COD2, Doom 3, and CS:S (and of course older games).

I've put together an "era" rig from around that time (Athlon 3500+, 1GB RAM, Audigy 2 (ZS? I'd have to double check)). Currently on an SiS 761 board (apparently one of just a few models produced), which gives me PCI-e.

I could go with another 6600GT, but I want ATI. I was thinking of an X850 XT PE, but then I saw the 2600XT (I like the idea of no 6-pin power plug). How do the different 2600XT models (DDR3, GDDR3, GDDR4) compare to the 6600GT, and also the 8600GT (I had one of these at a later date)?

Reply 1 of 18, by agent_x007

Rank: Oldbie

8600 GT (GDDR3) > HD 2600 XT (GDDR4/GDDR3) >>> 6600 GT
The Radeon may be better in Win XP/DX9 with some titles, but in general the 8600 GT is better.

DDR3 = GDDR3 (if the frequency is the same).
GDDR4 is GDDR3 with a few tweaks, but in general it's not that much different (which is why the GDDR4 standard wasn't widely adopted).
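To put the memory types in perspective, here's a minimal back-of-the-envelope sketch (my own illustration, not from the thread). It assumes the 2600 XT's usual 128-bit bus and roughly the commonly quoted effective memory clocks, and shows that the GDDR4 card's advantage comes from the higher clock rather than from the memory type itself:

```python
# Peak memory bandwidth = effective data rate x bus width.
# ASSUMPTIONS (not from the thread): 128-bit bus on the HD 2600 XT,
# ~1400 MT/s for the GDDR3 card and ~2200 MT/s for the GDDR4 card.
def bandwidth_gb_s(effective_mt_s: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s from effective transfer rate (MT/s) and bus width (bits)."""
    return effective_mt_s * bus_width_bits / 8 / 1000

print(f"GDDR3 2600 XT: ~{bandwidth_gb_s(1400, 128):.1f} GB/s")  # ~22.4 GB/s
print(f"GDDR4 2600 XT: ~{bandwidth_gb_s(2200, 128):.1f} GB/s")  # ~35.2 GB/s
```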


Reply 2 of 18, by swaaye

Rank: l33t++

ATI's drivers improved over time. I think the 2600 XT is probably faster than the 8600 cards considering that. Early drivers for their D3D10 cards were really terrible.

When I was playing around with the 2600XT and 2900XT years ago, I found that going beyond Catalyst 10.6 could cause bad stuttering in some games. That was on an nForce4 platform, so it could have been specific to that platform. Who knows. The drivers probably stopped improving long before then anyway.

Reply 4 of 18, by shamino

Rank: l33t

When I compared an HD2600XT vs a 7600GS (AGP models), my impression was that the difference gets dramatic when you move to later shader heavy games. The examples I tried were Fallout 3 and Skyrim. With older games the difference was less. As for the games you mentioned, I have no idea.
For some reason, the 7600GS was significantly faster in Flatout. I think that's an aberration.
I think the design of the HD2600XT was heavily biased in favor of shader performance, which makes sense for the time when it came out.

Something I like about the HD2600XT is that it supports H.264 video acceleration that actually works, but only if you play the videos in a supporting player like MPC-BE. It won't help in a web browser.

Reply 5 of 18, by Private_Ops

Rank: Member
shamino wrote:

When I compared an HD2600XT vs a 7600GS (AGP models), my impression was that the difference gets dramatic when you move to later shader heavy games. The examples I tried were Fallout 3 and Skyrim. With older games the difference was less. As for the games you mentioned, I have no idea.
For some reason, the 7600GS was significantly faster in Flatout. I think that's an aberration.
I think the design of the HD2600XT was heavily biased in favor of shader performance, which makes sense for the time when it came out.

Something I like about the HD2600XT is that it supports H.264 video acceleration that actually works, but only if you play the videos in a supporting player like MPC-BE. It won't help in a web browser.

I didn't realize a 2600XT was new enough to run Fallout 3 (I'll admit, I first played it on console). Think a 256MB or 512MB model would be better? I can get a 512MB one for a pretty good price, but I'm not sure what speed the VRAM is... I may grab one anyway just to see (and it's nice to have a somewhat modern backup card).

Reply 7 of 18, by dexvx

Rank: Oldbie

The Opteron 144 was released in late 2005. So an 'in-era' GPU would be a Radeon X1800XT/X1900 XT. They can be had for quite cheap, still. R600 (HD 2xxx series) was heavily delayed, not shipping until mid-late 2007 and was basically the FX5800 for ATI.

Reply 8 of 18, by Private_Ops

Rank: Member
dexvx wrote:

The Opteron 144 was released in late 2005. So an 'in-era' GPU would be a Radeon X1800XT/X1900 XT. They can be had for quite cheap, still. R600 (HD 2xxx series) was heavily delayed, not shipping until mid-late 2007 and was basically the FX5800 for ATI.

I say "era" but, I used that opty for a long time. I don't believe I replaced it untill 2008 or so. My interpretation of "era" is rather broad.

Reply 9 of 18, by shamino

Rank: l33t
Private_Ops wrote:

I didn't realize a 2600XT was new enough to run Fallout 3 (I'll admit, I first played it on console). Think a 256MB or 512MB model would be better? I can get a 512MB one for a pretty good price, but I'm not sure what speed the VRAM is... I may grab one anyway just to see (and it's nice to have a somewhat modern backup card).

It's been a long time since I tried it, but I remember thinking the performance was decent and playable on medium settings. However, with a motherboard that has a PCI Express slot, you don't really have to stop at the 2600XT. I was playing around with that card because it's the most powerful AGP card that I have, and I was in the mood to push a P4 AGP box.
I have some fondness for testing newer games on older hardware, just to see how well it can work.
A problem with Fallout 3 is that when using marginal hardware, the "best" detail settings are a moving target. If you're indoors, the details can be set higher. Outdoors, they need to be lower to get the same frame rate. If you get into combat, suddenly you need a lot more smoothness and so the details should be even lower. That game really could have used some features to automatically adjust details based on need.

Definitely 512MB for a 2600XT. If you end up pushing the card then you'll want the extra RAM. Fallout 3 was an example - when I tested that game with a 256MB 7600GS, it seemed pretty clear that the game was struggling to manage the available RAM on the card. I was running in a loop in the same area, timing the framerate. Even though I was covering the same area over and over, the game would keep pausing for a few seconds at the same spot. After this happened a few times, either the game would crash, or if I got lucky it would get itself in a happy place where the pausing stopped and it was stable forever.
When I tested with the 512MB 2600XT, those pauses no longer happened and the game was more stable. So I strongly suspect the RAM size on the card was the difference there.

Reply 11 of 18, by agent_x007

Rank: Oldbie
candle_86 wrote:

It's a little slower than a 7600GT, so between an 8600GS and an 8600GT.

Based on AGP tests I did:
In fillrate-limited scenarios, yes, the HD 2600 XT is a bit slower than the 7600 GT (Quake 3 Arena), and I'm talking about the 512MB/GDDR3 version of the Radeon here.

With shader-based games... it depends on the game engine.
It can crush a 7900 GS in Crysis while being slower than a 7600 GT in Doom 3.
Tested with a PDC @ 4GHz / C2E 9770 @ 3.8GHz+.


Reply 13 of 18, by kanecvr

Rank: Oldbie

Dunno about the HD2600XT, but the 2900XT with the latest drivers is on par with an 8800GTX in STALKER and slower in DOOM3. Using older, period-correct drivers, the 2900XT is slower than an 8800GTS in both STALKER and DOOM3. The 2900XT is also remarkably usable in modern(ish) games, while the 8800 series struggles to keep up. In older games the 8800 destroys the 2900.

Reply 14 of 18, by matze79

Rank: l33t

Isn't the HD3650 simply a refresh of the HD2600?

https://www.retrokits.de - blog, retro projects, hdd clicker, diy soundcards etc
https://www.retroianer.de - german retro computer board

Reply 15 of 18, by swaaye

Rank: l33t++

Yes, but they never clocked the 3650 to the same level as the 2600XT for some reason.

I'd like to see some comparative tests of the 8800GTX versus the 2900XT in these modern-ish games with newer drivers. The 2900XT might have some amount of shader throughput advantage, but it definitely has a fillrate deficit. The 8800GTX has a lot more fillrate, especially Z fill (great for stencil shadow games). Radeons only get more Z-fill friendly with the 4800 series and later. It could be that the 8800's 768MB of RAM isn't enough in newer games and you get some speed-killing swapping, but then a 512MB 2900XT should be even worse off. Gotta get the 1GB 2900XT 😀.
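As a rough sanity check on the fillrate point, here's a minimal sketch (my own numbers, not from the thread) using the commonly cited ROP counts and stock clocks. Note that this only covers colour fill; the gap in Z-only fill is reportedly much larger, since G80 can process many more Z samples per clock:

```python
# Rough peak colour-fillrate comparison (one pixel per ROP per clock).
# ASSUMPTIONS (not from the thread): commonly cited specs -
# 24 ROPs @ 575 MHz for the 8800 GTX (G80), 16 ROPs @ 742 MHz for the HD 2900 XT (R600).
cards = {
    "8800 GTX (G80)":    {"rops": 24, "core_mhz": 575},
    "HD 2900 XT (R600)": {"rops": 16, "core_mhz": 742},
}

for name, c in cards.items():
    gpixels = c["rops"] * c["core_mhz"] / 1000  # Gpixels/s
    print(f"{name}: ~{gpixels:.1f} Gpixels/s peak colour fill")
```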

Reply 16 of 18, by havli

Rank: Oldbie

I seriously doubt R600 can significantly beat G80... anywhere at all. This test might be a little outdated (and I plan to redo it in the future), but the performance figures are not far from the truth. OC variants of the 8800 Ultra might even get very close to the HD 4870 in some games. http://hw-museum.cz/article/3/benchmark-vga-2 … 2012-edition-/1

Let's see:

HD 4890 OC @ 950/4500, 1GB
HD 2900 XT @ 742/1660, 512MB
HD 2600 XT @ 800/2200, 256MB GDDR4
X1900 XTX @ 650/1550, 512MB

GTX 285 @ 650/2500, 1GB
8800 GTX @ 575/1800, 768MB
8600 GTS @ 675/2000, 256MB
7950 GT OC @ 600/1500, 512MB
6800 Ultra @ 400/1100, 256MB
-----------------------

Doom 3 - ultra, 1600x1200, noAA, 16xAF / 4xAA, 16xAF

HD 4890 OC = 308 / 255 fps
HD 2900 XT = 193 / 115 fps
HD 2600 XT = 63 / 36 fps
X1900 XTX = 111 / 75 fps

GTX 285 = 321 / 214 fps
8800 GTX = 216 / 121 fps
8600 GTS = 75 / 43 fps
7950 GT OC = 129 / 69 fps
6800 Ultra = 70 / 72 fps

COD4 - max, 1600x1200, noAA, 16xAF / 4xAA, 16xAF

HD 4890 OC = 164 / 122 fps
HD 2900 XT = 70 / 41 fps
HD 2600 XT = 28 / 16 fps
X1900 XTX = 53 / 38 fps

GTX 285 = 180 / 147 fps
8800 GTX = 97 / 76 fps
8600 GTS = 33 / 24 fps
7950 GT OC = 27 / 22 fps
6800 Ultra = 11 / 8 fps

Far Cry 2 - ultra DX9, 1600x1200, noAA, 16xAF / 4xAA, 16xAF

HD 4890 OC = 53 / 43 fps
HD 2900 XT = 27 / 18 fps
HD 2600 XT = 10 / 6 fps
X1900 XTX = 0 / 0 fps

GTX 285 = 66 / 54 fps
8800 GTX = 43 / 32 fps
8600 GTS = 9 / 5 fps
7950 GT OC = 0 / 0 fps
6800 Ultra = 0 / 0 fps

STALKER: Shadow of Chernobyl - max, 1600x1200, noAA, 16xAF / 4xAA, 16xAF

HD 4890 OC = 92 / 70 fps
HD 2900 XT = 40 / 16 fps
HD 2600 XT = 17 / 0 fps
X1900 XTX = 28 / 0 fps

GTX 285 = 96 / 37 fps
8800 GTX = 65 / 23 fps
8600 GTS = 21 / 0 fps
7950 GT OC = 25 / 0 fps
6800 Ultra = 7 / 0 fps

Mafia 2 - max, 1600x1200, noAA, 16xAF / 4xAA, 16xAF

HD 4890 OC = 62 / 35 fps
HD 2900 XT = 31 / 18 fps
HD 2600 XT = 12 / 0 fps
X1900 XTX = 0 / 0 fps

GTX 285 = 68 / 46 fps
8800 GTX = 36 / 24 fps
8600 GTS = 13 / 0 fps
7950 GT OC = 0 / 0 fps
6800 Ultra = 0 / 0 fps

Crysis 2 - extreme DX9, 1600x1200, noAA, 16xAF

HD 4890 OC = 40 fps
HD 2900 XT = 17 fps
HD 2600 XT = 0 fps
X1900 XTX = 0 fps

GTX 285 = 48 fps
8800 GTX = 25 fps
8600 GTS = 0 fps
7950 GT OC = 0 fps
6800 Ultra = 0 fps
---------------------------

I think Mafia 2, Crysis 2, and Far Cry 2 are all "modern enough"... and still the 2900XT has no chance of even catching the 8800 GTX, let alone "destroying" it. So there goes that one 🤣 Later (in the quite distant future) I'll do some tests using Windows 7 and DX10 games... but still, I see no chance of the 2900 XT winning.

HW museum.cz - my collection of PC hardware

Reply 17 of 18, by kanecvr

Rank: Oldbie
havli wrote:

Doom 3 - ultra, 1600x1200, noAA, 16xAF / 4xAA, 16xAF

HD 2900 XT = 193 / 115 fps
8800 GTX = 216 / 121 fps

^^These are consistent with my test results

havli wrote:

STALKER: Shadow of Chernobyl - max, 1600x1200, noAA, 16xAF / 4xAA, 16xAF

HD 2900 XT = 40 / 16 fps
8800 GTX = 65 / 23 fps

These are not. Here are my results:
STALKER @ 1920x1080 / Ultra / 16xAF / noAA

8800GTX (EVGA) 576 core / 1800MHz VRAM - 768MB = ~57 FPS
8800GTX@Ultra 612 core / 2150MHz VRAM - 768MB (PNY, overclocked to Ultra clocks) = ~61 FPS

2900XT (Sapphire) 742 core / 1656MHz VRAM - 512MB GDDR3 = ~52 FPS
2900XT (HIS) 743 core / 2042MHz VRAM - 1024MB GDDR4 = 58 FPS

STALKER 1600x1200 Ultra 16xAF noAA

8800GTX (EVGA) 576 core / 1800MHz VRAM - 768MB = 68 FPS
8800GTX@Ultra 612 core / 2150MHz VRAM - 768MB (PNY, overclocked to Ultra clocks) = ~72 FPS

2900XT (Sapphire) 742 core / 1656MHz VRAM - 512MB GDDR3 = 62 FPS
2900XT (HIS) 743 core / 2042MHz VRAM - 1024MB GDDR4 = 70 FPS

I used this: http://www.guru3d.com/files-details/s-t-a-l-k … -benchmark.html for benchmarking.

The cards are so close that re-running the benchmarks can sometimes put the 2900XT on top and the GTX below it, but three times out of five the GTX comes out ahead.

Test setup:

Intel Q6600 OC to 3GHz
4GB Corsair XMS3 1333MHz CL9
Asus P5K64-WS
WinXP SP3

Maybe the AA is killing the 2900XT's performance? With some earlier driver versions AA had a pronounced performance impact.

I'll dig for the screenshots and attach them here.

Reply 18 of 18, by swaaye

Rank: l33t++

I wouldn't suggest benching anything with forced AA. You don't know what the driver is doing. STALKER SoC for example uses deferred shading, which is incompatible with D3D9 MSAA. That's why the game has that lame old shader-based edge AA that barely does anything. 😀 Like Dead Space and other deferred rendering games do too. So who knows what the driver is doing to bring you forced "MSAA". It might be supersampling something. It might be breaking the game's visuals in some manner too. And it may not be the same across both AMD and NV.

STALKER Clear Sky and Call of Pripyat on the other hand implement DirectX 10/10.1/11 MSAA.
https://www.bit-tech.net/bits/interviews/2010 … ment-and-dx11/3