VOGONS


First post, by Retroinside

User metadata
Rank Newbie

I’ve been experimenting a bit with my MSI P35 Platinum and a Q6600, and I wanted to share a quick comparison between two cards from roughly the same era:

  • ATI Radeon HD 2900 Pro 1024MB GDDR4 🟥
  • NVIDIA GeForce 8800 GTS 512MB (G92) 🟩

As far as I remember, the HD 2900 Pro shipped with lower clocks to reduce power consumption and temperatures, and to somehow “fix” what was widely considered the half-failure of the 2900 XT.

On the other hand, the 8800 GTS 512MB introduced the G92 GPU (a die-shrink of the G80), which combined some features of the older GTS and GTX, but with improved efficiency and higher clocks thanks to the smaller process.

In my tests, I pushed the Q6600 B3 to 3.6 GHz using a 1600 MHz FSB, which isn’t bad for air cooling, paired with 2GB of GeIL DDR2-800 CL4-4-4-12.
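For anyone checking the math, here’s a quick Python sketch of how those numbers line up. The 9× multiplier is the Q6600’s locked stock value; the 1:1 FSB:DRAM divider is my assumption, since I didn’t note which divider the board used:

    # FSB math for the overclock described above.
    # The "1600 MHz FSB" is the quad-pumped figure, so the underlying
    # base clock is 1600 / 4 = 400 MHz.
    fsb_quad_pumped = 1600            # MT/s, as reported by the BIOS
    base_clock = fsb_quad_pumped / 4  # the AGTL+ bus is quad-pumped
    multiplier = 9                    # locked upper multiplier on the Q6600

    print(f"CPU clock: {base_clock * multiplier:.0f} MHz")  # 3600 MHz

    # At a 1:1 divider, DDR2 transfers twice per clock,
    # so the GeIL kit runs right at its rated DDR2-800 speed.
    print(f"DDR2 speed: {base_clock * 2:.0f} MT/s")         # 800 MT/s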

In real-world performance the two cards aren’t worlds apart — the 8800 GTS often held its ground surprisingly well 👍.
Still, the HD 2900 was terrifyingly ahead when it came to memory 💥, not only because of its massive 512-bit bus paired with GDDR4, but also because it shipped with twice the VRAM (1GB vs 512MB).

This was just a short test in 3DMark 2003, with a couple of dusty cards I had lying around 🧹, but I thought it would be fun to share what they could do when pushed to their maximum overclock (at least on my samples).

Both cards easily break 750 MHz on the core, but neither can hold 800 MHz without voltage tweaks ⚠️.

On the memory side, the HD 2900 simply crushes the GTS thanks to its 512-bit memory bus and extra VRAM, effectively doubling the available bandwidth 🚀.
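To put rough numbers on that claim, here’s a minimal bandwidth sketch. The effective memory clocks (~2000 MT/s GDDR4 on the HD 2900, ~1940 MT/s GDDR3 on the GTS) are reference figures from memory, so treat them as assumptions:

    # Theoretical bandwidth = bus width (bytes) x effective data rate.
    def bandwidth_gbs(bus_bits: int, mem_mts: int) -> float:
        return bus_bits / 8 * mem_mts / 1000  # GB/s

    print(f"HD 2900 1GB:  {bandwidth_gbs(512, 2000):.0f} GB/s")  # ~128 GB/s
    print(f"8800 GTS 512: {bandwidth_gbs(256, 1940):.0f} GB/s")  # ~62 GB/s

So on paper the HD 2900 really does have roughly twice the bandwidth.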
Screenshots are attached — enjoy! 📸

Of course, this is not meant to be a full or exhaustive benchmark session — rather, just a quick test with the hardware I had on hand 🖥️.

Reply 1 of 7, by agent_x007

User metadata
Rank Oldbie

They very much are worlds apart, just enable Anti-Aliasing ;D
In actual games the differences WILL be more pronounced than in 3DMark (Far Cry, Crysis, FEAR, etc.).

From the errors I see:

On the other hand, the 8800 GTS 512MB introduced the G92 GPU (a die-shrink of the G80), which combined some features of the older GTS and GTX...

1) The 8800 GT was the first G92 card; the 8800 GTS 512MB was the first card to use the full chip (no cut SPs).
The "previous GTS" card was a cut-down GTX.

Still, the HD 2900 was terrifyingly ahead when it came to memory 💥, not only because of its massive 512-bit bus paired with GDDR4, but also because it shipped with twice the VRAM (1GB vs 512MB).

2) The HD 2900 Pro uses a 256-bit bus, not 512-bit. I also don't remember it using GDDR4 memory?
But maybe that's down to me not paying attention at the time.
The 512-bit bus can be seen only on the HD 2900 XT model (or FireGL versions of it).

Last edited by agent_x007 on 2025-08-16, 15:33. Edited 2 times in total.

Reply 2 of 7, by The Serpent Rider

User metadata
Rank l33t++

2) The HD 2900 Pro uses a 256-bit bus, not 512-bit. I also don't remember it using GDDR4 memory?

All 1GB GDDR4 2900 Pros are just rebadged 2900 XTs with a lowered chip clock. Same 512-bit PCB and all. Obviously the same ridiculous power draw as the XT cards, compared to the cost-reduced GDDR3 Pro versions, which had a single 8-pin connector.
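For context, the connector remark maps onto the PCIe power spec (slot 75 W, 6-pin 75 W, 8-pin 150 W). A quick sketch; the XT's 6-pin + 8-pin layout is from memory, so treat it as an assumption:

    # PCIe power budget per the spec limits.
    SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150  # watts

    # 2900 XT (and the rebadged GDDR4 Pro): 6-pin + 8-pin (my assumption)
    print(f"XT-style budget:  {SLOT + SIX_PIN + EIGHT_PIN} W")  # 300 W
    # Cost-reduced GDDR3 Pro: single 8-pin
    print(f"GDDR3 Pro budget: {SLOT + EIGHT_PIN} W")            # 225 W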

Last edited by The Serpent Rider on 2025-08-16, 15:38. Edited 1 time in total.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 3 of 7, by agent_x007

User metadata
Rank Oldbie

I guess I really didn't pay attention 😁
He is comparing a slightly downclocked HD 2900 XT to the 8800 GTS 512MB then... thank you @The Serpent Rider.
Now those numbers make a bit more sense to me.

Reply 4 of 7, by The Serpent Rider

User metadata
Rank l33t++

It doesn't matter either way. The 512-bit Radeon 2900 practically matches the 3850 in clock-for-clock scenarios, so all that bandwidth is wasted for nothing.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 5 of 7, by Ash515253

User metadata
Rank Newbie

Something the Nvidia card wins hands down in 2025 is availability; I haven't seen any HD 2900s for sale in a long time 😒

my website: https://ashsthingsandstuff.co.uk/

Reply 6 of 7, by Archer57

User metadata
Rank Oldbie
Ash515253 wrote on 2025-08-16, 19:06:

Something the Nvidia card wins hands down in 2025 is availability; I haven't seen any HD 2900s for sale in a long time 😒

Probably the result of poor price/performance/heat and not many people buying them back then, in combination with very hot cards cooking themselves to death...

AthlonXP 2200+,ECS K7VTA3 V8.0,1GB,GF FX5900XT 128MB,Audigy 2 ZS
AthlonXP 3200+,Epox EP-8RDA3I,2GB,GF 7600GT 256MB,Audigy 4
Athlon64 x2 4800+,Asus A8N32-SLI Deluxe,4GB,GF 8800GT 1GB,Audigy 4
Core2Duo E8600,ECS G31T-M3,4GB,GF GTX660 2GB,Realtek ALC662

Reply 7 of 7, by The Serpent Rider

User metadata
Rank l33t++

The 2900 XT was released in May 2007 and was ditched after six months in favor of the 3870 die-shrink. Still, it's not really that hard to find some sort of cost-reduced 2900 variant in working condition. AMD had to get rid of the chip stock somehow.

I must be some kind of standard: the anonymous gangbanger of the 21st century.