VOGONS


HD 2900XT


First post, by swaaye

Rank: l33t++

I bought a 1GB GDDR4-equipped HD 2900XT for $30 the other day. I have played around with it quite a bit on an nForce4 setup. It's an interesting card. The mystery as to why it has 128GB/s memory bandwidth, needs an 8-pin+6-pin power setup, and yet can't match an 8800GTS tickles my brain.

Some things I've noticed-
- Bioshock 2 will not allow the DirectX 10 detail surfaces (on Win7). I guess it's forcing D3D9 mode.
- Oblivion stutters badly with drivers newer than ~10.6 on XP. Could be nForce4 related.
- Finding drivers for the Rage Theater 200 chip is a trick. They call them the ATI WDM Integrated drivers. Some actually cause BSODs. Love that ATI driver department.
- HDMI audio drivers are not always included (Realtek has them for DL though)
- MSAA does hit performance hard as reviews showed.

An Arctic Cooling Accelero S1 + 5V 120mm fan makes this a silent card. The ATI blower sounds like a hair dryer even at the desktop at 2D clocks.

[attached images: m5m0TXBl.jpg, nt3Rdm9l.jpg]

Reply 1 of 40, by F2bnp

Rank: l33t

These were pretty dark times for ATi. The 3850/3870 they released a while later were an improvement, but then the 8800GT happened. They only really managed to get back up when they released the 4850/4870 in the summer of 2008.

Reply 2 of 40, by Putas

Rank: Oldbie
swaaye wrote:

The mystery as to why it has 128GB/s memory bandwidth, needs an 8-pin+6-pin power setup, and yet can't match an 8800GTS tickles my brain.

They should run just as well with 6+6 pin PEG, but I've heard they are picky about PSUs. Just bought a DDR3 one for myself as well, but it does not POST 😢 Watching later reviews, they slowly caught up as time went by.

Reply 3 of 40, by swaaye

Rank: l33t++
Putas wrote:

They should run just as well with 6+6 pin PEG, but I've heard they are picky about PSUs. Just bought a DDR3 one for myself as well, but it does not POST 😢 Watching later reviews, they slowly caught up as time went by.

It does run with dual 6-pin power, yes. I read that you need the 8-pin for it to allow overclocking though.

Reply 4 of 40, by NJRoadfan

Rank: Oldbie

This card has the Rage Theater 200? What does it use it for? I thought only the ATI-made All-in-Wonder cards had the chip (for VIVO). Getting video capture working on the chip in Windows 7 (assuming that is what it's for) is impossible. You have to use XP.

The only luck I had with the T200 is with an AGP AIW 9600XT and XP drivers in 7 32-bit. I couldn't get them working with my PCIe AIW X800. That is, of course, unless ATI changed how the T200 chip interfaces with the rest of the video card. Seeing as this card was released in 2007, I would hope they would have made everything Vista-compatible!

Reply 5 of 40, by swaaye

Rank: l33t++
NJRoadfan wrote:

This card has the Rage Theater 200? What does it use it for? I thought only the ATI-made All-in-Wonder cards had the chip (for VIVO). Getting video capture working on the chip in Windows 7 (assuming that is what it's for) is impossible. You have to use XP.

The only luck I had with the T200 is with an AGP AIW 9600XT and XP drivers in 7 32-bit. I couldn't get them working with my PCIe AIW X800. That is, of course, unless ATI changed how the T200 chip interfaces with the rest of the video card. Seeing as this card was released in 2007, I would hope they would have made everything Vista-compatible!

Yup, it has the Theater 200. It has a mini-DIN breakout port between the DVI ports that apparently supports S-Video, component, and composite output.

I haven't tried to find Win7 drivers. For XP, I have the 10.2 legacy WDM drivers, which don't BSOD. Yeah, one would think it must have supported Vista at some point. I haven't looked into that either, though.

Reply 6 of 40, by RacoonRider

Rank: Oldbie
F2bnp wrote:

These were pretty dark times for ATi. The 3850/3870 they released a while later were an improvement, but then the 8800GT happened. They only really managed to get back up when they released the 4850/4870 in the summer of 2008.

Are you sure about the 8800GT? As far as I remember, the 8800 appeared when ATi's current model line was the X1000 series. I was picking parts for my then-new PC and went for an X1800GTO; the 8800 was twice or thrice as expensive.

Reply 7 of 40, by swaaye

Rank: l33t++

Yeah, the 8800GTX/GTS launched in late 2006 and competed with the X1950XTX for quite a while. HD 2900 was late; the silicon needed an additional spin, IIRC.

8800GT (G92) and 38x0 (RV670) came late in 2007. 8800GT was faster than 2900XT and as such is faster than 38x0 as well.

Reply 8 of 40, by F2bnp

Rank: l33t
RacoonRider wrote:
F2bnp wrote:

These were pretty dark times for ATi. The 3850/3870 they released a while later were an improvement, but then the 8800GT happened. They only really managed to get back up when they released the 4850/4870 in the summer of 2008.

Are you sure about the 8800GT? As far as I remember, the 8800 appeared when ATi's current model line was the X1000 series. I was picking parts for my then-new PC and went for an X1800GTO; the 8800 was twice or thrice as expensive.

You are confusing the release dates.

8800GTX was released in November 2006 and 8800GTX 320MB/640MB variants were released shortly afterwards. HD 2900XT was released in May 2007. AMD was in such a bad position that they released the 3850 and 3870 in November of 2007! By that point, however, the G80 (8800GTX) had seen a die shrink with the G92, and the 8800GT and 8800GTS 512 (an entirely different beast compared to its predecessors) were released between October and November 2007.
The 8800GT was, and still is, heralded as one of the greatest value-for-money video cards of all time. Performance was very close to an 8800GTX and the card cost roughly $250. AMD had to price the 3870 competitively, so its price closely matched the 8800GT's, if I'm remembering things correctly. However, it certainly wasn't as fast for the most part.

-swaaye beat me to the punch 🤣

Reply 9 of 40, by Skyscraper

Rank: l33t

The HD3870 was priced similarly to the GeForce 8800GT 256MB, while the 8800GT 512MB was priced somewhat higher.

The HD3870's best selling point was the fact that it was actually possible to buy it...

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 10 of 40, by F2bnp

Rank: l33t
Skyscraper wrote:

The HD3870 was priced similarly to the GeForce 8800GT 256MB, while the 8800GT 512MB was priced somewhat higher.

The HD3870's best selling point was the fact that it was actually possible to buy it...

That's true 🤣. However, once supply stopped being an issue, the 8800GT was a much better card. I remember owning a 7800GTX 256MB at that point and was thinking about getting an HD3850 for relatively little money, but then the 4850 hit the market for the same amount and I was just sold. I think the GTX 280 was already out at that point, so AMD still didn't have the performance crown, not that it really matters.

Reply 11 of 40, by havli

Rank: Oldbie
swaaye wrote:

- Oblivion stutters badly with drivers newer than ~10.6 on XP. Could be nForce4 related.
- MSAA does hit performance hard as reviews showed.

I can confirm these two issues. The stuttering isn't NF4-related; it happens on an Intel P67 + i5 2500K as well... and not only on the HD 2900 XT - all Radeon HD 2000 / 3000 / 4000 / 5000 / 6000 cards suffer from this problem, at least up to the Catalyst 12.4 driver.
MSAA is slow due to some kind of bug in the R600 architecture - all HD 2000 and 3000 chips are affected by this.

HW museum.cz - my collection of PC hardware

Reply 12 of 40, by swaaye

Rank: l33t++
havli wrote:
swaaye wrote:

- Oblivion stutters badly with drivers newer than ~10.6 on XP. Could be nForce4 related.
- MSAA does hit performance hard as reviews showed.

I can confirm these two issues. The stuttering isn't NF4-related; it happens on an Intel P67 + i5 2500K as well... and not only on the HD 2900 XT - all Radeon HD 2000 / 3000 / 4000 / 5000 / 6000 cards suffer from this problem, at least up to the Catalyst 12.4 driver.
MSAA is slow due to some kind of bug in the R600 architecture - all HD 2000 and 3000 chips are affected by this.

I don't think I've seen this stutter on 3000-series and newer cards with newer drivers. I usually run Catalyst 13.1 Legacy on Win 7. It is unplayable with the 2600 and 2900. Maybe it is XP-related...

The AA thing may have been a hardware flaw. However, ATI definitely played it as intended design. Part of the MSAA process is performed in the shader array instead of in the RBE units as before (and in later chips). I don't know if this is a bottleneck. It seems more like the 2000/3000 series just has insufficient fill rate in general, and AA consumes more of that limited resource. HD 4800 more than doubled various fill rates.

See this
Sir Eric Demers on AMD R600
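For anyone unfamiliar with what the resolve step actually does, here's a toy Python sketch of a box-filter MSAA resolve - just averaging each pixel's sub-samples. This is purely illustrative and not ATI's actual shader code; the point is only that on R600 this averaging work ran on the shader array instead of in the RBEs:

```python
# Toy box-filter MSAA resolve: resolved color = average of a pixel's sub-samples.
def resolve_pixel(samples):
    """Average a list of (r, g, b) sub-samples into one resolved pixel color."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(3))

# 4x MSAA at a triangle edge: two sub-samples covered (white), two not (black).
edge = resolve_pixel([(1, 1, 1), (1, 1, 1), (0, 0, 0), (0, 0, 0)])
print(edge)  # (0.5, 0.5, 0.5) - the smoothed edge pixel
```

Real hardware resolves do this per pixel across the whole framebuffer, which is why pushing it through the shader array eats into shading resources.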

Reply 13 of 40, by swaaye

Rank: l33t++
F2bnp wrote:
Skyscraper wrote:

The HD3870 was priced similarly to the GeForce 8800GT 256MB, while the 8800GT 512MB was priced somewhat higher.

The HD3870's best selling point was the fact that it was actually possible to buy it...

That's true 🤣. However, once supply stopped being an issue, the 8800GT was a much better card. I remember owning a 7800GTX 256MB at that point and was thinking about getting an HD3850 for relatively little money, but then the 4850 hit the market for the same amount and I was just sold. I think the GTX 280 was already out at that point, so AMD still didn't have the performance crown, not that it really matters.

Paid $300 for an EVGA 8800GT 512MB for my brother's PC upgrade at the time. There's no doubt it was the card to get. And it was supported much better than the 3870 in the long run.

I had to replace the cooler on the 8800GT to stabilize it, though. Those cards tend to run over 100°C in games, and over time I think it kills them. My bro got 2 replacements, I think. All of the 8000-series cards have that defective RoHS solder. I still have his last 8800GT, though. It's rock solid with an Accelero S1 on it.

Reply 14 of 40, by obobskivich

Rank: l33t
swaaye wrote:

I bought a 1GB GDDR4-equipped HD 2900XT for $30 the other day. I have played around with it quite a bit on an nForce4 setup. It's an interesting card. The mystery as to why it has 128GB/s memory bandwidth, needs an 8-pin+6-pin power setup, and yet can't match an 8800GTS tickles my brain.

Very cool card - especially with GDDR4. As for the memory bandwidth and power draw - they actually go hand in hand. R600 uses a 512-bit memory bus, which makes for a properly massive chip with massive power demands (I remember reading an article about this on either Guru3D or Anandtech: the primary space/power savings with the HD 3870 came simply from chucking the 512-bit memory bus - it dropped the chip size by something like 30-40%, IIRC). The GDDR4 itself also does the card no favors. As for why they insisted on that feature, my guess is they were hoping to avoid memory bottlenecks and/or offer more compute performance (remember, they had beaten nVidia out of the gate on GPU compute with the X1800/1900 series). IIRC there are actually a handful of specific cases where the 2900XT can/did end up on top of the 8800GTS, 3870, etc. because of its memory bandwidth, but these are few and far between, and (AFAIK) it never achieves a comparable power-to-performance ratio.
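For what it's worth, the 128 GB/s figure falls straight out of that 512-bit bus. A quick back-of-the-envelope sketch in Python - the per-pin data rates below are the commonly cited specs for these boards, so treat them as approximate:

```python
# Theoretical peak memory bandwidth = (bus width in bytes) * (per-pin data rate).
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return bus_width_bits / 8 * data_rate_gtps

# HD 2900 XT (GDDR4): 512-bit bus at ~2.0 Gbps per pin -> 128 GB/s
print(peak_bandwidth_gbs(512, 2.0))
# 8800 GTS 640MB (GDDR3): 320-bit bus at ~1.6 Gbps per pin -> 64 GB/s
print(peak_bandwidth_gbs(320, 1.6))
```

Which is the crazy part: roughly double the GTS 640MB's bandwidth on paper, yet the card rarely shows it in games.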

F2bnp wrote:

8800GTX was released in November 2006 and 8800GTX 320MB/640MB variants were released shortly afterwards. HD 2900XT was released in May 2007. AMD was in such a bad position that they released the 3850 and 3870 in November of 2007! By that point, however, the G80 (8800GTX) had seen a die shrink with the G92, and the 8800GT and 8800GTS 512 (an entirely different beast compared to its predecessors) were released between October and November 2007.
The 8800GT was, and still is, heralded as one of the greatest value-for-money video cards of all time. Performance was very close to an 8800GTX and the card cost roughly $250. AMD had to price the 3870 competitively, so its price closely matched the 8800GT's, if I'm remembering things correctly. However, it certainly wasn't as fast for the most part.

To untangle this further/add:

- The 320MB and 640MB cards were not 8800GTX, they were 8800GTS. There is also a G92-based 8800GTS ("8800GTS 512MB") which is a separate, later entity. The 8800GTX (and later Ultra) were 768MB boards (there's also a 1.5GB Quadro variant, the FX 5600, which competed with the absolutely absurd FireGL V8650).

- The R600 is arguably not a fully AMD design - it was released only months after the merger with ATi. The HD 3850/3870 series were a die shrink and internal optimization of the R600 (they also gained UVD and some other features), and while they lost some memory bandwidth, 3870 tends to hold its own quite well against 2900XT. Price-wise it was around $200-$250 from what I remember as well - similar to the 8800GT/GTS and later 9800GT/GTX.

- The G92 (8800GT/GTS, 9800, GTS 250, etc) is not directly a die-shrink of the G80 (8800GTX/Ultra/GTS) - it added PureVideo HD (e.g. full h.264 decoding), and some other improvements (e.g. CUDA 1.1). It's very similar in theme to what AMD did with the 3850/3870 respin, in that new features/improvements were added as well.

- AMD did reclaim the performance crown in 2008 with the HD 4870X2 (at least until the GTX 295 showed up and had a very short-lived reign at the top before the HD 5870 and 5970 were released). The HD 3870X2 was their high-end competitor to the 8800GTX/Ultra and the 9800 series, and it did quite well - it was a much better ~$400-$500 card than the 2900XT, while using about the same amount of power. 😲

Something else to remember in these comparisons is how much CrossFire evolved - the X1900 series generally relied on the Master/Slave configuration with hardware compositing on the Master card (except the X1950Pro, which introduced the internal bridge). R600 used the internal bridge, and the move to HD 3800 series brought about CrossFire X, with very flexible 2-4 way configurations, multi-GPU cards, and so forth.

swaaye wrote:

The AA thing may have been a hardware flaw. However ATI definitely played it as intended design. Part of the MSAA process is performed in the shader array instead of the RBE units as before (and later chips). I don't know if this is a bottleneck. It seems more like the 2000/3000 series just has insufficient fill rate in general. AA consumes more of the limited resource. HD 4800 more than doubled various fill rates.

See this
Sir Eric Demers on AMD R600

Everything I've read is that R600 was basically rushed out to market and had lots of broken/incomplete features - it was late and it would've/should've been much later. HD 3870 is probably a better reflection of what ATi had intended to do with R600. I've always kind of thought of R600 as "ATi's NV30" - they were months behind schedule and just went with what they had, and it didn't end up well. And like NV30, they were more than happy to abandon it (and its driver support) at the earliest possible convenience.

Something I've wondered about, and never read one way or another on, is does R600 have any problem with solder joints like the GF7/8 cards can? They certainly run hot enough to exacerbate such problems, but they're probably obscure/obtuse enough that few people would've run into problems (honestly I don't remember anyone I knew circa 2006-7 buying an R600; I've only tended to see them nowadays purchased as curiosities 🤣).

Reply 15 of 40, by Scali

Rank: l33t
swaaye wrote:
NJRoadfan wrote:

This card has the Rage Theater 200? What does it use it for? I thought only the ATI-made All-in-Wonder cards had the chip (for VIVO). Getting video capture working on the chip in Windows 7 (assuming that is what it's for) is impossible. You have to use XP.

Quite a few ATi cards of that era had that chip and had VIVO through their S-Video port.
I have an X1800XT and an X1900XTX, which both have it.
The All-in-Wonder cards also had a TV tuner; these cards do not.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 16 of 40, by nforce4max

Rank: l33t

Those were dark days when it came to the drivers; performance was OK-ish, but AMD even now takes forever to make half-assed drivers. I gave up on AMD a long time ago when it comes to driver support.

On a far away planet reading your posts in the year 10,191.

Reply 17 of 40, by NJRoadfan

Rank: Oldbie
Scali wrote:

I didn't say the Rage Theater 200 wasn't older than that, just that cards from that era often had VIVO as standard. AFAIK the 9500/9600/9700/9800 did not have this. Pretty sure my 9600XT didn't anyway; it did not come with the required cable, and I've never seen the chip in Device Manager.
As I recall, you had to have the AIW version in that era, in which case it had the RT200 chip indeed.

At most, the non-AIW 9000/X000 cards came with video output. You had to buy an All-in-Wonder to get full VIVO in that era. The Theater 200 is an oldie but a goodie; it's one of the best analog video capture chips out there. That's why I'm curious whether ATI changed how the T200 chip interfaced with the PCI bus on the HD2900. ATI/AMD stated that the older AIW cards would never get video capture capability in Vista and above because the WDDM driver architecture was incompatible with how they set up/connected the T200 chip on the older cards.

Reply 19 of 40, by swaaye

Rank: l33t++

OK, I can't seem to replicate the Deus Ex: Human Revolution black smoke issue I had the other day. It seems to work fine now in both XP and 7. I haven't a clue what was going on before.

SPBHM wrote:

interesting graph I thought I should add

SUPER HUGE MARKETSHARE CHART

What strikes me about this is how the seemingly popular 4800 series didn't put much of a dent in NV's market share. They even priced those cards in an obvious attempt to grab market share.