VOGONS


NV3x, R3x0, and pixel shader 2.0


Reply 40 of 103, by Scali

Rank l33t
swaaye wrote:

5900XT runs various D3D7/8 games considerably faster than a 9600XT. That's why people thought it was a good deal. It's primarily what the reviews were covering at the time.

And then Tomb Raider:AOD, HL2 and 3DMark03 happened, and they were all in huge shock.
Apparently some people are still in shock even in 2015 😀

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 41 of 103, by mockingbird

Rank Oldbie

I just want to remind everyone that the HL2 codebase was re-written and the game delayed for a while because their servers were hacked.

Before this happened, ATI was just about to release their 9700 Pro bundled with Half Life 2. Because the game was delayed, however, a coupon was included with the card instead, redeemable when the game was released.

So they were in bed with ATI at the time, and maybe that's why the game behaved the way it did with the FX Series.

As for me, I always preferred nVidia cards for the simple reason that ATI cards, because of the way they were designed ever since the first Radeon, did not run any faster at 16bpp. Given the choice between playing a game at a lower resolution to preserve a high enough framerate or dropping it to 16bpp, I would choose 16bpp every time. It doesn't sound too appealing, because you will get banding at 16bpp, but not everyone is a millionaire, and I believe it's a very good compromise.

So the FX series was a heck of a better bargain, if only for this reason.


Reply 42 of 103, by Scali

Rank l33t
mockingbird wrote:

So they were in bed with ATI at the time, and maybe that's why the game behaved the way it did with the FX Series.

Yup, as I say, Valve didn't sugarcoat it. Just like FutureMark didn't.
But they actually spent more time optimizing for the GeForce FX than for the R300 in the end, because the R300 simply ran the initial SM2.0 code just fine, whereas they had to mix-and-match with PS1.4 to get the FX up to speed.
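
(To illustrate what a "path" means here - just a rough sketch of caps-based selection in Direct3D 9, not Valve's actual code; HL2's mixed mode mixed per-shader rather than per-device:)

    // Minimal illustrative sketch: pick a pixel shader path from the D3D9 device caps.
    #include <d3d9.h>
    #include <cstdio>

    int main()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DCAPS9 caps = {};
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
            // Full SM2.0 path: runs fine on R300, but NV3x is slow here
            // unless the heaviest shaders are toned down.
            std::printf("ps_2_0 path\n");
        } else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4)) {
            // Fallback/mixed path: replace expensive ps_2_0 shaders with
            // ps_1_4 approximations, trading image quality for speed.
            std::printf("ps_1_4 path\n");
        } else {
            std::printf("DX7/DX8.0 path\n");
        }

        d3d->Release();
        return 0;
    }

Note that a plain caps check isn't enough for the FX case anyway - NV3x reports ps_2_0 support, it's just slow at it - so in practice games also looked at which GPU they were on, or let the user force a level (HL2 has the -dxlevel launch option, e.g. -dxlevel 81).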

There's a bit of irony in the fact that nVidia now uses HL2 to promote their Shield:
http://shield.nvidia.com/games/android/halflife2

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 43 of 103, by leileilol

Rank l33t++
mockingbird wrote:

As for me, I always preferred nVidia cards for the simple reason that ATI cards, because of the way they were designed ever since the first Radeon, did not run any faster at 16bpp. Given the choice between playing a game at a lower resolution to preserve a high enough framerate or dropping it to 16bpp, I would choose 16bpp every time. It doesn't sound too appealing, because you will get banding at 16bpp, but not everyone is a millionaire, and I believe it's a very good compromise.

So the FX series was a heck of a better bargain, if only for this reason.

Many of the pixel shader games out there don't even give you a choice to go 16bpp. 16bpp has awful depth precision most of the time, and the Nvidia dither table pattern can get very obnoxious. It's a very, VERY ugly compromise.

Also, I stopped using 16bpp anyway when the GeForce2 came out 😜 The better quality compromise for performance on an FX is actually dropping the resolution down to 320x240. It's far more effective than dropping to 16bpp.

long live PCem

Reply 44 of 103, by obobskivich

Rank l33t
swaaye wrote:

The OpenGL.org forum had some people who really liked the 5200 as a cheap D3D9 development experimentation platform. An interesting angle on that chip.

For OS X too - the FX 5200 supports CoreImage and in some cases this means significant performance improvements over the GF4/Radeon 9000 that predated it. Other DX9-era cards were certainly also well regarded for this, but the FX 5200 was generally the cheapest and simplest option to provide CoreImage support for a Mac.

swaaye wrote:

I recall the 5900XT being perceived as a pretty solid value at the time, probably the best for the whole FX lineup. It's just a 5900 Ultra with lower clock speeds.

Half the RAM (and at somewhat lower speeds) too - and many cards will make the jump from 390-400 to 450MHz quite gracefully (mine will do around 550 😲 ), but taking the RAM from 700->850 is usually not feasible. The 5900XT was one of the last FX cards to be released as well; I remember reading theories back in '03-'04 that the XT was potentially just a way to eat up unsold NV35 dies (especially when many of the cards could do ~450MHz). There's a "5900 Vanilla" that sits in between those two, which was a ~$250-$300 (release SRP) part as well, but I don't think they were very popular (based on the fact that they're not exceedingly common on eBay these days, and I don't remember hearing much about them or seeing them in reviews all that often). If I remember right their clocks are much closer to the 5900 Ultra, at something like 400/800.

The games of the time were still mostly a mix of D3D7/8, and the FX cards are fine with these. It wasn't overly apparent for a while that the FX cards were really terrible at PS2.0. Some developers did tailor their games for the FX cards too. Far Cry and Doom 3 run pretty well.

I would add that the whole "Half-Life 2 Controversy" and Radeon 9800 vs GeForce FX 5900 blood feud seems to have gotten fiercer and more contentious as time has gone on - I don't honestly remember so much drama about this "back in the day" nor do I remember Half-Life 2 being such a singular focus for many people. FPS gaming is not the entire scope of gaming, nor is Half-Life 2 the entire scope of FPS. Until GeForce FX is mentioned, and then it's just a non-stop barrage of how badly NV30 (actually usually NV35 - I have never seen a published online review of NV30 itself in Half-Life 2) does in Half-Life 2 and why GeForce FX is the worst thing since the draft and human sacrifice.

If you liked Bioware OpenGL games, you didn't really want to be on ATI back then. That's another thing to consider when it comes to perceptions here. Getting KOTOR or NWN working perfectly on ATI was a trick.

NWN1 requires/uses palettized textures (I say slash because I've read some reports that the Diamond edition removed this requirement, but I'm not certain of that), which R300 doesn't support. I'm not sure about KOTOR. Catalyst 3.x were largely not great drivers to live with for other reasons too - the fog table is broken/not supported, multi-monitor is fairly limited, iirc there are issues with MPEG-decode acceleration (it isn't as plug-in-perfect as PureVideo from what I remember), etc. Later releases, like Catalyst 8.x, certainly work a lot better - but those weren't available for many years, and for Windows 9x some fixes never came (e.g. fog table). From a more "Vogons-centric" perspective, the GeForce FX (and GeForce 4) have an edge over the Radeon 9 series due to drivers and hardware compatibility (e.g. all are universal AGP cards), and the DX9 performance question is largely not an issue, because, like people in 2005, we too can get a GeForce 7800GTX (or similar), or we can move up to something faster altogether (e.g. the last time I played Half-Life 2 it was on a GeForce GTX 660, at 1080p with HDR and full max settings, and it ran around 500 FPS).
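
(For reference, the palettized-texture support in question is the GL_EXT_paletted_texture OpenGL extension; a quick way to see whether a driver exposes it - just an illustrative snippet of my own, not anything from NWN itself:)

    // Illustrative only: check whether the OpenGL driver advertises
    // GL_EXT_paletted_texture (needs a current GL context).
    // NVIDIA's drivers of that era expose it; R300 does not, per the above.
    #include <GL/gl.h>
    #include <cstring>

    bool hasPalettedTextures()
    {
        const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        // Naive substring match; good enough for a sketch.
        return ext != nullptr && std::strstr(ext, "GL_EXT_paletted_texture") != nullptr;
    }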

swaaye wrote:

Far Cry has lots of tech in it and was patched a bunch of times too. I think it has NV30 and R4x0 optimizations. SM3.0 HDR was added. But with NV3x I think it mostly runs PS1.4. The console log reads it out.

There is an SM2.0b (R4xx) path that implements HDR and some other "SM3.0 features" for the X800/X850. I know there's a review floating around out there that has image quality comparisons between X850 and 6800/7800 with 2.0b and 2.0c - from what I recall the differences, at least to the naked eye, are very minor. I'm not sure about a DX9.0a path (that'd be NV3x's "beyond DX9" features plus optimizations). AFAIK FarCry is one of the few games that actually bothered to implement a working 9.0b HDR path (working as in, afaik, it performs pretty well, at least for the X800/850 - I'm not sure if it could be forced to work on something like a 9800XT or not though (other X800 features have been at least), or how it'd work on something like the X600). And while all of that was neat - the constant non-cumulative patching for FarCry was decidedly not neat... 🤣 😵

Reply 45 of 103, by PhilsComputerLab

Rank l33t++

The FX series of cards have one killer feature for me 🤣

They clone the output on VGA and DVI under DOS. They also have a good S-Video output, and the DVI ports use EDID information. Other cards, Matrox for example, don't do this and output a signal that, at least with the gear I have, can't be captured.


Reply 46 of 103, by candle_86

Rank l33t
obobskivich wrote:
For OS X too - the FX 5200 supports CoreImage and in some cases this means significant performance improvements over the GF4/Rad […]

To be honest, back in 2003 Half Life 2 wasn't what people were looking at. It was Call of Duty, Halo, Splinter Cell, Serious Sam 2, and Jedi Outcast that really got the attention, and except for Halo it was pretty much a tie race; by 2004 the NV40 and R420 were out.

Reply 47 of 103, by swaaye

Rank l33t++
obobskivich wrote:

NWN1 requires/uses palletized textures (I say slash because I've read some reports that the Diamond edition removed this requirement, but I'm not certain of that), which R300 doesn't support. I'm not sure about KOTOR.

For KOTOR on an 8500-9800, Catalyst 4.2 is the one to use. There seem to be some workarounds in that OpenGL ICD. You can enable soft shadows and I think some other things work best with that driver. I don't remember everything anymore...

From what I recall, Bioware didn't like to work with ATI devrel back then. But hey today we have events like the RAGE disaster. And frame time examinations by review sites that force ATI to make their drivers better. Sigh.

Reply 48 of 103, by obobskivich

Rank l33t
candle_86 wrote:

To be honest, back in 2003 Half Life 2 wasn't what people were looking at. It was Call of Duty, Halo, Splinter Cell, Serious Sam 2, and Jedi Outcast that really got the attention, and except for Halo it was pretty much a tie race; by 2004 the NV40 and R420 were out.

Exactly. I'd add Tomb Raider AoD, at least from a review/benchmark perspective, and UT2004 and other UE2.x games to the list too, but yeah. 😀

swaaye wrote:

For KOTOR on an 8500-9800, Catalyst 4.2 is the one to use. There seem to be some workarounds in that OpenGL ICD. You can enable soft shadows and I think some other things work best with that driver. I don't remember everything anymore...

From what I recall, Bioware didn't like to work with ATI devrel back then. But hey today we have events like the RAGE disaster. And frame time examinations by review sites that force ATI to make their drivers better. Sigh.

RAGE disaster?

Reply 49 of 103, by swaaye

Rank l33t++
obobskivich wrote:

RAGE disaster?

Remember when RAGE was released and basically didn't run on Radeons for the first 2 weeks? It was very slow and had problems like missing textures and crashes. It took about 3 months for AMD to iron out the problems with it. I remember 4 hotfix drivers? It was as if they had done no OpenGL development for it until it was released. It was shocking, and even more amazing was that during John Carmack's QuakeCon talk that summer he had said AMD was working diligently on getting everything optimized!

Fortunately I owned a 6950 and a 560 Ti.

Reply 50 of 103, by SPBHM

Rank Oldbie

I remember loading that "Real Time HDR" demo on the FX 5900SE (at 450MHz or so), and the result was not even half that of my previous card (9500NP 64MB at 370MHz and 8 pipelines),

but at the time, for DX8.1 up until Doom 3, I think the FX series was very competent; also, in vanilla HL2, I don't think DX9 vs DX8.1 made a massive difference...

I think the X800's lack of SM3.0 was more problematic.

Rage was a huge mess; it's still bad on my 5850, and I know people run it without problems on slower GeForces like the 8800GT.

Reply 51 of 103, by obobskivich

Rank l33t
swaaye wrote:

Remember when RAGE was released and basically didn't run on Radeons for the first 2 weeks? It took about 3 months for them to iron out the problems with it. It was as if they had done no OpenGL development for it until it was released.

D'oh! You meant the game RAGE. 😊 I was thinking of "Rage" as in ATi Rage 128... 🤣 🤣

Reply 52 of 103, by swaaye

Rank l33t++
obobskivich wrote:

D'oh! You meant the game RAGE. 😊 I was thinking of "Rage" as in ATi Rage 128... 🤣 🤣

Man I don't know how ATI even survived their Rage card era. Talk about driver horrors... Thank OEM sales for their survival, no doubt.

SPBHM wrote:

Rage was a huge mess; it's still bad on my 5850, and I know people run it without problems on slower GeForces like the 8800GT.

8800GT does run it very well. I've tried it. I remember it managing a fairly solid 60fps at 1680x1050. The 512MB RAM causes stuttering sometimes though.

Reply 53 of 103, by F2bnp

Rank l33t

The RAGE release was terrible. Tried the game on a friend's 5770 and it would just hang after the first few minutes. I don't know if AMD or id Software (or both) were to blame for that.

I thought the game was pretty shallow and meh though, so no big worries there 😊 .

Reply 54 of 103, by mockingbird

Rank Oldbie
swaaye wrote:
obobskivich wrote:

D'oh! You meant the game RAGE. 😊 I was thinking of "Rage" as in ATi Rage 128... 🤣 🤣

Man I don't know how ATI even survived their Rage card era. Talk about driver horrors... Thank OEM sales for their survival, no doubt.

The Rage128 sold well because most people still didn't really care about 3D acceleration. And for a card that initially had non-existent 3D performance, I'm always very impressed by how drastically ATI improved its 3D acceleration through driver updates towards the end of its life.


Reply 55 of 103, by Scali

Rank l33t
obobskivich wrote:

I would add that the whole "Half-Life 2 Controversy" and Radeon 9800 vs GeForce FX 5900 blood feud seems to have gotten fiercer and more contentious as time has gone on - I don't honestly remember so much drama about this "back in the day" nor do I remember Half-Life 2 being such a singular focus for many people. FPS gaming is not the entire scope of gaming, nor is Half-Life 2 the entire scope of FPS. Until GeForce FX is mentioned, and then it's just a non-stop barrage of how badly NV30 (actually usually NV35 - I have never seen a published online review of NV30 itself in Half-Life 2) does in Half-Life 2 and why GeForce FX is the worst thing since the draft and human sacrifice.

No, the real issue here is that NV30 sucked in ALL SM2.0-games. Tomb Raider and 3DMark03 were also examples mentioned that showed pretty much the same performance issues as HL2. And there were more.
But somehow these are being ignored, and it is pretended that HL2 is the only one, as if it is some kind of exception, while it is actually the rule. This is revisionist history, and I will not have any of it.

And as said, games, including HL2, had alternative NV30 paths to get performance up (at the cost of image quality). So framerate is not the only metric you should be looking at, if you solely want to base your NV30 judgement on games.
Because if you ask me, if NV30 gets the same framerates as R300, but it does so with considerably reduced image quality (basically last-gen quality), it's still bad value for money.

Last edited by Scali on 2015-07-14, 10:24. Edited 2 times in total.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 56 of 103, by Scali

Rank l33t
swaaye wrote:

It took about 3 months for AMD to iron out the problems with it. I remember 4 hotfix drivers? It was as if they had done no OpenGL development for it until it was released.

What I found most shocking is that apparently Carmack never bothered to run the game on AMD hardware at all, and just released it like that.
I mean, we have two major GPU vendors, it is not THAT difficult to do some QA on both of them. If they had bothered to actually run the game on AMD hardware, they would have seen that it was unplayable, and they should have contacted AMD about it, and delayed the launch until it was fixed.
This is the only time I have ever seen a game launched that simply didn't work on any card of one of the vendors.
I mean, performance issues are one thing, but this was completely unplayable (I had a machine with a Radeon 5770 at the time. Another machine had a GTX460 I believe, and it worked fine even without installing the newly released drivers for RAGE).

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 57 of 103, by Skyscraper

Rank l33t

In Sweden the FX series didn't sell at all, except for the FX5200 and FX5500, which were sold with OEM systems, while the Radeon 9xxx series found its way into every gamer's computer. How do I know this? I often buy lots of random AGP cards that people have looted from the recycling centers. I have countless Radeon 9500/9600/9700/9800 cards but very few high-end FX cards.

So even if the FX wasn't as bad as it's made out to be, the rumor of its crappy performance spread fast.

On the other hand I really like my Geforce FX 5900 Ultra for its kickass DX7 and DX8 performance in Windows 98.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 58 of 103, by candle_86

Rank l33t
Scali wrote:
No, the real issue here is that NV30 sucked in ALL SM2.0-games. Tomb Raider and 3DMark03 were also examples mentioned that showe […]

Except during the actual time when the card mattered you had no real SM2.0 games; you had Splinter Cell and Halo, and that was kind of it, and both were more DX8 games than DX9 games. By the time SM2.0 shaders were actually needed it didn't matter; the 2nd generation was out.

I remember those days quite well: I had an FX5200 Ultra, my best buddy owned an FX5900 Ultra, while another friend owned a 9800Pro, and my two friends were constantly trying to outdo each other with in-game performance. Let's remember that neither the R300 nor the NV30 generation was really that good at SM2.0 games to begin with; both struggled with the first real SM2.0 games that came out in 2004.

Reply 59 of 103, by Scali

Rank l33t
candle_86 wrote:

Except during the actual time when the card mattered you had no real SM2.0 games; you had Splinter Cell and Halo, and that was kind of it, and both were more DX8 games than DX9 games. By the time SM2.0 shaders were actually needed it didn't matter; the 2nd generation was out.

As I already said, this simply isn't true.
GeForce 6x00 didn't arrive until April 2004.
There were various SM2.0 games and benchmarks out before that, most notably Tomb Raider:AOD.
Besides, even though the 6x00 was out in April 2004, you don't expect people who just spent $500 on a 5800/5900 to upgrade again, do you? They had to ride it out for 3-4 years, missing out on all the SM2.0 goodness, while people with an R300 card could run all these games just fine.
*That* is why the FX series has such a poor reputation. Every time new SM2.0 games came out, and the FX was included in benchmarks, it ended up in the lower regions. So people who bought these cards felt very disappointed, and knew they had to upgrade.

candle_86 wrote:

I remember those days quite well: I had an FX5200 Ultra, my best buddy owned an FX5900 Ultra, while another friend owned a 9800Pro, and my two friends were constantly trying to outdo each other with in-game performance. Let's remember that neither the R300 nor the NV30 generation was really that good at SM2.0 games to begin with; both struggled with the first real SM2.0 games that came out in 2004.

Again, patently false. I had a Radeon 9600XT 256 MB, and it handled any SM2.0 game just fine, with 4xMSAA and 16xAF.
As I said, I only upgraded it to a GeForce 7600 because I moved to a PCI-e based system. If I didn't have to upgrade my CPU, I probably would have stuck with the 9600XT even longer, riding out the whole DX9-era, because the videocard was still performing well enough in games.
I mean, do you even understand what we're talking about here? Even the 9600Pro outperformed the 5900Ultra in SM2.0 games. As shown in Anandtech's article, the 9800Pro was 70% faster than the 5900Ultra when running SM2.0 in HL2.
You really can't lump the NV30 and R300 together like that, because the R300 is massively faster in SM2.0.
In fact, efficiency-wise, there weren't a lot of improvements over R300 in the later cards from ATi and nVidia. The difference was mainly that there were GPUs with more pipelines and more (and faster) memory. But the midrange cards were still very comparable to the 9700Pro/9800Pro/XT for quite a while. The 9700/9800 had some of the longest lifespans ever in GPU-land. It was the only card you needed for the DX9 era. It's up there with the 8800GTX/Ultra, which were the only ones you'd need for the whole DX10 era.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/