VOGONS


NV3x, R3x0, and pixel shader 2.0


Reply 60 of 103, by Putas

Rank: Oldbie
Scali wrote:

What I found most shocking is that apparently Carmack never bothered to run the game on AMD hardware at all, and just released it like that.

Carmack said the complete opposite; he felt AMD betrayed them: http://kotaku.com/5847761/why-was-the-pc-laun … -such-a-cluster
AMD said they accidentally included an old OpenGL driver in the Catalyst release prior to the game's launch. I played Rage on my 4850 with the first fix after that, and recall only screen-tearing issues.

Scali wrote:

You really can't lump the NV30 and R300 together like that, because the R300 is massively faster in SM2.0.

Everybody and his grandma knows that, but keep beating that dead horse if it makes you feel better.

Reply 61 of 103, by Scali

Rank: l33t
Putas wrote:

AMD said they accidentally included an old OpenGL driver in the Catalyst release prior to the game's launch. I played Rage on my 4850 with the first fix after that, and recall only screen-tearing issues.

On my 5770 I needed the last of the 4 hotfixes to get it working at all. The others got less than 1 fps, so even navigating through the menu was near-impossible.
Also, that excuse makes no sense. ID's QA should have tested on AMD hardware with publicly released drivers.
Besides, if there were any 'new drivers' already available, just 'accidentally' not released, then why did they have to release 4 hotfixes before they got it right?
Nope, none of that makes any sense whatsoever.

Putas wrote:

Everybody and his grandma knows that, but keep beating that dead horse if it makes you feel better.

You're the one beating a dead horse here. Apparently you still feel a pain in your posterior for purchasing an FX back then, and still feel the urge to validate your choice. There was no reason for you to respond to this. And your response adds nothing to the discussion other than the umpteenth personal attack.
I responded to someone who lumped NV30 and R300 together, so it looks like he DOESN'T know that. I feel completely neutral about the whole thing, as I already said before. Unlike you, I don't have any hangups about brands or even the PC/Windows platform in general.
I find it all to be particularly 'meh' (and oh the irony that only yesterday I was called out as an nVidia shill... the downside of being neutral: you get called out by both sides of fanboys).
I guess you STILL don't get who I am and what makes me tick. I suggest you do some googling and get a clue, before you ever speak to me again.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 62 of 103, by obobskivich

Rank: l33t
candle_86 wrote:

Except during the actual time when the card mattered you had no real SM2.0 games; you had Splinter Cell and Halo, that was kind of it, and both were more DX8 games than DX9 games. By the time SM2.0 shaders were actually needed it didn't matter, because the 2nd generation was out.

I remember these days quite well. I had an FX5200 Ultra, my best buddy owned an FX5900 Ultra, while another friend owned a 9800 Pro, and my two friends were constantly trying to outdo each other with in-game performance. Let's remember that neither the R300 nor the NV30 generation was really that good at SM2.0 games to begin with; both struggled with the first real SM2.0 games that would come out in 2004.

Splinter Cell 🤣 😵 For that alone that's a "+1" for GeForce FX. 🤣 The FX 5800 Ultra has no problem with Halo on maximum either. So there's your "DX9 games in 2003" handled. Tomb Raider: AoD is worth mentioning if only because reviewers latched onto it (it has a PS2.0 path and a built-in benchmark, although according to Core that benchmark is bugged and should not be relied upon - see the TechReport link), but I don't think it was that significant in terms of actual gameplay, and afaik nothing else uses its engine. I haven't gotten around to trying it with the 5800, but benchmarks I've seen show it to be playable/functional via PS2.0 (and afaik that's without the Cg-based enhancements it offers).

Example benchmarks:
https://techreport.com/review/5990/nvidia-gef … -5900-xt-gpu/10
http://hothardware.com/reviews/msi-geforce-fx-5900-xt?page=3
https://techreport.com/review/6572/nvidia-gef … 800-ultra-gpu/9 (this is a review of 6800 Ultra, but it has 9800XT and 5950 Ultra side-by-side in Splinter Cell)

And while we're reading - most of the games in there absolutely comport with your D3D7/8-as-most-common argument, but that roughly covers DX9 for early adopters. Jumping ahead into '04, GeForce 6 and Radeon X become much more significant; here's a 6600GT review that also has some older cards (5900XT, 9600XT, 9800XT):
https://techreport.com/review/7295/nvidia-gef … ics-processor/6

It also includes some newer games, like Counter-Strike: Source and Rome: Total War (which afaik is DX9):
https://techreport.com/review/7295/nvidia-gef … cs-processor/11

Speaking of RTS games, here's another "big deal title" for '03:
http://www.anandtech.com/show/1293/19

I tried to find other non-FPS benchmarks, but that's generally easier said than done. Here's one I did find which, while a decent representation of very popular/top-selling games of the early 2000s (Morrowind, NWN, C&C Generals), doesn't represent PS2.0 titles (again, "D3D7/8-as-most-common"):
http://www.nvnews.net/previews/geforce_fx_595 … ra/page_7.shtml

Reply 63 of 103, by Scali

Rank: l33t

Looking only at reviews of newly released graphics cards ignores the fact that the older cards have been on sale up to then, and in many cases even after the release of the newer products, and that a lot of people own these cards, even though newer, better cards may be available now.
So it is completely misrepresenting the actual issue.
It's not like every NV3x-owner got a free 6xxx-card once they were introduced!
And it's not like developers can stop developing special paths to ensure good NV3x-performance just because newer cards are available!

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 64 of 103, by Putas

Rank: Oldbie
Scali wrote:

ID's QA should have tested on AMD hardware with publicly released drivers.

That is a recipe for endless delays.

Scali wrote:

Apparently you still feel a pain in your posterior for purchasing an FX back then, and still feel the urge to validate your choice.

Apparently you just make stuff up when reality differs from your wishes.

Reply 65 of 103, by Scali

Rank: l33t
Putas wrote:

That is a recipe for endless delays.

Why?
OpenGL and D3D are standards, and drivers should support everything your game could ever want to do before you even start writing the game.
In most cases, things should just work. I get the distinct impression that people these days seem to think that every new game NEEDS a driver update. If that's the case, then what's the point of having standards like OpenGL or D3D in the first place?
In the old days this didn't happen at all. Drivers just worked and that's that.
Then vendors started to optimize specifically for specific games, so when games were released, you got these 'game-optimized' driver releases.
But normally games should still work just fine with older drivers.
So there normally is no show-stopping driver issue where a game simply does not work at all. Rage was an exceptional case, which calls for exceptional measures. I don't think releasing the game in its current state was the right decision.
Remember, it was not *just* the AMD drivers that needed patches. The game itself was also patched (even twice if I recall correctly) before they finally got everything working well on all the common nVidia and AMD hardware.
So yes, I certainly think the game should have been delayed. It simply wasn't ready yet. From a QA point-of-view I simply don't understand how anyone can make the decision to release a game where you KNOW it doesn't work on a lot of configurations. What's the point of QA then?

Putas wrote:

Apparently you just make stuff up when reality differs from your wishes.

What exactly am I making up here? Obviously you can't resist the urge to keep responding to NV30-related issues. I'm not making that up, your posts are right there for everyone to see.
And it has been shown that many of your posts distort reality, and even contain personal attacks, so there must be some ulterior motive behind your posts.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 66 of 103, by sliderider

Rank: l33t++
Scali wrote:
I think another point is that if you're late to the party (as AMD was with DX10), you have the problem that a lot of people have […]
Show full quote
F2bnp wrote:

- The 4x00 series was fantastic. Their value for money was insane, hence they are heralded as an amazing series. I do not know why they didn't make a dent in Nvidia's market share, but I would probably attribute most of it to brand loyalty and generally misinformed customers.

I think another point is that if you're late to the party (as AMD was with DX10), you have the problem that a lot of people have already upgraded their cards, and aren't interested in a 4x00-card that is only slightly better than what they already own.
You either have to be the first, or you have to offer a product that is considerably better than the competition, to make people want to upgrade.
Like some of the 'classics' such as the original Voodoo, the GeForce2, the R300 and the GeForce 8800.

How was AMD "late" with DX10? All my sources that I checked tell me that HD2000 was released over 2006-7 while GeForce 8 was released over the course of 2007-8, so AMD was out with DX-10 cards before nVidia.

Reply 67 of 103, by candle_86

Rank: l33t
Scali wrote:
As I already said, this simply isn't true. GeForce 6x00 didn't arrive until April 2004. There were various SM2.0 games and bench […]
Show full quote
candle_86 wrote:

Except during the actual time when the card mattered you had no real SM2.0 games; you had Splinter Cell and Halo, that was kind of it, and both were more DX8 games than DX9 games. By the time SM2.0 shaders were actually needed it didn't matter, because the 2nd generation was out.

As I already said, this simply isn't true.
GeForce 6x00 didn't arrive until April 2004.
There were various SM2.0 games and benchmarks out before that, most notably Tomb Raider: AoD.
Besides, even though the 6x00 was out in April 2004, you don't expect people who just spent $500 on a 5800/5900 to upgrade again, do you? They had to ride it out for 3-4 years, missing out on all the SM2.0 goodness, while people with an R300 card could run all these games just fine.
*That* is why the FX series has such a poor reputation. Every time new SM2.0 games came out and the FX was included in benchmarks, it ended up in the lower regions. So people who bought these cards felt very disappointed, and knew they had to upgrade.

candle_86 wrote:

I remember these days quite well. I had an FX5200 Ultra, my best buddy owned an FX5900 Ultra, while another friend owned a 9800 Pro, and my two friends were constantly trying to outdo each other with in-game performance. Let's remember that neither the R300 nor the NV30 generation was really that good at SM2.0 games to begin with; both struggled with the first real SM2.0 games that would come out in 2004.

Again, patently false. I had a Radeon 9600XT 256 MB, and it handled any SM2.0 game just fine, with 4xMSAA and 16xAF.
As I said, I only upgraded it to a GeForce 7600 because I moved to a PCI-e based system. If I didn't have to upgrade my CPU, I probably would have stuck with the 9600XT even longer, riding out the whole DX9-era, because the videocard was still performing well enough in games.
I mean, do you even understand what we're talking about here? Even the 9600Pro outperformed the 5900Ultra in SM2.0 games. As shown in Anandtech's article, the 9800Pro was 70% faster than the 5900Ultra when running SM2.0 in HL2.
You really can't lump the NV30 and R300 together like that, because the R300 is massively faster in SM2.0.
In fact, efficiency-wise, there weren't a lot of improvements over R300 in the later cards from ATi and nVidia. The difference was mainly that there were GPUs with more pipelines and more and faster memory. But the midrange cards were still very comparable to the 9700Pro/9800Pro/XT for quite a while. The 9700/9800 had some of the longest lifespans ever in GPU-land. It was the only card you needed for the DX9-era. It's up there with the 8800GTX/Ultra, which were the only ones you'd need for the whole DX10-era.

Your memory seems to be failing you. I don't mention Tomb Raider because honestly it was the 2nd most forgettable Tomb Raider, and honestly didn't matter. And the R300 stuff was only good for real DX9 if you were willing to turn down settings and resolution. As for the 9600 Pro, yeah, I had one myself in 2004 and it was OK, but I had to play CS:S at medium 10x7 to get playable FPS on my XP 2500 at the time. And by playable I mean a consistent 60 FPS; my buddy with his 9800 Pro had to run CS:S at 12x10 because at 16x12 it would dip below 60 too often. Same goes for Far Cry for both of those cards, along with, well, every other game.

The DX9 card that lasted the longest wasn't the R300 series at all, it was actually the 6600GT; people were still using them clear into 2009, but you wouldn't know about that. I don't hate ATI and I don't love Nvidia, but your facts are very poor. Let's also not forget SM2.0 wasn't the biggest deal at the time. Quake 3-based games, as well as Doom 3, really ruled the game market back then. Your single biggest FPS was Call of Duty, which the 5600 Ultra would smoke a 9800XT on. But you don't want to know about OpenGL. No one was disappointed with their FX card, and most people that will drop 500 dollars on a video card will do it again. Most FX 5900 Ultra owners owned a Ti 4600, and before that a Ti 500, and a GeForce 2 Ultra before that. And guess what, most of them went out and bought a 6800 Ultra.

Reply 68 of 103, by candle_86

Rank: l33t
sliderider wrote:

How was AMD "late" with DX10? All my sources that I checked tell me that HD2000 was released over 2006-7 while GeForce 8 was released over the course of 2007-8, so AMD was out with DX-10 cards before nVidia.

No, the 8800GTX released Nov 8, 2006; the HD 2900XT May 13, 2007.

Reply 69 of 103, by Scali

Rank: l33t
candle_86 wrote:

Your memory seems to be failing you. I don't mention Tomb Raider because honestly it was the 2nd most forgettable Tomb Raider, and honestly didn't matter.

Firstly, how does that imply that my memory is failing me? I take offense at these baseless accusations. Why do you have to insult me instead of just sticking to the topic?
Such fallacies are incredibly poor form, and I want them to stop right now.

Secondly, just because you didn't like the game doesn't mean NV3x didn't struggle with the SM2.0 code in that game.
Another fallacy, a non-sequitur in this case.

candle_86 wrote:

And the R300 stuff was only good for real DX9 if you were willing to turn down settings and resolution

'Your memory seems to be failing you'.
In 2002, I had a 1024x768 monitor. LCD screens and HD resolutions had not reached us yet.
Firstly, turning down resolution was no issue, because there was no crappy scaling on CRTs.
Secondly, even my 9600XT could run HL2 at maximum settings with 4xMSAA and 16xAF and reach 30-160 fps throughout the game.

candle_86 wrote:

As for the 9600 Pro, yeah, I had one myself in 2004 and it was OK, but I had to play CS:S at medium 10x7 to get playable FPS on my XP 2500 at the time. And by playable I mean a consistent 60 FPS; my buddy with his 9800 Pro had to run CS:S at 12x10 because at 16x12 it would dip below 60 too often. Same goes for Far Cry for both of those cards, along with, well, every other game.

Again, you are applying 2015 metrics to 2002-era hardware and software.
60 fps wasn't as important back then as it is today, for the simple reason that in those early days of 3d acceleration, most cards or even consoles simply couldn't reach those speeds.

candle_86 wrote:

The DX9 card that lasted the longest wasn't the R300 series at all, it was actually the 6600GT; people were still using them clear into 2009

You realize that there is no way to support this statement whatsoever.
Even so, I'm quite sure we can find people who were still using R300s into 2007+ as well, making them last longer than the much newer 6600GT. As I said myself, I used my 9600XT until I upgraded to a Core2 Duo, which would have been late 2006 I think, only because I needed PCI-e instead of AGP.
I continued using the 9600XT as a secondary system for some years after that (in fact, I still have it).

candle_86 wrote:

but you wouldn't know about that.

Again, enough with the ad-hominems.

candle_86 wrote:

but your facts are very poor.

I have backed up every single thing I said. I can't say the same for the others.
I mostly receive insults and useless garbage as 'info'.

candle_86 wrote:

Let's also not forget SM2.0 wasn't the biggest deal at the time.

Perhaps not for gamers, but I am a developer. I started writing SM2.0 code as soon as I got my hands on an R300-class GPU, and never looked back. Perhaps that makes all the difference... I wrote this lovely R300 code... Then NV30 came around, I tried my R300 code on it... and it was like... "Erm wait... Are they serious!?". It took a lot of rewriting and hacking around the limitations of NV30 to make it render at acceptable quality and performance, whereas the R300 had no such limitations, so you could just write elegant SM2.0 the way it was meant to be written.
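To give an idea of what that meant in practice: a minimal sketch of the kind of per-GPU compile path this led to, assuming the D3D9/D3DX toolchain of the era. The source file and entry-point names are placeholders, and a real NV3x path typically also needed the shader logic itself reworked, not just different compile settings.

#include <windows.h>
#include <d3d9.h>
#include <d3dx9.h>

// Sketch: compile the same HLSL for two targets. R300-class parts are fine
// with plain ps_2_0; NV3x generally needed the ps_2_a profile plus
// partial-precision hints to reach acceptable performance.
LPD3DXBUFFER CompilePixelShader(bool isNV3x)
{
    const char* profile = isNV3x ? "ps_2_a" : "ps_2_0";
    DWORD flags = isNV3x ? D3DXSHADER_PARTIALPRECISION : 0;

    LPD3DXBUFFER code = NULL;
    LPD3DXBUFFER errors = NULL;
    HRESULT hr = D3DXCompileShaderFromFileA(
        "lighting.fx",   // placeholder HLSL source file
        NULL, NULL,      // no macros, no include handler
        "MainPS",        // placeholder entry point
        profile,
        flags,
        &code, &errors,
        NULL);           // constant table not needed here

    if (FAILED(hr))
    {
        if (errors) OutputDebugStringA((const char*)errors->GetBufferPointer());
        return NULL;
    }
    // Caller feeds (const DWORD*)code->GetBufferPointer() to
    // IDirect3DDevice9::CreatePixelShader.
    return code;
}

The ps_2_a profile and the partial-precision flag are the usual knobs for NV3x here; on R300-class hardware they make no difference, since everything runs at 24-bit anyway.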

Last edited by Scali on 2015-07-14, 13:28. Edited 1 time in total.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 70 of 103, by Scali

Rank: l33t
candle_86 wrote:

No, the 8800GTX released Nov 8, 2006; the HD 2900XT May 13, 2007.

Indeed, I'm surprised that anyone does NOT know this.
The 8800 series was THE DX10-hardware. Also the hardware that kicked off the GPGPU revolution by introducing Cuda, which later led to OpenCL and DirectCompute (both of which the original 8800 series support).
They also introduced physics acceleration on GPUs with PhysX.
Note also that they were actually launched even before Vista/DX10.
8800, like the R300, is one of the big milestones in the history of GPUs.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 71 of 103, by candle_86

Rank: l33t
Scali wrote:
Firstly, how does that imply that my memory is failing me? I take offense at these baseless accusations. Why do you have to insu […]
Show full quote

No, I'm not applying 2015 metrics to my early-2000s hardware. I required the best possible FPS for my gaming with the money I had; I played in tournaments and did a lot of LAN parties, and yes, even back in 2002 people wanted 60 FPS, because it makes a difference. If my game ever went to 30 FPS it would get turned down big time; I played COD2 in DX7 mode on a 6800GT because it kept dipping below 60 FPS during firefights. And while turning down resolution wasn't as bad during the CRT days, it still wasn't optimal; as a sniper I wanted higher resolution so I could see more detail on those I'm shooting at to get a better shot.

Reply 72 of 103, by Scali

Rank: l33t
candle_86 wrote:

No, I'm not applying 2015 metrics to my early-2000s hardware. I required the best possible FPS for my gaming with the money I had; I played in tournaments and did a lot of LAN parties, and yes, even back in 2002 people wanted 60 FPS, because it makes a difference. If my game ever went to 30 FPS it would get turned down big time; I played COD2 in DX7 mode on a 6800GT because it kept dipping below 60 FPS during firefights. And while turning down resolution wasn't as bad during the CRT days, it still wasn't optimal; as a sniper I wanted higher resolution so I could see more detail on those I'm shooting at to get a better shot.

So basically you're saying you don't care about graphics quality at all, and just turn everything down to get the highest possible framerates.
Which is pretty much completely beside the point of this whole topic.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 73 of 103, by Gamecollector

Rank: Oldbie

By the way, NWN1 just wants some GL_NV_ extensions. They became obsolete long ago and were replaced by GL_EXT_ or GL_ARB_ versions, but the game still looks for the old names.
It can be fixed by simple aliasing. But ATI/AMD did not write, does not write and will not write good drivers. By design.
Same for the NVidia demos: they are GeForce-only because of their heavy use of GL_NV_ extensions.
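As an illustration of what such aliasing could look like, here is a minimal sketch of a glGetString wrapper. The GL_ARB_texture_rectangle / GL_NV_texture_rectangle pair is used only because the two share identical tokens; it is an assumed example, not necessarily an extension NWN1 actually checks for.

#include <windows.h>   // required before GL/gl.h on Windows
#include <GL/gl.h>
#include <string>

// Sketch of extension-string aliasing in an opengl32 wrapper: when the
// driver exposes the modern ARB extension, also advertise the legacy
// GL_NV_ name that an old game checks for.
const GLubyte* AliasedGetString(GLenum name)
{
    const GLubyte* real = glGetString(name);
    if (name != GL_EXTENSIONS || real == NULL)
        return real;

    static std::string patched;
    patched = reinterpret_cast<const char*>(real);

    // Example alias: identical tokens, so only the string needs patching.
    if (patched.find("GL_ARB_texture_rectangle") != std::string::npos &&
        patched.find("GL_NV_texture_rectangle") == std::string::npos)
    {
        patched += " GL_NV_texture_rectangle";
    }
    return reinterpret_cast<const GLubyte*>(patched.c_str());
}

Extensions that add new entry points would also need their function names aliased through wglGetProcAddress; for a tokens-only case like this, patching the extension string is enough.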

Asus P4P800 SE/Pentium4 3.2E/2 Gb DDR400B,
Radeon HD3850 Agp (Sapphire), Catalyst 14.4 (XpProSp3).
Voodoo2 12 MB SLI, Win2k drivers 1.02.00 (XpProSp3).

Reply 74 of 103, by candle_86

Rank: l33t
Scali wrote:

So basically you're saying you don't care about graphics quality at all, and just turn everything down to get the highest possible framerates.
Which is pretty much completely beside the point of this whole topic.

No, if I can maintain 60 FPS and still get good graphics I will. It's why I bought an XFX 6600GT AGP for 220 when they came out, for CS:S. I could then run 12x10 High and maintain between 60 and 120 FPS at all times. Sadly Far Cry remained unplayable above medium settings until I finally bought my 8800GTS 320.

Reply 75 of 103, by Scali

Rank: l33t
candle_86 wrote:

No, if I can maintain 60 FPS and still get good graphics I will. It's why I bought an XFX 6600GT AGP for 220 when they came out, for CS:S. I could then run 12x10 High and maintain between 60 and 120 FPS at all times. Sadly Far Cry remained unplayable above medium settings until I finally bought my 8800GTS 320.

But then you're missing the point... No card could sustain 60 fps on max settings back in those days, as I said.
The big difference between the R300 and the NV30, however, is that the R300 ran SM2.0 almost as fast as SM1.x or fixed-function code (and internally it ran everything as 24-bit SM2.0 code, as I said, because there are no other pipelines in the GPU), whereas the NV30 was competitive with the R300 with fixed-function or SM1.x, but took a huge hit with SM2.0.
Sure, if you only plan to run fixed-function code on it anyway, you won't notice that hit, but that's rather missing the point of this thread.
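To put the 'special paths' point from earlier in concrete terms: a minimal sketch, assuming plain D3D9 caps queries (the enum and function names here are made up), of how engines of the era picked a rendering path. The caps bits alone cannot tell R300 and NV30 apart, which is exactly why the SM2.0 hit hurt so much in practice.

#include <windows.h>
#include <d3d9.h>

// Sketch: how a DX9-era engine picked its rendering path from the caps.
// The catch: NV3x reports ps_2_0 support just like R300, so a plain caps
// check lands it on the SM2.0 path where it is much slower, which is why
// some titles (HL2, most famously) special-cased FX cards down to a
// DX8-level path despite the caps saying "yes".
enum ShaderPath { PATH_FIXEDFUNC, PATH_SM1X, PATH_SM20 };

ShaderPath PickShaderPath(IDirect3D9* d3d, UINT adapter)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps)))
        return PATH_FIXEDFUNC;

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_SM20;   // R300 and NV30 both end up here on caps alone
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_SM1X;
    return PATH_FIXEDFUNC;
}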

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 76 of 103, by Scali

Rank: l33t
obobskivich wrote:

Also, because PhysX came up (how this relates to NV3x I haven't the foggiest): GeForce 8 did not introduce PhysX-on-GPU; Ageia wasn't even part of nVidia until 2008. PhysX-on-GPU was launched on the GTX 280/260 and 9800GTX/GTX+ (w/driver 177.39), and later extended to include GeForce 8 and other cards (starting with 177.79) later in the year.

The 8800 was marketed as having physics capabilities though, and some demos demonstrated this at launch. And although it didn't have it at launch, nVidia did eventually deliver on their promise with PhysX. So the 8800 is the oldest GPU capable of PhysX.
ATi has also promised GPU physics, going back as far as the X1xxx-series, but so far they have not delivered on this promise.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 79 of 103, by candle_86

Rank: l33t
Scali wrote:
But then you're missing the point... No card could sustain 60 fps on max settings back in those days, as I said. The big differe […]
Show full quote

But you're missing the bigger point: SM2.0 didn't matter for the first gen.