VOGONS


NV3x, R3x0, and pixel shader 2.0


Reply 80 of 103, by Scali

candle_86 wrote:

But you're missing the bigger point: SM 2 didn't matter for the first gen.

I'm not missing the point, you simply don't agree with me.
Then again, as I say, I'm a developer. I wrote SM2.0 code as soon as I got my hands on an R300-card. R300 is pretty much the reason why we have SM2.0 games. Most DX9-engines were developed on these cards. They were the first, and they were the standard for years. For the first 2-3 years, R300 was the only reasonable option for developing DX9-engines/games, until finally the GeForce 6-series offered an alternative. But technically, we were moving into SM3.0 territory by then. So if you want to look at it that way, nVidia never had a decent SM2.0 card at all, and the R300 was the only option.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 81 of 103, by Skyscraper

candle_86 wrote:

But you're missing the bigger point: SM 2 didn't matter for the first gen.

This is just not true; normal people do not buy new computers every year, not even normal gamers. Even back in 2003-2004, people expected their new computers to last 2-3 years before any upgrades were needed.

I used to build computers as a side job; longevity was a main concern when choosing parts, just as important as price.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 82 of 103, by leileilol


"normal people"

In the SM 2.0 era, those "normal people" you refer to would either get something with only Intel Extreme 82815 graphics or so, or, if they were very lucky, a GeForce4 MX, which obviously has no shader support at all. This is why HL2 has a DX7 rendering path: if you expect to have sales, you need to support this majority of "normal people".
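
(For context, this kind of rendering-path selection really is just a caps check against what the driver reports. Below is a minimal sketch against the D3D9 caps, not Valve's actual code, assuming an already-created IDirect3DDevice9.)

#include <d3d9.h>

// Rough illustration of how a DX9-era engine picks its code path from the
// capabilities the driver reports. The enum and function names here are
// made up for the example.
enum RenderPath { PATH_DX7_FIXEDFUNC, PATH_DX8_SM1, PATH_DX9_SM2 };

RenderPath pick_render_path(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_DX9_SM2;        // R300/NV30 class and up
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_DX8_SM1;        // GeForce 3/4 Ti, Radeon 8500/9000
    return PATH_DX7_FIXEDFUNC;      // GeForce4 MX, integrated graphics, etc.
}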

long live PCem

Reply 83 of 103, by candle_86

leileilol wrote:

"normal people"

In the SM 2.0 era, those "normal people" you refer to would either get something with only Intel Extreme 82815 graphics or so, or, if they were very lucky, a GeForce4 MX, which obviously has no shader support at all. This is why HL2 has a DX7 rendering path: if you expect to have sales, you need to support this majority of "normal people".

Thank you. Talking to a local shop owner near me, during this time period the most popular systems he sold had Duron 1200-1600s or Athlon XPs up to about the 2400+, with either onboard VIA video or a GeForce4 MX/GeForce FX 5200/Radeon 9000.

Reply 84 of 103, by candle_86

Skyscraper wrote:
candle_86 wrote:

But you're missing the bigger point: SM 2 didn't matter for the first gen.

This is just not true; normal people do not buy new computers every year, not even normal gamers. Even back in 2003-2004, people expected their new computers to last 2-3 years before any upgrades were needed.

I used to build computers as a side job; longevity was a main concern when choosing parts, just as important as price.

People buying at the bleeding edge will usually upgrade regularly, about once a year for at least their video card. I've watched this for years, on forums and with friends who like the bleeding edge. One is a perfect example.

He bought a brand new FX 5800 Ultra on release, then an FX 5950 Ultra, followed that with a 6800 Ultra, then a 7800 GTX, then an X1900 XTX, followed by an 8800 GTX, a 9800 GX2, a GTX 295, a GTX 580, a GTX 680, then a Titan, then a Titan Black, and just recently a Titan X. That's your bleeding-edge personality right there.

As for the rest of us: considering the 9500/9600 were rather slow in DX9 titles with DX9 features actually enabled, most games ran in SM1 code, and on the SM1 code path the FX 5600/FX 5700 more than handled them. You can see this in most games outside of HL2 dropping down to the SM1 or SM1.1 path, such as Medal of Honor: Pacific Assault, Far Cry, F.E.A.R. and COD 2 (which drops to the DX7 path automatically on these cards).

So yes, they ran games until 2006 or so, but not on the DX9 code path; they were too slow for acceptable performance with SM2 code.

Reply 85 of 103, by Scali


Well, the problem is, R300/NV30 aren't just 'bleeding edge' enthusiast products.
Both chips came in a wide range of products, from low-end to enthusiast.
Sure, there are people who spend thousands of dollars per year on their machines to keep up with the latest technology.
But there are far more people who choose the somewhat lower-end models (e.g. a regular 9700/9800 or 9500/9600 over the Pro/XT models), and who want to use that card for a few years before upgrading again.

I am one of these people. I generally only upgrade hardware for new features (ideally only once for each new major DX version), so I skip a few generations every time.
I think that is far more representative of the average NV3x/R3x0 customer than the enthusiasts who upgrade for every little thing.

As for the rest of us: considering the 9500/9600 were rather slow in DX9 titles with DX9 features actually enabled, most games ran in SM1 code, and on the SM1 code path the FX 5600/FX 5700 more than handled them

And this is the thing I've been having a problem with ever since the discussion started.
This is simply not true!
I can only conclude that you've solely used NV3x hardware in your life, never actually had any R3x0 hardware, and are just projecting your NV3x experiences onto R3x0 hardware.
Because if you had, you would know that R3x0 doesn't really make a difference between SM1.x and SM2.0 code.
After all, as I said, it only has the FP24 pipeline, and all shaders are run through it. So unlike NV3x, there's no performance gain from running SM1.x code.
In fact, it is often slower, because SM1.x has other limitations, requiring multiple render passes where SM2.0 can do it in one.
I have never EVER run SM1.x on my 9600XT when SM2.0 was available. There simply was no point.
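
To make that concrete, here is a minimal sketch of the instruction/texture-limit point (my own hypothetical shader, compiled with the DX9 SDK's D3DXCompileShader helper, not code from any game mentioned here): the same HLSL fits comfortably within the ps_2_0 limits, but blows past ps_1_1's four textures and tiny arithmetic budget, which is exactly what forces the extra render passes on SM1.x hardware.

#include <d3dx9.h>
#include <cstdio>
#include <cstring>

// Hypothetical per-pixel lighting shader: 5 texture reads plus some math.
// Well within the ps_2_0 limits (16 samplers, 64 arithmetic instructions),
// but over the ps_1_1 limits (4 textures, 8 arithmetic instructions).
static const char* g_shader =
    "sampler diffuseMap  : register(s0);\n"
    "sampler normalMap   : register(s1);\n"
    "sampler specularMap : register(s2);\n"
    "sampler detailMap   : register(s3);\n"
    "sampler lightMap    : register(s4);\n"
    "float3 lightDir;\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR\n"
    "{\n"
    "    float3 n    = tex2D(normalMap, uv).xyz * 2 - 1;\n"
    "    float  diff = saturate(dot(n, lightDir));\n"
    "    float4 col  = tex2D(diffuseMap, uv) * tex2D(detailMap, uv) * diff;\n"
    "    col        += tex2D(specularMap, uv) * pow(diff, 8);\n"
    "    return col * tex2D(lightMap, uv);\n"
    "}\n";

// Try to compile the shader for a given profile; returns true on success.
static bool compiles_for(const char* profile)
{
    LPD3DXBUFFER code = NULL, errors = NULL;
    HRESULT hr = D3DXCompileShader(g_shader, (UINT)strlen(g_shader), NULL, NULL,
                                   "main", profile, 0, &code, &errors, NULL);
    if (code)   code->Release();
    if (errors) errors->Release();
    return SUCCEEDED(hr);
}

int main()
{
    printf("ps_2_0: %s\n", compiles_for("ps_2_0") ? "compiles (one pass)" : "fails");
    printf("ps_1_1: %s\n", compiles_for("ps_1_1") ? "compiles" : "fails, so it must be split over multiple passes");
    return 0;
}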

I don't know why you are even in this discussion in the first place. I'm quite sure plenty of actual R3x0-owners can chime in and support what I say about SM1.x and SM2.0 performance on R3x0.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 86 of 103, by candle_86


I've owned many R300-based GPUs before. The fact is, SM1 may not need less than SM2 on R300, but DX8 doesn't support the same quality of effects as DX9, meaning the DX8 code path takes less work even when the render speed is the same.

Reply 87 of 103, by Scali

candle_86 wrote:

I've owned many R300-based GPUs before. The fact is, SM1 may not need less than SM2 on R300, but DX8 doesn't support the same quality of effects as DX9, meaning the DX8 code path takes less work even when the render speed is the same.

That is not necessarily true. E.g. the DX8.1 path in HL2 is lower quality than the DX9 path, but still about as computationally expensive, at least on R300 (go ahead, run benchmarks if you have the hardware).
I also don't really know what you mean by 'takes less work' and 'render speed is the same'. If it takes less work, the render speed should go up, since less work means less time is spent. The difference is mainly in the fact that SM2.0 can do the same things more efficiently, as I already said (more textures available in a single pass, a larger instruction set, a much higher instruction count allowed per shader, etc.).

The fact is also that R300 was the benchmark when developing SM2-games, so the SM2-codepath is tuned to run properly on R300.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 88 of 103, by Skyscraper

leileilol wrote:

"normal people"

In the SM 2.0 era, those "normal people" you refer to would either get something with only Intel Extreme 82815 graphics or so, or, if they were very lucky, a GeForce4 MX, which obviously has no shader support at all. This is why HL2 has a DX7 rendering path: if you expect to have sales, you need to support this majority of "normal people".

I kind of meant normal as in not extreme gamers upgrading every year, but not your average grandparent buying an OEM box either. I think perhaps there are big cultural differences: doing some research, reading reviews and trying to find out which is the better product is actually a common thing to do here, regardless of whether it's a car, a computer or some other expensive product you are planning to buy. In Sweden the Radeon 9500 Pro first, and then the 9600 Pro, were the big sellers in 2003-2004. If some of you think the FX series was a really good buy back in 2003-2004, that's fine with me, but I don't.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 90 of 103, by sliderider

swaaye wrote:

Frankly I think a lot of people bought FX cards because of vendor loyalty and not much else.

Fanboys will always buy their brand no matter how bad it is compared to the competition. GeForce FX and GeForce 6 both suffered when compared to R300 and R400, especially at higher resolutions with AA/AF cranked up to the max. GeForce 6 made major strides over GeForce FX in that area, but in many games it still lagged behind R400 with all settings maxed out.

I jumped from a GeForce4 MX to a Radeon 9500 and was lucky enough to get a card that unlocked to 8 pipes, so I was able to keep using it longer than I would have if it had stayed at 4.

Last edited by sliderider on 2015-07-15, 16:26. Edited 1 time in total.

Reply 91 of 103, by swaaye

sliderider wrote:

GeForce 6 made major strides over GeForce FX in that area, but in many games it still lagged behind R400 with all settings maxed out.

I thought NV40 and R400 were on fairly equal footing. Sure, there were games where one side would win, but there wasn't anything as extreme as with the previous gen.

Reply 92 of 103, by Skyscraper

sliderider wrote:
swaaye wrote:

Frankly I think a lot of people bought FX cards because of vendor loyalty and not much else.

Fanboys will always buy their brand no matter how bad it is compared to the competition. GeForce FX and GeForce 6 both suffered when compared to R300 and R400, especially at higher resolutions with AA/AF cranked up to the max. GeForce 6 made major strides over GeForce FX in that area, but in many games it still lagged behind R400 with all settings maxed out.

I kind of liked the GeForce 6 series and bought a 6800 GT; it was good in the World of Warcraft beta, which was the hot new thing I was playing at the time.

In the Doom 3 timedemo (1024x768, Ultra) the GeForce 6800 Ultra and the Radeon X850 XT PE are dead even.

Last edited by Skyscraper on 2015-07-15, 16:28. Edited 1 time in total.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 93 of 103, by sliderider

swaaye wrote:
sliderider wrote:

GeForce 6 made major strides over GeForce FX in that area, but in many games it still lagged behind R400 with all settings maxed out.

I thought NV40 and R400 were on fairly equal footing. Sure there were games where one side would win but there wasn't anything really extreme.

They can be equal with the settings turned down or on smaller monitors, but R400 was still better with high levels of AA/AF on big monitors.

I have a GeForce 6800 Ultra Extreme and it still lags my X850 XT PE by about 5-10%, even with the factory overclock.

Reply 94 of 103, by Scali


What I recall of the 6800 series was that there were cheap versions that were a full 6800 Ultra chip with half the pipelines disabled. You could enable these pipelines with software, and if you were lucky, many of them worked well enough to use, giving you great bang for the buck.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 95 of 103, by Scali

Skyscraper wrote:

In the Doom 3 timedemo (1024x768, Ultra) the GeForce 6800 Ultra and the Radeon X850 XT PE are dead even.

I have to say, this is a bit of a pro-nVidia situation though.
Firstly because nVidia has always had better OpenGL driver performance than ATi/AMD, whereas the difference is less pronounced in D3D.
Secondly, Doom 3 has nVidia-specific optimizations, such as UltraShadow, which are not available on ATi hardware.
I know of no other games that use the z-scissor in UltraShadow besides the ones based on the Doom 3 engine. The feature is not even exposed through the D3D API.
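
(For the curious: on the OpenGL side this z-scissor is exposed as the EXT_depth_bounds_test extension. Below is a minimal usage sketch, assuming GLEW for extension loading and a renderer that already knows the light's window-space depth range; it is an illustration, not code from Doom 3 itself.)

#include <GL/glew.h>

// Restrict stencil shadow-volume rasterization to the depth range the light
// can actually affect; fragments whose stored depth falls outside
// [lightZMin, lightZMax] are rejected before any stencil update.
// This is the depth-bounds ("z-scissor") half of UltraShadow.
void draw_shadow_volume(float lightZMin, float lightZMax)
{
    if (GLEW_EXT_depth_bounds_test) {
        glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
        glDepthBoundsEXT(lightZMin, lightZMax);   // window-space depth range of the light
    }

    // ... issue the shadow volume geometry into the stencil buffer here ...

    if (GLEW_EXT_depth_bounds_test)
        glDisable(GL_DEPTH_BOUNDS_TEST_EXT);
}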

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 96 of 103, by swaaye


I think Riddick also uses the Ultrashadow capability. And the other idTech4 games of course.

I was digging around for info about it about a month ago. Apparently GCN supports the depth bounds test GL extension.

Reply 97 of 103, by obobskivich

swaaye wrote:

I thought NV40 and R400 were on fairly equal footing. Sure, there were games where one side would win, but there wasn't anything as extreme as with the previous gen.

NV40 and R420, sure - the 6800 Ultra and X800 Pro/XT are pretty comparable. The X800 XT Platinum and X850 are another story altogether though; in some situations the X850 is on more level footing with the early 7800s (SM3.0 support aside). AFAIK the primary gains of the X850s are a direct result of their higher clock speeds.

Here's a review of the XTP from TechReport:
https://techreport.com/review/7679/ati-radeon … raphics-cards/6

Scroll for 1600x1200 and AA/AF.

Repeats in FarCry:
https://techreport.com/review/7679/ati-radeon … raphics-cards/8

Newer review with 7800GT and CrossFire:
https://techreport.com/review/8826/ati-crossf … hics-solution/7

X850XTP (at least mine) is also quieter than 6800 Ultra while under load, and supports transparency and temporal AA modes. Kind of a bummer that my CF-supporting motherboards both seem to be dead. 😢 😵

Reply 98 of 103, by swaaye


Yeah I am aware of how 6800 Ultra and X800/X850 stack up. You can go back a page in that review and see how Doom3 stacks up on R4x0 too. Probably applies for Prey, Quake4, and ETQW as well.

Another thing to think about with R4x0 vs. NV4x is the SM2 vs. SM3 perspective. I played Oblivion on an X850XT and it was missing a number of effects. This was a problem the 6800 didn't have. BioShock actually caused a little uprising when it wouldn't run on Xxxx cards. 😉

Reply 99 of 103, by obobskivich

swaaye wrote:

Yeah I am aware of how 6800 Ultra and X800/X850 stack up.

Another thing to think about with R4x0 vs. NV4x is the SM2 vs. SM3 perspective. I played Oblivion on an X850XT and it was missing a number of effects. This was a problem the 6800 didn't have. BioShock actually caused a little uprising when it wouldn't run on Xxxx cards. 😉

Aye to SM3. It knocks a lot of Unreal Engine 3-based games off the table for the X850 as well. Certainly a card that "lived fast and died young." 🤣 Out of curiosity: does Oblivion actually have SM3.0 effects?