VOGONS


NV3x, R3x0, and pixel shader 2.0


Reply 20 of 103, by Scali

Rank l33t
Putas wrote:
Scali wrote:

Putas was then cherry-picking with the 5900XT.

Just say straight you made a typo, nobody is gonna put you down for that.

I didn't make a typo. I referred to the Anandtech article, which compared the 5900 *Ultra* to R300 cards.
You were the one who brought up 5900XT, of which I do not see the relevance. The conversation was about the 5900 Ultra.

Putas wrote:

Feel free to show when I was ever in denial of NV3x's drawbacks. I will wait until you come to your senses.

You're the one who started this whole argument over HL2, NV3x etc. Which means you're in denial, otherwise you wouldn't have responded with all these excuses for NV3x's poor performance.

Eg:

Putas wrote:
Scali wrote:

SM2.0 was adopted very quickly by games, and NV30 was hurt very hard by this. The most popular example was Half-Life 2, which defaulted to the DX8.1 path on NV30 hardware, while it had no trouble running the full SM2.0 path on R300-based cards.

HL2 is a controversial example exactly because of their treatment of FX cards; forcing the 8.1 path was never fully explained. Remember they had a working path optimized for the FX, shown to the public a month or two before release. Adoption of PS 2.0 was pretty slow; the number of games that would tank on the FX during its lifetime was small.
9700 release: August 2003
FX release: January 2004
HL2 release: November 2004

Denial!

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 21 of 103, by leileilol

Rank l33t++

Even back then, forcing HL2 to use DX9 on FX was........ horrible. First-hand experience here. I had an R300 and never used a new nVidia card since

(also i'm certain the 9700 came in August 2002 and the higher end desperate catch-up GFFXs came in 2003.)

long live PCem

Reply 22 of 103, by Scali

Rank l33t
leileilol wrote:

(also i'm certain the 9700 came in August 2002 and the higher end "catch up" GFFXs in 2003.)

Yup:
http://www.anandtech.com/show/947 <- July 2002
http://www.anandtech.com/show/1062 <-- January 2003

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 23 of 103, by Gamecollector

Rank Oldbie

By the way, can any card w/o FP32 (R300, R400 etc.) be declared fully PS2.0 compatible? IIRC the MS specification wants FP32 for the "full" mode...

Asus P4P800 SE/Pentium4 3.2E/2 Gb DDR400B,
Radeon HD3850 Agp (Sapphire), Catalyst 14.4 (XpProSp3).
Voodoo2 12 MB SLI, Win2k drivers 1.02.00 (XpProSp3).

Reply 24 of 103, by Scali

Rank l33t
Gamecollector wrote:

By the way, can any card w/o FP32 (R300, R400 etc.) be declared fully PS2.0 compatible? IIRC the MS specification wants FP32 for the "full" mode...

No, FP32 is not required: the DX9 spec for SM2.0 only mandates 24-bit floating-point precision, so R300-class cards are fully PS2.0 compliant.
The requirement was raised to 32-bit for SM3.0.
See here:
https://msdn.microsoft.com/en-us/librar ... s.85).aspx
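
To make that concrete: which shader model a card exposes is just a caps check in D3D9, and the minimum precision then follows from the spec for that version rather than from a separate caps bit. A minimal C++ sketch (the function name is made up, purely for illustration):

// Sketch only: ps_2_0 mandates at least FP24 internally (FP16 is allowed
// where the shader uses partial-precision hints), ps_3_0 mandates FP32.
#include <d3d9.h>

bool SupportsPixelShaderVersion(IDirect3D9* d3d, DWORD requiredVersion)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    return caps.PixelShaderVersion >= requiredVersion;
}

// Usage:
//   SupportsPixelShaderVersion(d3d, D3DPS_VERSION(2, 0));  // R300/NV30 class
//   SupportsPixelShaderVersion(d3d, D3DPS_VERSION(3, 0));  // SM3.0, full FP32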

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 25 of 103, by swaaye

Rank l33t++

I recall the 5900XT being perceived as a pretty solid value at the time, probably the best for the whole FX lineup. It's just a 5900 Ultra with lower clock speeds.

The games of the time were still mostly a mix of D3D7/8, and the FX cards were fine with these. It wasn't overly apparent for a while that the FX cards were really terrible at PS2.0. Some developers did tailor their games for the FX cards too. Far Cry and Doom3 ran pretty well.

If you liked Bioware OpenGL games, you didn't really want to be on ATI back then. That's another thing to consider when it comes to perceptions here. Getting KOTOR or NWN working perfectly on ATI was a trick.

Reply 26 of 103, by Scali

Rank l33t
swaaye wrote:

I recall the 5900XT being perceived as a pretty solid value at the time, probably the best for the whole FX lineup. It's just a 5900 Ultra with lower clock speeds.

Perhaps, but it was partly nVidia trying to save what they could, and partly because their drivers were full of cheats by then, so most games with performance issues were 'fixed'.

swaaye wrote:

The games of the time were still mostly a mix of D3D7/8, and the FX cards were fine with these. It wasn't overly apparent for a while that the FX cards were really terrible at PS2.0. Some developers did tailor their games for the FX cards too. Far Cry and Doom3 ran pretty well.

Far Cry started out as an nVidia tech demo, so obviously it ran fine on nVidia hardware 😀
https://en.wikipedia.org/wiki/Far_Cry_(video_game)

Crytek developed a new game engine called CryEngine for Far Cry. Reportedly, the game was born out of a technology demo called X-Isle: Dinosaur Island made by Crytek to showcase the capabilities of the Nvidia GeForce 3.

Apparently it's still online: http://www.nvidia.com/object/dinoisle.html
Doom3 had a special path for NV30 hardware, and also used the UltraShadow extensions, for which ATi had no alternative.
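
For reference, UltraShadow is exposed in OpenGL through the EXT_depth_bounds_test extension. A rough C++ sketch of how a renderer could use it to clip shadow-volume rasterization to the depth range a light actually touches (assumes a Windows/WGL context; the function names and zMin/zMax parameters are made up for illustration, this is not Doom3's code):

#include <windows.h>
#include <GL/gl.h>
#include <cstring>

#ifndef GL_DEPTH_BOUNDS_TEST_EXT
#define GL_DEPTH_BOUNDS_TEST_EXT 0x8890
#endif

typedef void (APIENTRY *PFNGLDEPTHBOUNDSEXTPROC)(GLclampd zmin, GLclampd zmax);
static PFNGLDEPTHBOUNDSEXTPROC pglDepthBoundsEXT = nullptr;

bool InitDepthBounds()
{
    // Only drivers that support the feature advertise this extension;
    // on hardware without it (R300 etc.) the optimization is simply skipped.
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    if (!ext || !std::strstr(ext, "GL_EXT_depth_bounds_test"))
        return false;
    pglDepthBoundsEXT = (PFNGLDEPTHBOUNDSEXTPROC)wglGetProcAddress("glDepthBoundsEXT");
    return pglDepthBoundsEXT != nullptr;
}

void DrawShadowVolumes(double zMin, double zMax)  // light's depth extents in window space
{
    if (pglDepthBoundsEXT) {
        glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
        pglDepthBoundsEXT(zMin, zMax);   // discard fragments outside the light's depth range
    }
    // ... rasterize the shadow volumes into the stencil buffer here ...
    if (pglDepthBoundsEXT)
        glDisable(GL_DEPTH_BOUNDS_TEST_EXT);
}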

The first software that showed problems with NV30, as I said, was 3DMark03. People were shocked, in disbelief, to say the least. They just couldn't understand why NV30 was that much slower, and accused FutureMark of being bribed by ATi and whatnot.
But in actuality they just showed NV30 for what it is: a card that is good at running DX7/8 stuff, and terrible at DX9.
I've linked to nVidia's cheating antics earlier. I think that is possibly the worst case of cheating in any benchmark ever. It shows just how desperate nVidia was at that point.

Last edited by Scali on 2015-07-13, 20:26. Edited 2 times in total.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 27 of 103, by F2bnp

Rank l33t
Scali wrote:
Putas wrote:
Scali wrote:

Putas was then cherry-picking with the 5900XT.

Just say straight you made a typo, nobody is gonna put you down for that.

I didn't make a typo. I referred to the Anandtech article, which compared the 5900 *Ultra* to R300 cards.
You were the one who brought up 5900XT, of which I do not see the relevance. The conversation was about the 5900 Ultra.

To be fair, you are the one who brought up the 5900XT getting its ass kicked by a 9600 Pro and how the 9600 Pro was a much cheaper card. In reality, the 5900XT wasn't anywhere near the Ultra in terms of pricing.

The problem here is that not one of us is disagreeing that the NV30/NV35 cards were dogs when it came to DX9. We're just telling you that it's kind of irrelevant in the end, since the R300 cards weren't exactly top performers anyway and by the time DX9 became relevant, the NV40 and X cards were available for quite some time and the next series were already underway (Nvidia 7 series and ATi X1xxx series). We can sit here all day making fun of the NV30, and I honestly find it quite hilarious how Nvidia took such a huge hit from ATi back then and later tried to fool people with all those driver cheats in 3DMark.

Like I said, no one is disagreeing with the technical side of things. We're just saying that HL2 is not enough; it's an extreme case of NV30 sucking hard. I'm not saying it doesn't matter; it's actually a great example of what you're saying. I don't care much about 3DMark, although it's important to note that Nvidia was cheating in order to change public perception of their cards. I should take the blame for derailing the initial thread, as this all started with my post about DX12 and how irrelevant certain features might be at first. I compared it to the DX9 days, because I felt that in order to enjoy DX9 games you had to move away from NV30 and R300. At the end of the day what matters is how most games performed at the time, which is why I brought up the argument to begin with. AMD may be missing features on their current cards, but it probably won't make much of a difference, as by the time these are relevant, 900 series cards may also be unimportant. This is just a hypothesis; things have changed dramatically over the years, GPU development has slowed down, and I know of a lot of people still running their HD5770 cards and playing most games enjoyably. On a mid-range six-year-old card, no less! So yes, my argument may be utterly false, we'll see 🤣 .

Scali, chill out. This is just a misunderstanding, there's no reason to get so worked up over these things anyway 😊 .

Reply 28 of 103, by swaaye

Rank l33t++
Scali wrote:

Doom3 had a special path for NV30 hardware, and also used the UltraShadow extensions, for which ATi had no alternative.

The first software that showed problems with NV30, as I said, was 3DMark03. People were shocked, in disbelief, to say the least. They just couldn't understand why NV30 was that much slower, and accused FutureMark of being bribed by ATi and whatnot.
But in actuality they just showed NV30 for what it is: a card that is good at running DX7/8 stuff, and terrible at DX9.
I've linked to nVidia's cheating antics earlier. I think that is possibly the worst case of cheating in any benchmark ever. It shows just how desperate nVidia was at that point.

There was initially a Doom3 NV30 path but it uses the ARB2 path in the final release. It's probably been tweaked by NV in the drivers. I remember John Carmack initially said NV30 was looking too slow for ARB2 but then later he said NV had been able to improve performance.
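
The path selection in engines of that era basically came down to which GL extensions the driver advertised; roughly along these lines (a sketch of the general approach, with made-up names, not id's actual code):

#include <GL/gl.h>
#include <cstring>

enum RenderPath { PATH_FIXED, PATH_NV_COMBINERS, PATH_R200, PATH_ARB2 };

static bool HasExtension(const char* extensions, const char* name)
{
    return extensions && std::strstr(extensions, name) != nullptr;
}

RenderPath SelectRenderPath()
{
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);

    // Vendor-neutral vertex/fragment programs: R300, and NV30 after driver tuning.
    if (HasExtension(ext, "GL_ARB_vertex_program") &&
        HasExtension(ext, "GL_ARB_fragment_program"))
        return PATH_ARB2;
    if (HasExtension(ext, "GL_ATI_fragment_shader"))   // Radeon 8500/9000 class
        return PATH_R200;
    if (HasExtension(ext, "GL_NV_register_combiners")) // GeForce 1-4 class
        return PATH_NV_COMBINERS;
    return PATH_FIXED;                                 // plain fixed-function fallback
}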

Yeah the 3DMark03 scandal was ugly and fun to watch.

Far Cry has lots of tech in it and was patched a bunch of times too. I think it has NV30 and R4x0 optimizations. SM3.0 HDR was added. But with NV3x I think it mostly runs PS1.4. The console log reads it out.

Reply 29 of 103, by Scali

Rank l33t
F2bnp wrote:

To be fair, you are the one that brought up the 5900XT getting its ass kicked by an 9600 Pro and how the 9600 Pro was a much cheaper card.

Where exactly did I do that? Because I don't see me saying anything about '5900XT' before Putas brought it up.

F2bnp wrote:

The problem here is that not one of us is disagreeing that the NV30/NV35 cards were dogs when it came to DX9. We're just telling you that it's kind of irrelevant in the end, since the R300 cards weren't exactly top performers anyway and by the time DX9 became relevant, the NV40 and X cards were available for quite some time and the next series were already underway (Nvidia 7 series and ATi X1xxx series).

That's funny, because many of the X cards were just PCI-e variations of R(V)3x0 chips.
ATi just built on the architecture that started with the 9700 for years, and it was very successful.
And I also disagree. 9700Pro remained relevant for quite a while, because it was such an excellent and efficient card. It was the first card to be able to do MSAA+AF efficiently.
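
(In D3D9 terms, 'MSAA+AF' is just a multisampled backbuffer plus anisotropic sampler states; a minimal sketch of what the application side sets, with made-up function names, purely as an illustration:)

#include <d3d9.h>

// MSAA is requested at device-creation time through the present parameters
// (ideally after checking IDirect3D9::CheckDeviceMultiSampleType).
void RequestMsaa(D3DPRESENT_PARAMETERS& pp)
{
    pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
    pp.MultiSampleQuality = 0;
}

// Anisotropic filtering is a per-sampler state once the device exists.
void EnableMsaaAndAf(IDirect3DDevice9* dev)
{
    dev->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, TRUE);
    dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    dev->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);
}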

F2bnp wrote:

Scali, chill out. This is just a misunderstanding, there's no reason to get so worked up over these things anyway 😊 .

This is *not* just a misunderstanding. I am being accused of being an 'nVidia shill'. I do not take these accusations lightly.
It's bad enough that I have to hear that nonsense on regular tech forums from clueless rabid AMD fanboys who can't accept that AMD dropped the ball with DX12. But I would expect something different here, on a vintage/retro forum, where people are supposed to be somewhat more tech-savvy in general, and interested in technology rather than in what brand name is on the box.
I simply have 0 tolerance for these people, and I will make sure that the message comes through.

I mean, come on... You guys know me, right? I code 8088+CGA, Amiga, C64, PowerVR, GP2X etc as a hobby. Do you REALLY think I would care about Intel vs AMD vs nVidia or whatever on anything but the *technical* level? I live and breathe graphics technology. I've been around for too long, and used too many brands of hardware over the years to have any single brand to be a fan of.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 30 of 103, by swaaye

Rank l33t++

F2BNP's calling R300 and NV30 irrelevant is probably coming from the fact that we would rather use a much faster card for such games now. I ran a 9700 from 2003-2006, but if I want to play those games now, I don't look for a 9700 or 5900 at this point. Unless I am just looking to experiment with old hardware. That's what I think the point is there.

I don't think he thinks R300 was irrelevant at the time, but in the long term does it really matter anymore? There were how many D3D9 cards? Did they really ever stop coming? Are we really away from D3D9 today?! Heh

Reply 31 of 103, by Putas

Rank Oldbie
Scali wrote:

You were the one who brought up 5900XT, of which I do not see the relevance.

People can read, you know.

Scali wrote:
Putas wrote:

Feel free to show when I was ever in denial of NV3x's drawbacks. I will wait until you come to your senses.

You're the one who started this whole argument over HL2, NV3x etc. Which means you're in denial, otherwise you wouldn't have responded with all these excuses for NV3x's poor performance.

Eg:

Putas wrote:
Scali wrote:

SM2.0 was adopted very quickly by games, and NV30 was hurt very hard by this. The most popular example was Half-Life 2, which defaulted to the DX8.1 path on NV30 hardware, while it had no trouble running the full SM2.0 path on R300-based cards.

HL2 is a controversial example exactly because of their treatment of FX cards; forcing the 8.1 path was never fully explained. Remember they had a working path optimized for the FX, shown to the public a month or two before release. Adoption of PS 2.0 was pretty slow; the number of games that would tank on the FX during its lifetime was small.
9700 release: August 2003
FX release: January 2004
HL2 release: November 2004

Denial!

I am in denial of PS2.0 being important in the lifetime of GeForce FX, yes. How you can interpret this as an excuse for NV3x's poor performance without trolling intent is beyond me.

leileilol wrote:

(also i'm certain the 9700 came in August 2002 and the higher end "catch up" GFFXs in 2003.)

Ouch, my bad. So it actually took two years until the pixel shading disparity became that dramatic in a high-profile game.

Reply 32 of 103, by Scali

Rank l33t
Putas wrote:
Scali wrote:

You were the one who brought up 5900XT, of which I do not see the relevance.

People can read, you know.

Apparently people cannot produce a link to whatever it is you are claiming though.
I didn't name any specific card, I just referred to 'NV30', and linked to Anandtech.
You were the one to name 5900XT specifically, to which I responded that Anandtech tested the 5900Ultra.

Putas wrote:

Denial of SM2.0 being important in the lifetime of GeForce FX. How you can interpret this as an excuse for NV3x's poor performance without trolling intent is beyond me.

Very simple. You are downplaying the big weakness of GeForce FX.
Also, I believe Tomb Raider: Angel of Darkness was the first big SM2.0 game released, which was in 2003: https://en.wikipedia.org/wiki/Tomb_Raider:_Th … gel_of_Darkness
That puts it slap-bang into the lifetime of GeForce FX.
And look what happened when people started using it as a benchmark: http://forums.anandtech.com/showthread.php?t=1152044

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 33 of 103, by swaaye

Rank l33t++

Scali, I think you feel more strongly about pointing out how awful NV3x is than the rest of us really care about it anymore. That might be the real issue here. 😀 We all know about how awful it is for PS2.0 games.

On the 5900XT aspect, I think there was a point in the thread where you seemed to relate 5900XT to the high-end (like 9800XT). I thought it was just a little mix up.

Reply 34 of 103, by leileilol

Rank l33t++

I'm more surprised FX delusionists were still a thing in 2015. Reminds me a bit of the Dreamcast scene and the whole belief that the PVR250 is some sort of superchip that never got "properly" used.

long live PCem

Reply 35 of 103, by Scali

Rank l33t
swaaye wrote:

Scali, I think you feel more strongly about pointing out how awful NV3x is than the rest of us really care about it anymore. That might be the real issue here. 😀

I don't, I just keep getting attacked by this Putas-clown, spreading his misinformation and insulting me.

swaaye wrote:

On the 5900XT aspect, I think there was a point in the thread where you seemed to relate 5900XT to the high-end (like 9800XT). I thought it was just a little mix up.

Well whatever, I didn't see it. As I said, I linked to Anandtech and didn't mention 5900 specifically, just 'NV30', and Putas replied to that post with the 5900XT. So in no way was I the one who was comparing the 5900XT to R300.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 36 of 103, by F2bnp

Rank l33t
Scali wrote:

Where exactly did I do that? Because I don't see me saying anything about '5900XT' before Putas brought it up.

Excuse me, but you're in the wrong here.

Scali wrote:
Putas wrote:

No, they had a PS 2.0 path utilizing half precision and dropped it. I found it very suspicious; it being slower than ATI's path is not a good enough justification.

Did you look at the charts? It's not just 'slower', it's about half the speed of ATi cards. A GeForce 5900XT gets beaten by a Radeon 9600Pro. It was a massacre. No customer would accept their expensive 5900XT being beaten by a cheap 9600Pro. But that's just how badly NV3x sucked. nVidia did a great job of covering it up in games, but as a developer I know how bad the card REALLY is.

expensive 5900XT, cheap 9600 Pro

Scali wrote:
Putas wrote:

I remember it all too well, and the 5900 XT was not expensive at all.

Wow, just wow...
http://www.anandtech.com/show/1144


Our last set of GPU reviews were focused on two cards - ATI's Radeon 9800 Pro (256MB) and NVIDIA's GeForce FX 5900 Ultra, both of which carried a hefty $499 price tag.
...
In our first test, we see that ATI holds an incredible lead over NVIDIA, with the Radeon 9800 Pro outscoring the GeForce FX 5900 Ultra by almost 70%. The Radeon 9600 Pro manages to come within 4% of NVIDIA's flagship, not bad for a ~$100 card.

At 1280x1024, we're shading more pixels and thus the performance difference increases even further, with the 5900 Ultra being outperformed by 73% this time around.
...
The Radeon 9600 Pro manages to offer extremely good bang for your buck, slightly outperforming the 5900 Ultra.

The performance gap grows to be a massive 61% advantage for the Radeon 9800 Pro over the GeForce FX 5900 Ultra at 1280x1024.

I will say that I feel obobskivich's claim was rather uncalled for. I would never call you that, it's completely uncalled for in this particular forum and thread, plus I think you're an awesome guy and really love some of your projects 😀.

swaaye wrote:

F2BNP's calling R300 and NV30 irrelevant is probably coming from the fact that we would rather use a much faster card for such games now. I ran a 9700 from 2003-2006, but if I want to play those games now, I don't look for a 9700 or 5900 at this point. Unless I am just looking to experiment with old hardware. That's what I think the point is there.

I don't think he thinks R300 was irrelevant at the time, but in the long term does it really matter anymore? There were how many D3D9 cards? Did they really ever stop coming? Are we really away from D3D9 today?! Heh


Reply 37 of 103, by Scali

Rank l33t
leileilol wrote:

I'm more surprised FX delusionists were still a thing in 2015.

Indeed, revisionist history much?
Also, the HL2 thing is another revisionist thing. HL2 itself was already demoed in May 2003 at E3. Leaks and playable demos were around long before it was officially released (which was delayed because of the leak).
So the poor performance of FX series in HL2 was known well before the release date of HL2.
I don't know who this Putas-clown is, but I actually witnessed that era of GPUs first-hand. Also, the lead programmer at FutureMark at the time of the 3DMark03-cheating was a fellow demoscener, and he shared quite a bit of information with me on how 3DMark03 worked internally, and what nVidia was doing.
In fact, some of my own DX8/9 code from that era was based on these 3DMark03 algorithms, most notably the shadow volume algorithms, which ran entirely on the GPU, and were far more efficient than the Doom3 ones.
That was the point where Carmack fell off his pedestal for me.
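
For those who never looked at that technique: the mesh is preprocessed so every edge carries a degenerate quad, and the vertex shader then decides per vertex whether to leave it in place or extrude it away from the light, so the degenerate quads on silhouette edges stretch into the sides of the volume. A simplified C++-style sketch of that general idea (not the actual 3DMark03 source; names are made up):

#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec4 { float x, y, z, w; };

static Vec3  sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  normalize(Vec3 v)   { float l = std::sqrt(dot(v, v)); return { v.x / l, v.y / l, v.z / l }; }

// What the vertex shader computes per vertex: faces pointing away from the
// light are pushed to infinity (homogeneous w = 0), faces towards the light
// stay put, so no CPU silhouette extraction is needed.
Vec4 ExtrudeVertex(Vec3 pos, Vec3 faceNormal, Vec3 lightPos)
{
    Vec3 toLight = normalize(sub(lightPos, pos));
    if (dot(faceNormal, toLight) < 0.0f)
        return { -toLight.x, -toLight.y, -toLight.z, 0.0f };  // extrude to infinity
    return { pos.x, pos.y, pos.z, 1.0f };                     // keep in place
}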

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 38 of 103, by Scali

Rank l33t
F2bnp wrote:
Scali wrote:

Where exactly did I do that? Because I don't see me saying anything about '5900XT' before Putas brought it up.

Excuse me, but you're in the wrong here.

No, see the bolded section. The post you quote is *below* the post where Putas brings up the 5900XT.

F2bnp wrote:

expensive 5900XT, cheap 9600 Pro

5900XT sold for about twice as much as the 9600Pro (~$200 vs ~$100).
While not as horribly overpriced as the 5900Ultra, it's still far from a good deal.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 39 of 103, by swaaye

Rank l33t++

5900XT runs various D3D7/8 games considerably faster than a 9600XT. That's why people thought it was a good deal. It's primarily what the reviews were covering at the time.

The childish name calling had better come to an abrupt end, by the way.