VOGONS


Geforce2 shading rasterizer info


Reply 20 of 36, by swaaye

User metadata
Rank l33t++

Apparently the game Evolva uses NSR for some effects.
http://www.nvidia.com/object/evolva.html
http://www.tomshardware.com/reviews/tom,184-11.html

A list of games that do something with it would be great.

Though NV didn't have EMBM support until the GeForce 3, which put them behind ATI and even Matrox for years. On the other hand, I've read that NSR is more useful than ATI's Pixel Tapestry. This is partly why Doom3 didn't support first-generation Radeons.

Reply 21 of 36, by Scali

User metadata
Rank l33t

NSR is just a marketing term though; it's the name they gave their pixel pipeline.
Technically all games use it, although not all of them may use the new effects introduced with the GeForce series.

swaaye wrote:

Though NV didn't have EMBM support until the GeForce 3, which put them behind ATI and even Matrox for years.

EMBM is just a simple 2D texturing trick though, whereas dot3 is 'proper' per-pixel lighting.
What EMBM does is use two textures:
1) A perturbation map
2) A 2D environment map (spherical texture)

Instead of just using the calculated texture coordinates for the environment map directly, EMBM will do a lookup in the perturbation map at every pixel, and add those values to the texture coordinates before doing the envmap lookup.
The effect will sorta look like per-pixel bumpmapping, but it is obviously not a mathematical lighting model. It's purely image-based, and these spherical maps have various limitations and are usually not updated in realtime with accurate lighting info, so they're just static.
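
To make that data flow concrete, here is a rough software sketch of the idea in plain C++ (the Texture/Texel helpers are hypothetical illustrations, not any real API, and this is not how the hardware actually wires it up):

#include <vector>

struct Texel { float r, g, b; };

// Hypothetical helper: point-sample a texture with wrap-around addressing.
struct Texture {
    int width, height;
    std::vector<Texel> data;
    Texel sample(float u, float v) const {
        int x = ((int)(u * width))  % width;  if (x < 0) x += width;
        int y = ((int)(v * height)) % height; if (y < 0) y += height;
        return data[y * width + x];
    }
};

// Per-pixel EMBM: read (du, dv) from the perturbation map, add them to the
// interpolated env-map coordinates, then do the dependent env-map lookup.
Texel embm_pixel(const Texture& bumpMap, const Texture& envMap,
                 float bumpU, float bumpV,  // coords into the perturbation map
                 float envU,  float envV,   // interpolated env-map coords
                 float scale)               // bump strength (roughly what D3D's BUMPENVMAT matrix controls)
{
    Texel p = bumpMap.sample(bumpU, bumpV);     // offsets stored in r and g
    float du = (p.r - 0.5f) * scale;            // assume offsets stored biased around 0.5
    float dv = (p.g - 0.5f) * scale;
    return envMap.sample(envU + du, envV + dv); // coords now depend on a texture read
}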

Dot3, on the other hand, is just what it says: it performs a 3D dot product per pixel.
And that is *real* per-pixel lighting: you feed it two 3D vectors as input (which can come either from interpolated vertex data, such as the colour channels or texture coordinates, or from a normalmap texture), and it will calculate the dot product between them.
Dot products are obviously a big part of the usual lighting models, such as Lambert and Blinn-Phong.
So by combining dot3 with the proper inputs and some extra attenuation factors (possibly over multiple passes), you can do very advanced per-pixel lighting with a true mathematical lighting model, with multiple dynamic lightsources.
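
As a rough illustration of that math, here's a plain C++ sketch of a Lambert term fed by a normalmap sample (just the arithmetic, not how the register combiners are actually programmed):

#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

static float dot3(Vec3 a, Vec3 b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// One pixel: N comes from a normalmap texel stored biased into 0..1, so it is
// expanded to -1..1 first; L is the interpolated light vector, e.g. passed in
// through a colour channel. The result is the Lambert N.L term, modulated by
// an attenuation factor.
static float lambert_pixel(Vec3 normalTexel, Vec3 lightVec, float attenuation)
{
    Vec3 N = normalize({ normalTexel.x * 2.0f - 1.0f,
                         normalTexel.y * 2.0f - 1.0f,
                         normalTexel.z * 2.0f - 1.0f });
    Vec3 L = normalize(lightVec);
    float ndotl = dot3(N, L);
    if (ndotl < 0.0f) ndotl = 0.0f;  // clamp negative values, as the hardware would
    return ndotl * attenuation;
}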

So dot3 is the 'real thing', whereas EMBM is 'kid stuff'. It was not nVidia who was behind.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 23 of 36, by Scali

User metadata
Rank l33t
386SX wrote:

But EMBM was already supported back in 1999, wasn't it? For games of that era, EMBM could have made some difference (just like S3TC support before it was generally supported).

Yes, but there were only a handful of such games at best. The limitations of EMBM didn't make it very practical in actual games, and if I'm not mistaken, the dependent texture reads required for the effect to work weren't very efficient in the first generation of hardware, so it didn't perform very well.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 24 of 36, by 386SX

User metadata
Rank l33t
Scali wrote:
386SX wrote:

But EMBM was already supported back in 1999, wasn't it? For games of that era, EMBM could have made some difference (just like S3TC support before it was generally supported).

Yes, but there were only a handful of such games at best. The limitations of EMBM didn't make it very practical in actual games, and if I'm not mistaken, the dependent texture reads required for the effect to work weren't very efficient in the first generation of hardware, so it didn't perform very well.

I remember one game, though not its name, that supported it early on, and you could see it in all the reviews (that water scene was quite impressive, by the way...).
I remember that at the time we were happy just to see the improvements (?) in rendering quality from a Voodoo2 to a Voodoo3, so anything like EMBM or S3TC was impressive anyway.. 😁

Reply 25 of 36, by F2bnp

User metadata
Rank l33t

You must be thinking of Expendable by Rage Software. Fascinating stuff, I had no idea the GeForce2 could do anything of the sort. For years I've considered it a minor upgrade to/rework of the original design.

Reply 26 of 36, by Scali

User metadata
Rank l33t
F2bnp wrote:

Fascinating stuff, I had no idea the GeForce2 could do anything of the sort. For years I've considered it a minor upgrade to/rework of the original design.

It is. The GeForce 256 also has the 'NSR', and has basically the same featureset (if you have a GF256, please run DXCapsViewer from the DirectX SDK, and post some screenshots. It should show dot3 and the shadowmapping features).
Biggest problem with the 256 was that its performance was somewhat underwhelming. GF2 came with a big performance boost (die-shrink, more pipelines, faster memory). Also, software was more mature, so you started to see more of its capabilities in actual games now.
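
For reference, a minimal sketch of checking the same thing programmatically through the D3D9 caps (this assumes a DX9 runtime and linking against d3d9.lib; on a period-correct GF256 setup you'd query the equivalent DX7 caps instead):

#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("D3D9 not available\n"); return 1; }

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // The dot3 texture op is exposed as a TextureOpCaps bit.
        std::printf("DOTPRODUCT3 texture op: %s\n",
            (caps.TextureOpCaps & D3DTEXOPCAPS_DOTPRODUCT3) ? "yes" : "no");
        std::printf("Max simultaneous textures: %u\n",
            (unsigned)caps.MaxSimultaneousTextures);
    }
    d3d->Release();
    return 0;
}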

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 27 of 36, by F2bnp

User metadata
Rank l33t

Woah, nice. GeForce2 was a nice boost indeed, from a time when die shrinks were really easy. Nowadays we've been stuck at 28nm for almost 5 years. 😒

Did ATi only catch up from the 8500 onwards? Or did the first Radeon cards have their own proprietary "NSR"-like features as well?

Reply 28 of 36, by 386SX

User metadata
Rank l33t
Scali wrote:
F2bnp wrote:

Fascinating stuff, I had no idea the GeForce2 could do anything of the sort. For years I've considered it a minor upgrade to/rework of the original design.

It is. The GeForce 256 also has the 'NSR', and has basically the same featureset (if you have a GF256, please run DXCapsViewer from the DirectX SDK, and post some screenshots. It should show dot3 and the shadowmapping features).
Biggest problem with the 256 was that its performance was somewhat underwhelming. GF2 came with a big performance boost (die-shrink, more pipelines, faster memory). Also, software was more mature, so you started to see more of its capabilities in actual games now.

In terms of real-world performance I was more impressed by just the DDR version of the GeForce. But the original Radeon probably also deserved more attention for its capabilities.


Reply 30 of 36, by Scali

User metadata
Rank l33t
F2bnp wrote:

Did ATi only catch up from the 8500 onwards? Or did the first Radeon cards have their own proprietary "NSR"-like features as well?

Yes, they had "Pixel Tapestry": http://www.anandtech.com/show/536/6
Back in the day, there was a rumour that this was actually the first 'pixelshader' hardware, and that ps1.0 would have been for this card... but for some reason ps1.0 never actually surfaced, and things started with ps1.1 (and vs1.1) on the GeForce3.

The Radeon 8500 was actually significantly more advanced than the GeForce3 or even the GeForce4. Its ps1.4 was a huge leap up from ps1.3 and lower. It was very close to ps2.0 in capabilities, just at reduced precision (although still more precision than ps1.3 and lower).
The biggest problem however was that the GeForce4 delivered a huge amount of raw performance, and ATi could not come up with a decent competitor in terms of performance until they released the revolutionary Radeon 9700.
I think the Radeon 8500 and 9700 were perhaps ATi's finest moments. The original Radeon was a good step up to nVidia, and was probably mainly plagued by driver issues. They didn't really get their drivers under control until late in the Radeon 8500's life, when they released the Catalyst line of drivers.

Edit:
I think this is the original whitepaper for Pixel Tapestry: http://cgg.mff.cuni.cz/~semancik/research/res … paperradeon.pdf

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 32 of 36, by swaaye

User metadata
Rank l33t++

There are quite a few games with EMBM.
www.vogons.org/viewtopic.php?f=46&t=40754#p380117

Scali wrote:

Biggest problem with the 256 was that its performance was somewhat underwhelming. GF2 came with a big performance boost (die-shrink, more pipelines, faster memory). Also, software was more mature, so you started to see more of its capabilities in actual games now.

I saw some possibly informed comments on Beyond3D that NV10 had twin TMUs per pipeline, but there was a flaw that forced NV to run them as a single unit with "free" trilinear.

Reply 33 of 36, by swaaye

User metadata
Rank l33t++
Scali wrote:

The Radeon 8500 was actually significantly more advanced than the GeForce3 or even the GeForce4. Its ps1.4 was a huge leap up from ps1.3 and lower. It was very close to ps2.0 in capabilities, just at reduced precision (although still more precision than ps1.3 and lower).
The biggest problem however was that the GeForce4 delivered a huge amount of raw performance, and ATi could not come up with a decent competitor in terms of performance until they released the revolutionary Radeon 9700.
I think the Radeon 8500 and 9700 were perhaps ATi's finest moments. The original Radeon was a good step up to nVidia, and was probably mainly plagued by driver issues. They didn't really get their drivers under control until late in the Radeon 8500's life, when they released the Catalyst line of drivers.

Yeah, the 8500 was interesting. I ran one during its time. It was a real shame about the drivers. But I get the impression there was more to the 8500's problems than just drivers.

The most insightful thing I've read is John Carmack's Doom3 .plan commentary. Lots of driver bugs, but he also mentions that he was disappointed with the gains when trying to take advantage of its greater flexibility.
http://www.bluesnews.com/cgi-bin/finger.pl?id … =20020627230700

Reply 34 of 36, by Scali

User metadata
Rank l33t
swaaye wrote:

Yeah, the 8500 was interesting. I ran one during its time. It was a real shame about the drivers. But I get the impression there was more to the 8500's problems than just drivers.

Yea, I got a Radeon 8500 over a GF3 at the time. Was slightly sad when the GF4 came out and showed a huge leap in performance... But as a developer, ps1.4 was much cooler than ps1.3.
I still have mine, in a Pentium II 350 system. A few years ago, I actually used it to 'harden' the Direct3D engine I used for my company. While it was only meant for DX9+/SM2.0+ hardware, I revitalized the old SM1.x and fixed-function code, and also caught a few bugs in the process.
So I can still use my current engine to develop stuff for the Radeon 8500 now.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 35 of 36, by swaaye

User metadata
Rank l33t++
Scali wrote:

Yea, I got a Radeon 8500 over a GF3 at the time. Was slightly sad when the GF4 came out and showed a huge leap in performance... But as a developer, ps1.4 was much cooler than ps1.3.
I still have mine, in a Pentium II 350 system. A few years ago, I actually used it to 'harden' the Direct3D engine I used for my company. While it was only meant for DX9+/SM2.0+ hardware, I revitalized the old SM1.x and fixed-function code, and also caught a few bugs in the process.
So I can still use my current engine to develop stuff for the Radeon 8500 now.

I think my favorite thing about the 8500 was its ability to run 16x anisotropic filtering with a barely noticeable performance hit. You couldn't do that with the GF3/GF4. Though by today's standards, or even by the Radeon 9700's standards, the 8500's AF is terrible.

The 8500's anti-aliasing is particularly curious. Lots of strange mystery modes like 3x and 5x, all with a performance/quality option. But it doesn't enable half the time, and since it's SSAA it's usually too slow unless you play at 640x480.

Reply 36 of 36, by Joseph_Joestar

User metadata
Rank l33t

I just stumbled across this video of the Nvidia Small Pond Demo. I'm guessing NSR was used for the water effects? Back in the day, I had a GeForce2 MX400, and I certainly don't remember seeing anything like that in contemporary games.

The first time I encountered reflective water in an actual game was in early 2003 when I bought a used GeForce3 and played Morrowind on it. But looking at the pond demo, it appears that the GeForce2 may have been capable of that as well. It just seems that most game developers opted not to utilize that functionality.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi