VOGONS


Bought these (retro) hardware today


Reply 48520 of 52819, by Kahenraz

User metadata
Rank l33t
TheAbandonwareGuy wrote on 2023-03-15, 15:01:
TrashPanda wrote on 2023-03-15, 10:35:
Kahenraz wrote on 2023-03-15, 09:20:

I have a couple of GeForce 2 MXs. Despite them being the bottom of the barrel performance-wise for the generation, these are actually excellent cards. They have fantastic OpenGL and DirectX support, performance is greatly improved over the earlier TNT2 models, and they will generally "just work". These are definitely my favorite cards for test benches and a great choice for basic builds. The ATI Rage 128 Pro is also good, but the GF2 MX is just all-around better. It's also fairly cheap to find, and definitely not to be overlooked.

I have a soft spot for GF2 MX400 cards; the 64MB DDR models are crazily reliable and can take some serious abuse and keep on trucking. They are also perfect cards for Quake 1/2/3 or Unreal LAN boxes. Actually, they are pretty much perfect for period games that don't require pixel shaders.

So, fun fact: the GeForce 2 MX (along with all other GF2s) and the ATI Radeon both support pixel shaders 1.0a, but neither fully supports vertex shaders (the Radeon has some quasi-vertex capabilities via ATI-specific OpenGL extensions that almost nothing supports; the GeForce has zero vertex support). Thus both sit somewhere between being DirectX 7- and DirectX 8-class video cards.

I can confirm this for the GeForce 2! I remember NVIDIA advertising per-pixel lighting for this card on their website, but never mentioning pixel shaders, which were advertised heavily with the GeForce 3. I never understood this at the time, but always assumed that there was some kind of limited support in hardware, just not something that was ever used in any games.

It's still possible to find this information from their website on archive.org; I'm traveling at the moment and can't provide a link.

Does anyone know what would be required to tap into these features, and exactly what is supported? I would imagine that it is a kind of fixed-function pixel shader and not a programmable one, which was the big feature touted with DirectX 8. I would love to learn more. Is it something that is available through DirectX 7 or 7.1?

Reply 48521 of 52819, by smtkr

User metadata
Rank Member
BitWrangler wrote on 2023-03-15, 20:34:
Brawndo wrote on 2023-03-15, 19:32:

The inside is very dusty, and it looks like it spent some time ingesting some pinkish material; not sure what that's about.

I had one like that some years ago, got to figuring it was makeup, some lady foofing all over her face with a brush or puff on the daily, getting it airborne.

I'd guess some industrial use. I got an SL4KL last year and it was covered in yellow dust. I just left it on the PCB and chips and cleaned the heatsink.

Reply 48522 of 52819, by smtkr

User metadata
Rank Member
Kahenraz wrote on 2023-03-15, 23:23:
I can confirm this for the GeForce 2! I remember NVIDIA advertising per-pixel lighting for this card on their website, but never […]
TheAbandonwareGuy wrote on 2023-03-15, 15:01:
TrashPanda wrote on 2023-03-15, 10:35:

I have a soft spot for GF2 MX400 cards; the 64MB DDR models are crazily reliable and can take some serious abuse and keep on trucking. They are also perfect cards for Quake 1/2/3 or Unreal LAN boxes. Actually, they are pretty much perfect for period games that don't require pixel shaders.

So, fun fact: the GeForce 2 MX (along with all other GF2s) and the ATI Radeon both support pixel shaders 1.0a, but neither fully supports vertex shaders (the Radeon has some quasi-vertex capabilities via ATI-specific OpenGL extensions that almost nothing supports; the GeForce has zero vertex support). Thus both sit somewhere between being DirectX 7- and DirectX 8-class video cards.

I can confirm this for the GeForce 2! I remember NVIDIA advertising per-pixel lighting for this card on their website, but never mentioning pixel shaders, which were advertised heavily with the GeForce 3. I never understood this at the time, but always assumed that there was some kind of limited support in hardware, just not something that was ever used in any games.

It's still possible to find this information from their website on archive.org; I'm traveling at the moment and can't provide a link.

Does anyone know what would be required to tap into these features, and exactly what is supported? I would imagine that it is a kind of fixed-function pixel shader and not a programmable one, which was the big feature touted with DirectX 8. I would love to learn more. Is it something that is available through DirectX 7 or 7.1?

I think the word "shader" was in its infancy at the time and was not formally defined, so developers used it loosely. Standardizing it in DirectX is what made shaders rigidly defined.

Reply 48523 of 52819, by TheAbandonwareGuy

User metadata
Rank Oldbie
Kahenraz wrote on 2023-03-15, 23:23:
I can confirm this for the GeForce 2! I remember NVIDIA advertising per-pixel lighting for this card on their website, but never […]
TheAbandonwareGuy wrote on 2023-03-15, 15:01:
TrashPanda wrote on 2023-03-15, 10:35:

I have a soft spot for GF2 MX400 cards; the 64MB DDR models are crazily reliable and can take some serious abuse and keep on trucking. They are also perfect cards for Quake 1/2/3 or Unreal LAN boxes. Actually, they are pretty much perfect for period games that don't require pixel shaders.

So, fun fact: the GeForce 2 MX (along with all other GF2s) and the ATI Radeon both support pixel shaders 1.0a, but neither fully supports vertex shaders (the Radeon has some quasi-vertex capabilities via ATI-specific OpenGL extensions that almost nothing supports; the GeForce has zero vertex support). Thus both sit somewhere between being DirectX 7- and DirectX 8-class video cards.

I can confirm this for the GeForce 2! I remember NVIDIA advertising per-pixel lighting for this card on their website, but never mentioning pixel shaders, which were advertised heavily with the GeForce 3. I never understood this at the time, but always assumed that there was some kind of limited support in hardware, just not something that was ever used in any games.

It's still possible to find this information from their website on archive.org; I'm traveling at the moment and can't provide a link.

Does anyone know what would be required to tap into these features, and exactly what is supported? I would imagine that it is a kind of fixed-function pixel shader and not a programmable one, which was the big feature touted with DirectX 8. I would love to learn more. Is it something that is available through DirectX 7 or 7.1?

If you launch a Quake III-based game, there is a console command (which I forget the name of) that will enumerate the various hardware features present. On those ATI and NVIDIA cards you'll see a lot of weird extensions that start with "NV_" or "ATI_"; those are usually the proprietary functions each vendor supported on certain cards. The Quake III engine has support for them, but I don't think any games actually use them.
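
Outside the game, you can pull the same list straight from the driver. Here's a minimal untested sketch in C++ (assuming an OpenGL context has already been created by whatever windowing code you use); glGetString(GL_EXTENSIONS) is core OpenGL 1.x, and in that string the vendor extensions show up with a "GL_" prefix, e.g. GL_NV_register_combiners:

#include <GL/gl.h>
#include <cstdio>
#include <cstring>

// Print only the vendor-specific entries from the driver's
// space-separated extension string (the same data the Quake III
// console dump shows).
void listVendorExtensions() {
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!ext) return; // no current GL context

    static char buf[16384]; // local copy, since strtok modifies its input
    strncpy(buf, ext, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';
    for (char* tok = strtok(buf, " "); tok; tok = strtok(nullptr, " ")) {
        if (strncmp(tok, "GL_NV_", 6) == 0 || strncmp(tok, "GL_ATI_", 7) == 0)
            printf("vendor extension: %s\n", tok);
    }
}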

Cyb3rst0rms Retro Hardware Warzone: https://discord.gg/jK8uvR4c
I used to own over 160 graphics cards; I've since recovered from graphics card addiction

Reply 48524 of 52819, by TrashPanda

User metadata
Rank l33t
TheAbandonwareGuy wrote on 2023-03-16, 00:40:
Kahenraz wrote on 2023-03-15, 23:23:
I can confirm this for the GeForce 2! I remember NVIDIA advertising per-pixel lighting for this card on their website, but never […]
TheAbandonwareGuy wrote on 2023-03-15, 15:01:

So, fun fact: the GeForce 2 MX (along with all other GF2s) and the ATI Radeon both support pixel shaders 1.0a, but neither fully supports vertex shaders (the Radeon has some quasi-vertex capabilities via ATI-specific OpenGL extensions that almost nothing supports; the GeForce has zero vertex support). Thus both sit somewhere between being DirectX 7- and DirectX 8-class video cards.

I can confirm this for the GeForce 2! I remember NVIDIA advertising per-pixel lighting for this card on their website, but never mentioning pixel shaders, which were advertised heavily with the GeForce 3. I never understood this at the time, but always assumed that there was some kind of limited support in hardware, just not something that was ever used in any games.

It's still possible to find this information from their website on archive.org; I'm traveling at the moment and can't provide a link.

Does anyone know what would be required to tap into these features, and exactly what is supported? I would imagine that it is a kind of fixed-function pixel shader and not a programmable one, which was the big feature touted with DirectX 8. I would love to learn more. Is it something that is available through DirectX 7 or 7.1?

If you launch a Quake III-based game, there is a console command (which I forget the name of) that will enumerate the various hardware features present. On those ATI and NVIDIA cards you'll see a lot of weird extensions that start with "NV_" or "ATI_"; those are usually the proprietary functions each vendor supported on certain cards. The Quake III engine has support for them, but I don't think any games actually use them.

Pretty sure the pixel shader functions on non-DX8 cards were used mostly by tech demos and not much else. With no rigid standard before DX8 there was no common framework to use them, so IIRC they were only used infrequently. I'm not sure any games used only pixel shaders and not vertex shaders too before DX8; it would be interesting to find one and see if it runs on a GF2 MX card.

Reply 48525 of 52819, by BitWrangler

User metadata
Rank l33t++
smtkr wrote on 2023-03-15, 23:31:
BitWrangler wrote on 2023-03-15, 20:34:
Brawndo wrote on 2023-03-15, 19:32:

The inside is very dusty, and it looks like it spent some time ingesting some pinkish material; not sure what that's about.

I had one like that some years ago, got to figuring it was makeup, some lady foofing all over her face with a brush or puff on the daily, getting it airborne.

I'd guess some industrial use. I got an SL4KL last year and it was covered in yellow dust. I just left it on the PCB and chips and cleaned the heatsink.

Yellow? Did you test it with a Geiger counter? 🤣

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 48526 of 52819, by Kahenraz

User metadata
Rank l33t
TrashPanda wrote on 2023-03-16, 03:18:

Pretty sure the pixel shader functions on non-DX8 cards were used mostly by tech demos and not much else. With no rigid standard before DX8 there was no common framework to use them, so IIRC they were only used infrequently. I'm not sure any games used only pixel shaders and not vertex shaders too before DX8; it would be interesting to find one and see if it runs on a GF2 MX card.

NVIDIA provided a tiny little picture in their marketing, touting the "per-pixel" lighting feature of the GeForce 2. I never found a tech demo that actually demonstrated the effect live. This confused me terribly, as all of the new pixel shader press kept talking about per-pixel lighting effects, and I couldn't understand why my GeForce 2 couldn't do it if NVIDIA's marketing already said it could. The missing keyword was most likely "programmable" pixel shaders.

Reply 48527 of 52819, by TheAbandonwareGuy

User metadata
Rank Oldbie
Kahenraz wrote on 2023-03-16, 04:30:
TrashPanda wrote on 2023-03-16, 03:18:

Pretty sure the pixel shader functions on non-DX8 cards were used mostly by tech demos and not much else. With no rigid standard before DX8 there was no common framework to use them, so IIRC they were only used infrequently. I'm not sure any games used only pixel shaders and not vertex shaders too before DX8; it would be interesting to find one and see if it runs on a GF2 MX card.

NVIDIA provided a tiny little picture in their marketing, touting the "per-pixel" lighting feature of the GeForce 2. I never found a tech demo that actually demonstrated the effect live. This confused me terribly, as all of the new pixel shader press kept talking about per-pixel lighting effects, and I couldn't understand why my GeForce 2 couldn't do it if NVIDIA's marketing already said it could. The missing keyword was most likely "programmable" pixel shaders.

I think the Creature demo does? https://international.download.nvidia.com/dow … os/Creature.exe

"The Creature demo showcased lighting and per-pixel shading like never before in a beautifully-lit underwater scene."

Cyb3rst0rms Retro Hardware Warzone: https://discord.gg/jK8uvR4c
I used to own over 160 graphics cards; I've since recovered from graphics card addiction

Reply 48528 of 52819, by Kahenraz

User metadata
Rank l33t

You're right. I can confirm this quote. I didn't recall that the Creature demo used the per-pixel lighting feature. It's great that we do have at least one demo.

Here is a link to the article I was thinking of previously. Does anyone know if the example image is from a demo or a game?

https://web.archive.org/web/20011022022924/ht … ?IO=feature_nsr

Attachment: image.gif (25.15 KiB, public domain)

The technology is specifically called the "NVIDIA Shading Rasterizer" (NSR). It was introduced with the GeForce 2 GTS, and I assume it is present in all of the GeForce 2 silicon. It seems to have been available as an API feature as of DirectX 6, and as an OpenGL extension:

The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot p […]

The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot product that makes per-pixel calculations possible. Available to developers in the Microsoft® DirectX® 6 API and above (D3DTOP_DOTPRODUCT3) as well as OpenGL® (via the NV_register_combiners extension), dot-product texture-blending operations allow diffuse, specular, spot, and point light effects to be calculated dynamically on a per-pixel basis.
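
For anyone who wants to poke at it, here is a rough untested sketch of what that D3DTOP_DOTPRODUCT3 path looks like against the DirectX 7 interfaces. The "device" pointer is a hypothetical, already-created IDirect3DDevice7*; stage 0 is assumed to hold a normal map, with the light vector range-compressed into the diffuse vertex color:

#include <d3d.h> // DirectX 7 immediate-mode headers

// Compute N.L per pixel in the fixed-function texture stage,
// replicating the result to RGB. No programmable shader involved.
HRESULT enableDot3Lighting(IDirect3DDevice7* device) {
    HRESULT hr = device->SetTextureStageState(0, D3DTSS_COLOROP, D3DTOP_DOTPRODUCT3);
    if (FAILED(hr)) return hr; // older parts (e.g. TNT2) reject this op
    device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE); // normal map
    device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE); // light vector
    return D3D_OK;
}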

The "Lightning" demo also showcases the per-pixel lighting feature as well. I guess it was used more often than I thought.

NVIDIA originally created this technical demo to show off the benefits of per-pixel lighting.

This also finally explains for me why the little water demo with the lamp post doesn't render properly on TNT cards. It's definitely the per-pixel lighting feature.

There is definitely lots of mention of per-pixel lighting, but nothing about programmable shaders.
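
On the OpenGL side, NV_register_combiners is the vendor-specific route the NSR text mentions, but the same N.L dot product can also be expressed through the portable env-combine path, assuming the driver exposes GL_ARB_texture_env_dot3 (later GeForce 2 drivers do report it). An untested sketch, with the extension tokens defined manually in case the local glext.h lacks them:

#include <GL/gl.h>

// Tokens from the ARB_texture_env_combine/dot3 specs:
#ifndef GL_COMBINE_ARB
#define GL_COMBINE_ARB       0x8570
#define GL_COMBINE_RGB_ARB   0x8571
#define GL_PRIMARY_COLOR_ARB 0x8577
#define GL_SOURCE0_RGB_ARB   0x8580
#define GL_SOURCE1_RGB_ARB   0x8581
#define GL_DOT3_RGB_ARB      0x86AE
#endif

// The bound texture holds the normal map; the primary (vertex) color
// carries the light vector, both range-compressed to [0,1].
void enableDot3TexEnv() {
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_DOT3_RGB_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_TEXTURE);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_PRIMARY_COLOR_ARB);
}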

TheAbandonwareGuy wrote on 2023-03-15, 15:01:

So, fun fact: the GeForce 2 MX (along with all other GF2s) and the ATI Radeon both support pixel shaders 1.0a, but neither fully supports vertex shaders (the Radeon has some quasi-vertex capabilities via ATI-specific OpenGL extensions that almost nothing supports; the GeForce has zero vertex support). Thus both sit somewhere between being DirectX 7- and DirectX 8-class video cards.

I believe that this premise is false. Per-pixel lighting and other effects were available as established API extensions as early as DirectX 6, but (programmable) shaders were not available until DirectX 8. Even though the DirectX 6 API supported it, NVIDIA did not provide hardware support until their DirectX 7 product, the GeForce 2 GTS; although I wonder whether the feature also exists in the earlier "GeForce" cards but was simply not advertised. I can only confirm that it's not there on the TNT2.

I also wonder if we can find any DirectX 6 cards from other manufacturers with per-pixel lighting support.
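
A caps check would be one way to survey them: in the DirectX 7 device description, D3DTEXOPCAPS_DOTPRODUCT3 in dwTextureOpCaps is exactly this per-pixel dot-product support. An untested sketch, again with a hypothetical already-created IDirect3DDevice7*:

#include <d3d.h>
#include <cstdio>

// Report whether the fixed-function pipeline can do the DOT3 blend op.
bool hasPerPixelDot3(IDirect3DDevice7* device) {
    D3DDEVICEDESC7 caps;
    if (FAILED(device->GetCaps(&caps))) return false;
    bool dot3 = (caps.dwTextureOpCaps & D3DTEXOPCAPS_DOTPRODUCT3) != 0;
    printf("DOTPRODUCT3 supported: %s\n", dot3 ? "yes" : "no");
    return dot3;
}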

Last edited by Kahenraz on 2023-03-16, 07:52. Edited 1 time in total.

Reply 48529 of 52819, by Ozzuneoj

User metadata
Rank l33t
Kahenraz wrote on 2023-03-16, 07:42:
You're right. I can confirm this quote. I didn't recall that the Creature demo used the per-pixel lighting feature. It's great that […]

You're right. I can confirm this quote. I didn't recall that the Creature demo used the per-pixel lighting feature. It's great that we do have at least one demo.

Here is a link to the article I was thinking of previously. Does anyone know if the example image is from a demo or a game?

https://web.archive.org/web/20011022022924/ht … ?IO=feature_nsr

image.gif

The technology is specifically called the "NVIDIA Shading Rasterizer" (NSR). It was introduced with the GeForce 2 GTS, and I assume it is present in all of the GeForce 2 silicon. It seems to have been available as an API feature as of DirectX 6, and as an OpenGL extension:

The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot p […]

The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot product that makes per-pixel calculations possible. Available to developers in the Microsoft® DirectX® 6 API and above (D3DTOP_DOTPRODUCT3) as well as OpenGL® (via the NV_register_combiners extension), dot-product texture-blending operations allow diffuse, specular, spot, and point light effects to be calculated dynamically on a per-pixel basis.

The "Lightning" demo also showcases the per-pixel lighting feature as well. I guess it was used more often than I thought.

NVIDIA originally created this technical demo to show off the benefits of per-pixel lighting.

This also finally explains for me why the little water demo with the lamp post doesn't render properly on TNT cards. It's definitely the per-pixel lighting feature.

There is definitely lots of mention of per-pixel lighting, but nothing about programmable shaders.

Oh, that's Giants: Citizen Kabuto.

That game was heavily advertised at the time for utilizing all the latest graphical features of NVIDIA's GeForce cards. I think they actually recommended a GeForce, which was a tall order back in December of 2000.

The game used some pretty advanced bump mapping and had some interesting effects, but I think overall the game didn't really look that great because it had been in development so long. The ground textures and pointy geometry just looked terrible combined with the much more realistic character textures. I'm sure others think it looks great... this is just my impression of it. Outcast made use of a lot of similar effects purely with software rendering the year prior, and while the low-res graphics and odd voxel-based scenery had their quirks, it looked like it all went together. Giants looked like someone knew how to make high-quality models and textures with the latest GPU tech available, but then plopped them into the scenery of a game from 1996.

Last edited by Ozzuneoj on 2023-03-16, 07:59. Edited 2 times in total.

Now for some blitting from the back buffer.

Reply 48530 of 52819, by TrashPanda

User metadata
Rank l33t
Kahenraz wrote on 2023-03-16, 07:42:
You're right. I can confirm this quote. I didn't recall that the Creature demo used the per-pixel lighting feature. It's great that […]

You're right. I can confirm this quote. I didn't recall that the Creature demo used the per-pixel lighting feature. It's great that we do have at least one demo.

Here is a link to the article I was thinking of previously. Does anyone know if the example image is from a demo or a game?

https://web.archive.org/web/20011022022924/ht … ?IO=feature_nsr

image.gif

The technology is specifically called the "NVIDIA Shading Rasterizer" (NSR). It was introduced with the GeForce 2 GTS, and I assume it is present in all of the GeForce 2 silicon. It seems to have been available as an API feature as of DirectX 6, and as an OpenGL extension:

The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot p […]

The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot product that makes per-pixel calculations possible. Available to developers in the Microsoft® DirectX® 6 API and above (D3DTOP_DOTPRODUCT3) as well as OpenGL® (via the NV_register_combiners extension), dot-product texture-blending operations allow diffuse, specular, spot, and point light effects to be calculated dynamically on a per-pixel basis.

The "Lightning" demo also showcases the per-pixel lighting feature as well. I guess it was used more often than I thought.

NVIDIA originally created this technical demo to show off the benefits of per-pixel lighting.

This also finally explains for me why the little water demo with the lamp post doesn't render properly on TNT cards. It's definitely the per-pixel lighting feature.

There is definitely lots of mention of per-pixel lighting, but nothing about programmable shaders.

Programmable shaders didn't happen until DX8.0 (PS1.1) and the GeForce 3; before that they were fixed function, so you won't hear the term used before its introduction. ATI got programmable shaders around the Radeon 8500/9100 (R200) with the introduction of their SmartShader tech, which was more advanced than the shader tech on the GeForce 3, as ATI fully supported DX8.1 PS1.4, which NVIDIA didn't until the GF4, IIRC. (The GeForce 4 only supports PS1.3, not 1.4, so it isn't actually DX8.1 compliant, but it is supported.)

You could argue that they did exist before DX8.0, but they were not standardized, had no API to support them in games, and many were still fixed function.
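
For reference, that PS1.1/PS1.3/PS1.4 split is directly queryable in code: the DirectX 8 caps report a pixel shader version, so a GeForce 3 reports 1.1, a GeForce 4 Ti 1.3, and an R200 1.4. An untested sketch, assuming an already-created IDirect3D8* named d3d:

#include <d3d8.h>
#include <cstdio>

// Print the pixel shader version of the default adapter and flag
// full DX8.1 (PS1.4) class hardware.
void reportPixelShaderVersion(IDirect3D8* d3d) {
    D3DCAPS8 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return;
    printf("Pixel shader version: %lu.%lu\n",
           D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
           D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        printf("Full DX8.1 (PS1.4) class hardware\n");
}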

Reply 48531 of 52819, by dionb

User metadata
Rank l33t++

Woohoo!

Just picked up an old (and heavy...) KVM switch with serial ports. The seller then said "I also have this bag with random old cards. Want it?"

Hell yes. Turns out it's a very big, heavy bag. Can't unpack it yet, but the second card (after an 8-bit ISA printer card) was a PAS16. Looking forward to sifting through the rest 😀

Reply 48532 of 52819, by ediflorianUS

User metadata
Rank Member
Turbo -> wrote on 2023-03-15, 16:55:

My colleague was cleaning his storage and gave me some PC cards. This is what I got.

Nice cards, I love the Wildcats. I wonder if I can run DOSBox on my Spring Macintosh... hmmm...

My 80486-S i66 Project

Reply 48533 of 52819, by OMORES

User metadata
Rank Member

This Soyo SY-025P2 Socket 3 motherboard looks brand new. I replaced the leaky NiCd battery with a CR2032 socket.

The socket has no lever... but comes with metallic holes...

Didn't test it yet.

Attachments

My best video so far.

Reply 48535 of 52819, by original_meusli

User metadata
Rank Newbie
OMORES wrote on 2023-03-16, 14:19:

This Soyo SY-025P2 Socket 3 motherboard looks brand new. I replaced the leaky NiCd battery with a CR2032 socket.

The socket has no lever... but comes with metallic holes...

Didn't test it yet.

Sorry if you already know this, but I nearly messed this up myself until I did some last-minute research. You need to make sure that you change the recharge-battery loop on the motherboard; otherwise it won't keep the battery charged.

https://www.youtube.com/watch?v=1mEk9Q1QtcU

Reply 48536 of 52819, by dionb

User metadata
Rank l33t++
dionb wrote on 2023-03-16, 08:46:

Woohoo!

Just picked up an old (and heavy...) KVM switch with serial ports. The seller then said "I also have this bag with random old cards. Want it?"

Hell yes. Turns out it's a very big, heavy bag. Can't unpack it yet, but the second card (after an 8-bit ISA printer card) was a PAS16. Looking forward to sifting through the rest 😀

So, the haul:

A pile of PCI stuff:

Attachment: 16789818119390.jpg (156.64 KiB, CC-BY-4.0)

Nice SCSI cards

A pile of ISA/VLB stuff:

Attachment: 16789818527711.jpg (131.87 KiB, CC-BY-4.0)

The PAS16 and Biostar motherboards are nice highlights.

And these two:

Attachment: 16789816080870.jpg (215.64 KiB, CC-BY-4.0)

At last, an ESDI controller - and as good as new, with the manual.

Attachment: 16789816355891.jpg (239.64 KiB, CC-BY-4.0)

Weird little Vendex I/O controller with MFM or RLL HDD, game port, and proprietary bus mouse. Note the last line on the WD42C22A HDD controller - apparently they used a prototype 😜

Reply 48537 of 52819, by TheAbandonwareGuy

User metadata
Rank Oldbie
Kahenraz wrote on 2023-03-16, 07:42:
You're right. I can confirm this quote. I didn't recall that the Creature demo used the per-pixel lighting feature. It's great that […]

You're right. I can confirm this quote. I didn't recall that the Creature demo used the per-pixel lighting feature. It's great that we do have at least one demo.

Here is a link to the article I was thinking of previously. Does anyone know if the example image is from a demo or a game?

https://web.archive.org/web/20011022022924/ht … ?IO=feature_nsr

image.gif

The technology is specifically called the "NVIDIA Shading Rasterizer" (NSR). It was introduced with the GeForce 2 GTS, and I assume it is present in all of the GeForce 2 silicon. It seems to have been available as an API feature as of DirectX 6, and as an OpenGL extension:

The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot p […]

The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot product that makes per-pixel calculations possible. Available to developers in the Microsoft® DirectX® 6 API and above (D3DTOP_DOTPRODUCT3) as well as OpenGL® (via the NV_register_combiners extension), dot-product texture-blending operations allow diffuse, specular, spot, and point light effects to be calculated dynamically on a per-pixel basis.

The "Lightning" demo also showcases the per-pixel lighting feature as well. I guess it was used more often than I thought.

NVIDIA originally created this technical demo to show off the benefits of per-pixel lighting.

This also finally explains for me why the little water demo with the lamp post doesn't render properly on TNT cards. It's definitely the per-pixel lighting feature.

There is definitely lots of mention of per-pixel lighting, but nothing about programmable shaders.

TheAbandonwareGuy wrote on 2023-03-15, 15:01:

So, fun fact: the GeForce 2 MX (along with all other GF2s) and the ATI Radeon both support pixel shaders 1.0a, but neither fully supports vertex shaders (the Radeon has some quasi-vertex capabilities via ATI-specific OpenGL extensions that almost nothing supports; the GeForce has zero vertex support). Thus both sit somewhere between being DirectX 7- and DirectX 8-class video cards.

I believe that this premise is false. Per-pixel lighting and other effects were available as established API extensions as early as DirectX 6, but (programmable) shaders were not available until DirectX 8. Even though the DirectX 6 API supported it, NVIDIA did not provide hardware support until their DirectX 7 product, the GeForce 2 GTS; although I wonder whether the feature also exists in the earlier "GeForce" cards but was simply not advertised. I can only confirm that it's not there on the TNT2.

I also wonder if we can find any DirectX 6 cards from other manufacturers with per-pixel lighting support.

The GeForce 256 was supposed to have a number of GF2 features, but they are broken in silicon and disabled in the vBIOS. Most likely NVIDIA ran out of time to do revisions and fix bugs before they had to start producing shipping silicon.

Cyb3rst0rms Retro Hardware Warzone: https://discord.gg/jK8uvR4c
I used to own over 160 graphics cards; I've since recovered from graphics card addiction

Reply 48538 of 52819, by TheAbandonwareGuy

User metadata
Rank Oldbie
Ozzuneoj wrote on 2023-03-16, 07:49:
Oh, that's Giants: Citizen Kabuto. […]
Kahenraz wrote on 2023-03-16, 07:42:
You're right. I can confirm this quote. I didn't recall that the Creature demo used the per-pixel lighting feature. It's great that […]

You're right. I can confirm this quote. I didn't recall that the Creature demo used the per-pixel lighting feature. It's great that we do have at least one demo.

Here is a link to the article I was thinking of previously. Does anyone know if the example image is from a demo or a game?

https://web.archive.org/web/20011022022924/ht … ?IO=feature_nsr

image.gif

The technology is specifically called the "NVIDIA Shading Rasterizer" (NSR). It was introduced with the GeForce 2 GTS, and I assume it is present in all of the GeForce 2 silicon. It seems to have been available as an API feature as of DirectX 6, and as an OpenGL extension:

The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot p […]

The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot product that makes per-pixel calculations possible. Available to developers in the Microsoft® DirectX® 6 API and above (D3DTOP_DOTPRODUCT3) as well as OpenGL® (via the NV_register_combiners extension), dot-product texture-blending operations allow diffuse, specular, spot, and point light effects to be calculated dynamically on a per-pixel basis.

The "Lightning" demo also showcases the per-pixel lighting feature as well. I guess it was used more often than I thought.

NVIDIA originally created this technical demo to show off the benefits of per-pixel lighting.

This also finally explains for me why the little water demo with the lamp post doesn't render properly on TNT cards. It's definitely the per-pixel lighting feature.

There is definitely lots of mention of per-pixel lighting, but nothing about programmable shaders.

Oh, that's Giants: Citizen Kabuto.

That game was heavily advertised at the time for utilizing all the latest graphical features of NVIDIA's GeForce cards. I think they actually recommended a GeForce, which was a tall order back in December of 2000.

The game used some pretty advanced bump mapping and had some interesting effects, but I think overall the game didn't really look that great because it had been in development so long. The ground textures and pointy geometry just looked terrible combined with the much more realistic character textures. I'm sure others think it looks great... this is just my impression of it. Outcast made use of a lot of similar effects purely with software rendering the year prior, and while the low-res graphics and odd voxel-based scenery had their quirks, it looked like it all went together. Giants looked like someone knew how to make high-quality models and textures with the latest GPU tech available, but then plopped them into the scenery of a game from 1996.

Outcast also required what was considered a monstrous PC to run at a reasonable frame rate. IIRC a 1 GHz Pentium (then the fastest on the market) barely produced smooth framerates.

Cyb3rst0rms Retro Hardware Warzone: https://discord.gg/jK8uvR4c
I used to own over 160 graphics cards; I've since recovered from graphics card addiction

Reply 48539 of 52819, by Kahenraz

User metadata
Rank l33t
OMORES wrote on 2023-03-16, 14:19:

This Soyo SY-025P2 Socket 3 motherboard looks brand new. I replaced the leaky NiCd battery with a CR2032 socket.

The socket has no lever... but comes with metallic holes...

Didn't test it yet.

I didn't think it was safe to replace a NiCd battery with a lithium one. I've heard that a NiMH battery can sometimes be used, but the charge voltage is not the same even for that chemistry.