The inside is very dusty and it looks like it spent some time ingesting some pinkish material, not sure what that's about.
I had one like that some years ago, got to figuring it was makeup, some lady foofing all over her face with a brush or puff on the daily, getting it airborne.
I'd guess some industrial use. I got an SL4KL last year and it was covered in yellow dust. I just left it on the PCB and chips and cleaned the heatsink.
Kahenraz wrote on 2023-03-15, 23:23:
I have a soft spot for GF2 MX400 cards; the 64MB DDR models are crazily reliable and can take some serious abuse and keep on trucking. They are also perfect cards for Quake 1/2/3 or Unreal LAN boxes. Actually, they are pretty much perfect for period games that don't require pixel shaders.
So, fun fact: the GeForce 2 MX (along with all other GF2s) and the ATI Radeon both support Pixel Shader 1.0a, but neither fully supports vertex shaders (the Radeon has some quasi-vertex capabilities via ATI-specific OpenGL extensions that almost nothing supports; the GeForce has zero vertex support). Thus both sit somewhere in the middle between being DirectX 7 and DirectX 8 class video cards.
I can confirm this for the GeForce 2! I remember NVIDIA advertising per-pixel lighting for this card on their website, but never mentioning pixel shaders, which were advertised heavily with the GeForce 3. I never understood this at the time, but always assumed that there was some kind of limited support in hardware, but not something that was ever used in any games.
It's possible to still find this information from their website on archive.org; I'm traveling at the moment and can't provide a link.
Does anyone know what would be required to tap into these features, and exactly what is supported? I would imagine that it is a kind of fixed-function pixel shader and not a programmable one, which was the big feature touted with DirectX 8. I would love to learn more. Is it something that is available through DirectX 7 or 7.1?
I think the word "shader" at the time was in its infancy and was not formally defined, so developers used it loosely. Creating the standard in DirectX is what made shaders rigidly defined.
Kahenraz wrote on 2023-03-15, 23:23: […] Does anyone know what would be required to tap into these features, and exactly what is supported? […]
If you launch a Quake III based game, there is a console command (which I forget the name of) that will enumerate various hardware features present. On those ATI and NVIDIA cards you'll see a lot of weird extensions that start with "NV_" or "ATI_"; those are usually the proprietary functions each vendor supported on certain cards. The Quake III engine has support for them, but I don't think any games actually use them.
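For reference, the enumeration itself is just a walk over the GL extension string; here is a minimal C++ sketch (the function name is mine, and it assumes a GL context has already been created; on Windows, include <windows.h> before the GL header):

    #include <GL/gl.h>
    #include <cstdio>
    #include <sstream>
    #include <string>

    // Walk the space-separated GL_EXTENSIONS string (GL 1.x style) and print
    // the vendor-specific entries, much like the Quake III console dump does.
    void listVendorExtensions() {
        const char* raw = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        if (raw == nullptr)
            return;                               // no current GL context
        std::istringstream tokens(raw);
        std::string name;
        while (tokens >> name) {
            if (name.rfind("GL_NV_", 0) == 0 || name.rfind("GL_ATI_", 0) == 0)
                std::printf("%s\n", name.c_str());
            if (name == "GL_NV_register_combiners")
                std::printf("  -> GeForce per-pixel (register combiner) path present\n");
        }
    }

GL_NV_register_combiners is the extension that exposes the GeForce's per-pixel path, which comes up again below.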
Kahenraz wrote on 2023-03-15, 23:23: […]
Pretty sure the pixel shader functions on non-DX8 cards were used mostly by tech demos and not much else; with no rigid standard before DX8 there was no common framework to use them, so IIRC they were only used infrequently. I'm not sure any games used only pixel shaders (and not vertex shaders too) before DX8; it would be interesting to find one and see if it runs on a GF2 MX card.
I'd guess some industrial use. I got an SL4KL last year and it was covered in yellow dust. […]
Yellow? Did you test it with a Geiger counter? 🤣
Pretty sure the pixel shader functions on non-DX8 cards were used mostly by tech demos and not much else […] it would be interesting to find one and see if it runs on a GF2 MX card.
NVIDIA provided a tiny little picture in their marketing, touting the "per-pixel" lighting feature of the GeForce 2. I never found a tech demo that actually demonstrated the effect live. This confused me terribly, as all of the new pixel shader press kept talking about per-pixel lighting effects, and I couldn't understand why my GeForce 2 couldn't do it if NVIDIA's marketing already said it could. The missing keyword was most likely "programmable" pixel shaders.
NVIDIA provided a tiny little picture in their marketing, touting the "per-pixel" lighting feature of the GeForce 2. […] The missing keyword was most likely "programmable" pixel shaders.
You're right. I can confirm this quote. I didn't recall that the Creature demo used the per-pixel lighting feature. It's great that we do have at least one demo.
Here is a link to the article I was thinking of previously. Does anyone know if the example image is from a demo or a game?
The technology is specifically called the "NVIDIA Shading Rasterizer" (NSR). It was introduced with the GeForce 2 GTS, and I assume it is present in all of the GeForce 2 silicon. It seems to have been available as an API feature as of DirectX 6, and as an OpenGL extension:
The NSR provides hardware support through its register combiner functionality for the texture blending operation called a dot product that makes per-pixel calculations possible. Available to developers in the Microsoft® DirectX® 6 API and above (D3DTOP_DOTPRODUCT3) as well as OpenGL® (via the NV_register_combiners extension), dot-product texture-blending operations allow diffuse, specular, spot, and point light effects to be calculated dynamically on a per-pixel basis.
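To make the quoted mechanism concrete, here is a minimal C++ sketch of the DirectX 7 side (the function and variable names are mine; it assumes an already-created device and a normal-map texture, with the light vector range-compressed into the vertex diffuse color):

    #include <d3d.h>   // DirectX 7 immediate-mode interfaces

    // Configure texture stage 0 to compute a per-pixel dot product between
    // the normal-map texel and the light vector stored in the diffuse color.
    // The light vector must be range-compressed to 0..255 (n * 0.5 + 0.5).
    void enableDot3Lighting(IDirect3DDevice7* device, IDirectDrawSurface7* normalMap)
    {
        device->SetTexture(0, normalMap);
        device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);  // per-pixel normal
        device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);  // light direction
        device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
    }

Everything still goes through the fixed-function texture stages; there is no program being uploaded, which is exactly the distinction from DX8-era shaders.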
The "Lightning" demo also showcases the per-pixel lighting feature as well. I guess it was used more often than I thought.
NVIDIA originally created this technical demo to show off the benefits of per-pixel lighting.
This also finally explains for me why the little water demo with the lamp post doesn't render properly on TNT cards. It's definitely the per-pixel lighting feature.
There is definitely lots of mention of per-pixel lighting, but nothing about programmable shaders.
So, fun fact: the GeForce 2 MX (along with all other GF2s) and the ATI Radeon both support Pixel Shader 1.0a […] Thus both sit somewhere in the middle between being DirectX 7 and DirectX 8 class video cards.
I believe that this premise is false. Per-pixel lighting and other effects were available as established API features as early as DirectX 6, but (programmable) shaders were not available until DirectX 8. Even though the DirectX 6 API supported it, NVIDIA did not provide hardware support until their DirectX 7 product, the GeForce 2 GTS. Although, I wonder if the feature also exists in the earlier "GeForce" cards but was not advertised. I can only confirm that it's not there on the TNT2.
I also wonder if we can find any DirectX 6 cards from other manufacturers with per-pixel lighting support.
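The DOT3 capability is at least queryable on any DirectX 7 device, so surveying other cards should be straightforward; a minimal sketch (the helper name is mine), assuming an already-created device:

    #include <d3d.h>

    // Returns true if the device reports hardware DOT3 texture blending,
    // i.e. the per-pixel lighting path discussed above.
    bool supportsDot3(IDirect3DDevice7* device)
    {
        D3DDEVICEDESC7 caps = {};
        if (FAILED(device->GetCaps(&caps)))
            return false;
        return (caps.dwTextureOpCaps & D3DTEXOPCAPS_DOTPRODUCT3) != 0;
    }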
Kahenraz wrote on 2023-03-16, 07:42: […] Here is a link to the article I was thinking of previously. Does anyone know if the example image is from a demo or a game? […]
Oh, that's Giants: Citizen Kabuto.
That game was heavily advertised at the time for utilizing all the latest graphical features of NVIDIA's GeForce cards. I think they actually recommended a GeForce, which was a tall order back in December of 2000.
The game used some pretty advanced bump mapping and had some interesting effects, but I think overall the game didn't really look that great because it had been in development for so long. The ground textures and pointy geometry just looked terrible combined with the much more realistic character textures. I'm sure others think it looks great... this is just my impression of it. Outcast made use of a lot of similar effects purely with software rendering the year prior, and while the low-res graphics and odd voxel-based scenery had their quirks, it looked like it all went together. Giants looked like someone knew how to make high-quality models and textures with the latest GPU tech available, but then plopped them into the scenery of a game from 1996.
Kahenraz wrote on 2023-03-16, 07:42: […] There is definitely lots of mention of per-pixel lighting, but nothing about programmable shaders.
Programmable shaders didn't happen until DX8.0 (PS 1.1) and the GeForce 3; before that they were fixed function, so you won't hear the term used before its introduction. ATI had programmable shaders around the Radeon 8500/9100 (R200) with the introduction of their SmartShader tech, which was more advanced than the shader tech on the GeForce 3, as ATI fully supported DX8.1's PS 1.4, which NVIDIA didn't until the GF4, IIRC. (The GeForce 4 only supports PS 1.3, not 1.4, so it isn't actually DX8.1 compliant, though it is still supported.)
You could argue that they did exist before DX8.0, but they were not standardized, had no API to support them in games, and many were still fixed function.
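Once DX8 standardized the versions, the supported shader model became a simple caps query; a minimal C++ sketch (the function name is mine), assuming an already-created DirectX 8 device:

    #include <d3d8.h>
    #include <cstdio>

    // Report the pixel shader model a DX8 device exposes: a GeForce 3
    // reports 1.1, a GeForce 4 Ti 1.3, and a Radeon 8500 (R200) 1.4.
    void reportPixelShaderVersion(IDirect3DDevice8* device)
    {
        D3DCAPS8 caps = {};
        if (FAILED(device->GetDeviceCaps(&caps)))
            return;
        std::printf("Pixel shader version: %lu.%lu\n",
                    D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                    D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
        if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
            std::printf("DX8.1-class (PS 1.4) shading available\n");
    }

Pre-DX8 cards simply report 0.0 here, which is why their fixed-function "shading" never shows up in a shader-model check.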
Just picked up an old (and heavy...) KVM switch with serial ports. The seller then said "I also have this bag with random old cards. Want it?"
Hell yes. Turns out it's a very big, heavy bag. Can't unpack it yet, but second card (after an 8b ISA printer card) was a PAS16. Looking forward to sifting through the rest 😀
This Soyo SY-025P2 Socket 3 motherboard - looks brand new. I replaced the leaky Ni-Cd battery with a 2032 socket.
The socket has no lever... but comes with metallic holes...
Didn't test it yet.
Sorry if you already know this, but I nearly messed this up myself until I did some last-minute research. You need to make sure that you change the battery recharge loop on the motherboard; otherwise it will still try to charge the (non-rechargeable) battery.
Just picked up an old (and heavy...) KVM switch with serial ports. […] Looking forward to sifting through the rest 😀
So, the haul:
A pile of PCI stuff:
Nice SCSI cards
A pile of ISA/VLB stuff:
The PAS16 and the Biostar motherboards are nice highlights.
And these two:
At last an ESDI controller - and as good as new, with manual.
Weird little Vendex I/O controller with an MFM/RLL HDD interface, game port and proprietary bus mouse port. Note the last line on the WD42C22A HDD controller - apparently they used a prototype 😜
Kahenraz wrote on 2023-03-16, 07:42: […] I wonder if the feature also exists in the earlier "GeForce" cards but was not advertised. […]
The GeForce 256 was supposed to have a number of GF2 features, but they are broken in silicon and disabled in the vBIOS. Most likely NVIDIA ran out of time to do revisions and fix bugs before they had to start producing shipping silicon.
Ozzuneoj wrote: […] Outcast made use of a lot of similar effects purely with software rendering the year prior […]
Outcast also required what was considered a monstrous PC at the time; IIRC a 1 GHz Pentium (then the fastest on the market) barely produced smooth frame rates.
This Soyo SY-025P2 Socket 3 motherboard - looks brand new. I replaced the leaky Ni-Cd battery with a 2032 socket. […]
I didn't think it was safe to replace a Ni-Cd battery with a lithium one. I've heard that a NiMH battery can sometimes be used, but the charge voltage is not the same even for that chemistry.
I didn't think it was safe to replace a Ni-Cd battery with a lithium one. […]
Replacing a rechargeable Ni-Cd with a non-rechargeable lithium cell is bad news unless the circuit is also changed to prevent recharging.
I have been using a 3.6V rechargeable Li-ion battery installed in a 486 system, and it seems to be working fine and holding a charge properly. Though I will admit I'm not completely sure whether this is a good idea or not.