VOGONS


First post, by 386SX


Hello,

I was trying my usual 430VX board with its P-MMX 233, a ViRGE DX 4MB and a Diamond Voodoo II 8MB that I'm testing; I already tested the card in the past and it was perfectly functional. But is it normal that, even with Win98 patched to the latest updates and DirectX, the latest reference DX7 drivers seem to install but the control panel can't read the card's info (memory size etc.), while if I install the previous DX6 version from 1999 it reads it correctly? Were the latest drivers reserved for the original 3dfx/STB-built PCBs?
I don't think it's the first time I've seen this problem in older tests, and I was wondering what the issue is between the latest drivers and these cards (I only have Diamond boards and a Creative one).
I'm reinstalling the OS because I noticed that the ViRGE, once its driver was installed, returned to the desktop still at 16 colors and 640x480, even though the card is certainly functional. I suppose something went wrong with that installation, but the problem above is definitely not the first time I've seen it. I can't say yet whether this driver works in games, beyond the control panel info issue, but I'll update as soon as the installation finishes.

EDIT: with the DX6 reference drivers the card seems to work OK; I got a 2.35 score in Final Reality and 2.75 in the 3D tests with the system above. I'll try updating to the latest DX7 drivers to see if they at least work, control panel info aside.
Thanks

Reply 1 of 15, by TrashPanda


It's a Diamond Voodoo 2: always install the official Diamond Monster Voodoo2 drivers first, then install the 3dfx reference ones over the top. I own a pair of 12MB Monster IIs and have always had issues if I don't install the official Diamond drivers first. Diamond did some stupid shit with their cards which the reference drivers have issues with; to this day, even with all my testing, I have yet to figure out what exactly they did, but I'm guessing it's a registry entry or setting. It's dang annoying too, as you won't notice issues till you try to use the Voodoo control panel or Windows flakes out and you get odd compatibility issues.

Not even the unofficial community drivers can get around Diamond Voodoo 2 weirdness.

Also, you will find the final 3dfx Voodoo 2 drivers were buggy as shit; I don't think they were even finished before 3dfx shut down, but hey, that's par for the course with 3dfx. They left the scene with both the Voodoo 5 5500 and the V5 6000 in an incredibly buggy, unfinished state, and they never did fix the SLI issues.

But that's all part of the charm really; I still love 3dfx and Glide.

Reply 2 of 15, by 386SX

TrashPanda wrote on 2022-02-09, 00:31:
It's a Diamond Voodoo 2: always install the official Diamond Monster Voodoo2 drivers first, then install the 3dfx reference ones over the […]

Thank you, this is interesting and confirms my memories of having problems with the reference drivers and these cards. I've installed the latest drivers on a fresh configuration (without the updates, IE 6.0, DX9 etc. that seem to make the OS a bit heavier) and at least the games seem to work just like before (apart from benchmarks like 3DMark2000 showing strange polygon rendering). Games run OK, maybe a bit slower than I expected, except for the MiniGL titles which run very fast; I was probably expecting too much. I suppose even the fastest P-MMX is not enough for the single 8MB model, and the 66MHz 430VX system is at its limit. I might try the K6-III 400 anyway. Unfortunately the board can't take two Voodoo IIs, and even a single one is quite long and sits too close to jumpers, capacitors and the CPU heatsink in front of most PCI slots.

About the final period of that company: even nowadays I remember the "sad" feeling of reading reviews and comments on their market situation, which with the latest products and choices had become a very difficult position. Considering that the Voodoo II was still being sold in the final STB-owned period, that the Voodoo3 was a sort of "boosted Banshee", and that the VSA-100 was a very late "what the Voodoo3 should have been" (which maybe should have been built in a later 0.18um version to reach at least 200MHz default clocks, without the memory-sync frequency problem)... it was an impossible situation. Still, I respect all the glory the first Voodoo Graphics had.

Reply 3 of 15, by TrashPanda

386SX wrote on 2022-02-09, 10:16:
Thank you, this is interesting and confirms my memories of having problems with the reference drivers and these cards. I've installed the latest drivers on a fresh […]

If they'd had another 6 months I'm sure they would have worked their problems out and we would have seen Rampage come to light, Rampage essentially being the VSA+ with T&L, but they just didn't have the cash and nVidia made an offer they had to take.

I'm 100% sure most of Rampage got rolled into nVidia's next GPU, along with SLI.

Reply 4 of 15, by 386SX

TrashPanda wrote on 2022-02-09, 10:36:

If they'd had another 6 months I'm sure they would have worked their problems out and we would have seen Rampage come to light […]

I often wonder if maybe the last train had already been missed by the Voodoo3 release. Apart from the OEM market, the specs of the Avenger chip didn't, imho, make sense in that period, and as discussed in the past, 32-bit color was powerful marketing and a difficult thing to "not have", regardless of how useful it actually was at the time. Many chips already had it even earlier, if only as a compatibility feature; I can't see how staying limited to 16-bit, and having to explain it, could have worked as a marketing strategy. But the multimedia side was also a problem, at least as a feature to talk about: S3 and even Trident and SiS (not to mention ATi) were already focusing on that, while the Avenger chip seemingly didn't care much about multimedia either (apart from the 3500 TV, another choice whose point I never understood).

Reply 5 of 15, by TrashPanda


Their last few years were a little odd, no doubt. I guess they sat on their asses a little after the Voodoo2, which put them behind both ATI and nVidia in development, and I never understood the point of a TV tuner on a 3D card.

Reply 6 of 15, by 386SX


I've usually read that there were also versions without the TV tuner, but I never saw one in the stores for the consumer market. It would have made more sense to just release that instead of the Voodoo3 3000 in that layout with only the TV out, but probably not many chips could reach 183MHz at 0.25um, and RAM cost was a limit too. Seen in perspective nowadays, it looks like everything was wrong: RAM size, rendering color depth, multimedia features, building the cards themselves... At the very least the Avenger chips should have been decoupled from the memory clock so they could have been pushed a bit harder at the factory without the RAM being the limit (important, but not that much, just considering the original GeForce2 MX specs), maybe with an already common fan/heatsink cooler.
And about the TV tuner: what was the point of putting their few best-clocking chips into a (probably expensive) high-end 3D card with a TV tuner? The concept of a TV tuner on a 3D card, while not a first, already felt like not the most useful idea, but using the fastest chip for it is something I'll never understand. The few features the Avenger chip had didn't make sense alongside a TV tuner. If that was the fastest card, it should have been released as soon as possible as a gaming-oriented card; instead, their next chip, the VSA-100, was again clocked at 166MHz.

Last edited by 386SX on 2022-02-09, 12:56. Edited 1 time in total.

Reply 7 of 15, by TrashPanda


I feel that the Voodoo3 was delayed too; it should have been released 6-8 months earlier than it was, which would have made it make sense in the market of the time, where 16-bit rendering would have been more than fine.

Reply 8 of 15, by Joseph_Joestar

386SX wrote on 2022-02-09, 10:54:

I often wonder if maybe the last train had already been missed by the Voodoo3 release. Apart from the OEM market, the specs of the Avenger chip didn't, imho, make sense in that period, and as discussed in the past, 32-bit color was powerful marketing and a difficult thing to "not have", regardless of how useful it actually was at the time.

I can attest to 32-bit color being hyped up by all the major outlets at the time. Heck, it was the reason why I (mistakenly) bought a TNT2 instead of a Voodoo3 back in the day. On the other hand, many cards which offered 32-bit color rendering did so with a non-trivial performance penalty, especially at higher resolutions. So it was a trade off.
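
To put a rough number on that penalty, here's a back-of-the-envelope sketch (not measured on any of these cards, and ignoring texture traffic and overdraw entirely) of how the per-frame framebuffer data grows when you go from 16-bit to 32-bit color with a 16-bit Z-buffer:

# Rough per-frame framebuffer traffic: one color write plus one 16-bit Z write
# per pixel. Real workloads differ a lot; this only illustrates why the 32-bit
# hit grows with resolution on bandwidth-limited cards.
def frame_traffic_mb(width, height, color_bits, z_bits=16):
    pixels = width * height
    color_bytes = pixels * color_bits / 8
    z_bytes = pixels * z_bits / 8
    return (color_bytes + z_bytes) / (1024 * 1024)

for w, h in [(640, 480), (800, 600), (1024, 768)]:
    mb16 = frame_traffic_mb(w, h, 16)
    mb32 = frame_traffic_mb(w, h, 32)
    print(f"{w}x{h}: ~{mb16:.1f} MB/frame at 16-bit vs ~{mb32:.1f} MB/frame at 32-bit ({mb32 / mb16:.1f}x)")

The color writes alone double in size, and even with the Z-buffer included the per-frame traffic grows by roughly 50%, which is why the hit was biggest at higher resolutions.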

Like Phil says in one of his videos, the Voodoo4 is what the Voodoo3 should have been. But it wasn't ready yet, so they had to make do.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 9 of 15, by 386SX


I think the Voodoo3 should never have been called... Voodoo "3". The expectations that the Voodoo Graphics and the Voodoo II / SLI had created in the 3D gaming/consumer market were so high that, imho, the specs of the Avenger chip could have been right as a Voodoo2 SLI alternative, released earlier and sold for a short time while a new architecture worthy of the "Voodoo3" name was being designed. Or, if that was the only design in development at the time, it should have been called Banshee II; that way it could have had a place, and made sense, in the mid-range market.
But still, looking around, every other company was accelerating, and fast. Even if many still lacked final frame-rate speed, the features were already there in most single chips, and it should have been "easy" to understand what consumers were waiting for. The last thing to do was to go down the ViRGE road, refreshing and refreshing a similar architecture and basically just buying time, I suppose.
Time was the main factor; in that specific tech period, months mattered like years.

Reply 10 of 15, by TrashPanda

Joseph_Joestar wrote on 2022-02-09, 13:00:

I can attest to 32-bit color being hyped up by all the major outlets at the time. Heck, it was the reason why I (mistakenly) bought a TNT2 instead of a Voodoo3 back in the day. On the other hand, many cards which offered 32-bit color rendering did so with a non-trivial performance penalty, especially at higher resolutions. So it was a trade off.

Like Phil says in one of his videos, the Voodoo4 is what the Voodoo3 should have been. But it wasn't ready yet, so they had to make do.

Sadly, even when it did release, the VSA-100 was woefully underpowered and even the Voodoo3 3000 could keep up with it; it was not till the Voodoo5 5500 that the VSA-100 could flex its muscles as the scalable architecture it was designed to be. I own a Voodoo4 4500 and it's not till you overclock it that it really comes alive, but you need to add better cooling than the card shipped with.

Reply 11 of 15, by 386SX

Joseph_Joestar wrote on 2022-02-09, 13:00:

I can attest to 32-bit color being hyped up by all the major outlets at the time. Heck, it was the reason why I (mistakenly) bought a TNT2 instead of a Voodoo3 back in the day. On the other hand, many cards which offered 32-bit color rendering did so with a non-trivial performance penalty, especially at higher resolutions. So it was a trade off.

Like Phil says in one of his videos, the Voodoo4 is what the Voodoo3 should have been. But it wasn't ready yet, so they had to make do.

I agree that the Voodoo"4" 4500 should (and imho could technologically) have been the real Avenger chip in the original release timeline. No SLI was needed beside some alternative high end extreme version cause it sound like to design a "race car" using two "motorcycle engines" that might perform fast but still not a good concept after all when analyzed as a single engine features/speed and if that single engine design isn't also a completely new architecture and more an updated someone at some point will probably ask why not to design a single new engine.
Anyway the "Voodoo" brand and the Glide API had to be left to the past at some point soon or late. ATi understood imho that changing from Rage brand to Radeon brand and with a very smart and time correct move but their previous architecture suffered only from a driver point of view, while the chips were capable and always updated to the period. 32 bit could have been useless I suppose none cared really about it but it should have been obvious that reviews couldn't be done when testing 32 bit capable games and if a "gamer card" can't be tested on it all the explanations would try to overcome that obvious limit. So most reviews were talking about that to explain to the readers why the speed graphs were missing the frame rates values in 32 bit tests.. a serious marketing complex situation I suppose.

Last edited by 386SX on 2022-02-09, 13:43. Edited 6 times in total.

Reply 12 of 15, by TrashPanda


32-bit rendering was like T&L, pixel shaders, unified shaders and then RT and AI after that: none of it is really needed at the time it's developed, but you can tell which companies failed to see the direction things were going by their absence from the industry today. It's the main reason AMD added RT functionality to RDNA2 and will hopefully add matrix AI cores to RDNA3.

AMD needs their tensor cores; if they fail to bring them to RDNA3 they risk getting left behind, as pure raster speed will only take you so far when AI is capable of accelerating everything more efficiently.

Reply 13 of 15, by 386SX


I suppose nowadays the consumer VGA market situation is so complex that (even if I don't follow modern video card news much, maybe just waiting for the lowest-end products) any video card with any architecture would be a success regardless of its features... probably an S3 ViRGE on PCI Express with modern OS drivers would still sell. 😁

Last edited by 386SX on 2022-02-09, 13:58. Edited 1 time in total.

Reply 14 of 15, by TrashPanda

386SX wrote on 2022-02-09, 13:35:

I suppose nowadays the consumer VGA market situation is so complex that (even if I don't follow modern video card news much, maybe just waiting for the lowest-end products) any video card with any architecture would be a success regardless of its features... probably an S3 ViRGE on PCI Express with modern OS drivers would still sell. 😁

Honestly... I wouldn't mind a ViRGE reboot. Build me a rock-solid 2D-accelerated card with basic DirectX / Vulkan support for Windows/Linux and I would buy one. I've got a good number of machines such a card would be perfect for, as they never get used for gaming but still have trash-tier slow GPUs in them that never work right under Linux.

Reply 15 of 15, by 386SX


I'm not very interested in modern gaming cards either; maybe I would just have liked to test some modern rendering engines to see today's rendering quality, but once I decided to look for a "modern" mid-range video card, prices became what they are now, even for very old video cards. So I don't even care anymore, but in the future I might buy a very low-end modern GPU to give some old but still usable everyday home computers a bit of a boost in modern features and compatibility.