VOGONS


Are Voodoo graphics cards THAT good?


Reply 100 of 183, by leileilol

User metadata
Rank l33t++

Only for the Voodoo is it an extension (WGL_3DFX_gamma_control). Otherwise, it'll call SetDeviceGammaRamp (and XF86VidModeSetGamma on Linux, which btw was regressed for at least a decade, and I've had to put up with so much grief over this regression it's not funny).
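For anyone curious what that non-Voodoo path looks like in practice, here is a minimal sketch (my own illustration, not code from any particular port) of driving the desktop gamma through GDI's SetDeviceGammaRamp; the gamma curve and the clamping are assumptions:

```c
#include <windows.h>
#include <math.h>

/* Minimal sketch: apply a simple power-curve gamma to the whole screen via GDI.
   Drivers exposing WGL_3DFX_gamma_control route this through the extension's own
   entry points instead (wglSetDeviceGammaRamp3DFX, if memory serves). */
static BOOL set_gamma(double gamma)   /* 1.0 = identity, >1.0 = brighter */
{
    WORD ramp[3][256];
    HDC  hdc = GetDC(NULL);           /* screen DC */
    BOOL ok;
    int  i;

    if (!hdc)
        return FALSE;

    for (i = 0; i < 256; i++) {
        double v = pow(i / 255.0, 1.0 / gamma) * 65535.0;
        if (v > 65535.0) v = 65535.0;
        ramp[0][i] = ramp[1][i] = ramp[2][i] = (WORD)v;  /* R, G, B channels */
    }

    ok = SetDeviceGammaRamp(hdc, ramp);  /* ramp = three 256-entry WORD arrays */
    ReleaseDC(NULL, hdc);
    return ok;
}
```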

apsosig.png
long live PCem

Reply 101 of 183, by Fusion

User metadata
Rank Member

I love my Voodoo 3. Got it overclocked nice and high and cooled with a 120mm fan. 😎 It's the PC I use the most. Great era of gaming, 1998-2001.

Pentium III @ 1.28Ghz - Intel SE440xBX-2 - 384MB PC100 - ATi Radeon DDR 64MB @ 200/186 - SB Live! 5.1 - Windows ME

Reply 102 of 183, by guest_2

User metadata
Rank Newbie

I'll resurrect this thread.
As already stated, the Voodoo 3 was an incredible card and perfect for ~1997-2001 gaming.
Quake II for the first time in 1024 x 768 without needing SLI or SLI combined with a 2D card and running at 80fps!? Unheard of in 1997

Reply 103 of 183, by 386SX

User metadata
Rank l33t

It has been discussed many times, I suppose. Anyway, even back in the early 2000's I had the same opinion, and the 3dfx story is far from epic. As someone linked, there's a YouTube conference video with former executives/employees where they were asked their opinion on this subject, and if I understood correctly some of them seem to agree that the STB decision was the point of no return - not for the idea itself (ATi built their own cards, or at least I don't know whether someone actually manufactured for them; in any case they were high-end, in-house PCBs not dependent on a third-party manufacturer, and they are still around with the Radeon line, now under AMD obviously).
I'd agree that it's more about the myth of 3dfx than about real, objective successes. The Voodoo1 was the best, most innovative card compared to its time and the alternatives; the Voodoo2 was "still" a good idea, even faster, and SLI was like having a 2x performance boost as a future upgrade. BUT it was at the limit of its period: add-on cards were already going out of fashion in the first half of the 90's, and considering how important video cards had already become, it could hardly still be considered a "video card" in itself.
What came later is a really painful memory. The Voodoo3, their first self-built card, came late, and the months spent on that transition to building their own boards probably left no resources or time for a genuinely new next-generation chip; it basically ended up as a boosted Banshee card.
As said above, it was obviously a fast card, but still an optimized, all-in-one take on the previous designs. The 32-bit marketing was just as powerful as the later DX7 features would be, even if in most cases there was basically no visible difference; reviews always underlined it, at a time when people read these things carefully and understood that the alternative cards were roughly just as fast and had more features.
With DirectX 7, the time of the Glide API came to an end; developers, imho, left it to the few titles still supporting it, OpenGL was competing as well, and T&L became what the 32-bit feature had been years before.
The VSA-100, for the third time, felt like another refresh of the same logic, and seeing a Voodoo4 4500 competing with a GeForce2 MX (the original, groundbreaking version) was telling.
So it looks like they did what they could, pushing the (more or less same/similar) architecture to a limit where the alternatives were running so much faster. The only step that, imho, might have saved them years earlier would have been to keep selling only the chips and put all the resources into a next-gen GPU without the "Voodoo architecture". Voodoo3 was a powerful name/brand, and it should have been obvious that people would have expectations from such a name - yet it ended up as a "Banshee II" instead of a REAL Voodoo "3", which could have been built later on a next-gen concept. Following the DirectX evolution might have been the right way. But time was probably the most important factor in those few years.

Reply 104 of 183, by gerry

User metadata
Rank Oldbie
guest_2 wrote on 2021-05-06, 22:44:

I'll resurrect this thread.
As already stated, the Voodoo 3 was an incredible card and perfect for ~1997-2001 gaming.
Quake II for the first time in 1024 x 768 without needing SLI or SLI combined with a 2D card and running at 80fps!? Unheard of in 1997

I'd agree for 1997 (and 1998!)

when AGP started to happen more and more, and especially after relatively good-value cards such as the TNT2 became available in 1999 (and were equal to the games of 1999), the 'special status' afforded the Voodoo was no longer that special for a person simply wanting to play a late-90's 3D game, in my opinion.

their place as trailblazers and leaders in graphics is assured of course. I also think their prices are grossly over the top now. Just look at their sold prices on eBay; if I was starting out in vintage computers and interested in late-90's gaming, I'd use that money to get a whole system with AGP circa 2000 instead and experience (just about..) any game just as well if not better

Reply 105 of 183, by bloodem

User metadata
Rank Oldbie

In my opinion, the most amazing thing about the Voodoo cards (and I'm referring to Voodoo 1/2/Banshee/3 in particular), was their ability to breathe new life into an otherwise very slow & aging platform (mostly thanks to the Glide API).
For example, in 1999, me and most of my friends had very slow CPUs (Pentium MMX or, at best, AMD K6-2), and I remember that, during the Summer of 1999, one friend in particular upgraded his PC from a Pentium MMX 166 to an AMD K6-2 450 & Riva TNT2 (which was amazing to me anyway, because at that point I was still stuck with an ATI Rage IIC, one of the worst "3D" cards to ever exist). Anyway, another friend of ours had a slower CPU (I believe it was a K6-2 300) with a Voodoo Banshee (which, objectively, was a slower, older and budget-oriented card compared to the new Riva TNT2)... however, virtually all games ran MUCH better, at very playable framerates on this older and (what should have been) slower PC, while most games struggled on my friend's new PC.
At that point in time I didn't understand why this was the case, it really seemed like "Voodoo magic" to me, but, yeah... as it turns out, lower CPU overhead was what made the Voodoo card seem magical. 😀

Some time ago, for this purpose, I specifically put together a PC with a Pentium MMX 233 OC @ 292 MHz & Voodoo 3 2000 PCI... and it's impressive! It's unbelievable how many games can run at decent, playable framerates @ 1024 x 768 on this system: Quake 2 - 50 FPS, Quake 3 (with a bit of tuning): 32 FPS, Unreal (mostly 30/40+ FPS with some dips to 15 FPS), Need for Speed 3 (unsure what FPS, but smooth as butter), Need for Speed 4 (slower than NFS3, with intermittent slowdowns, but still VERY playable). This might not seem like much nowadays, but at that point in time 15 FPS was very acceptable to me, 30 FPS would've been awesome and 50 FPS... insane! 😁

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 106 of 183, by 386SX

User metadata
Rank l33t

Sure, I agree that for even older platforms most of the early solutions were great, the Banshee included. In fact, back in 1998/99 I wish I had bought a Banshee instead of a Voodoo3 2000 for my K6-2 350, which performed slower than it could have with such a CPU; for the cost, a Banshee was already more than enough, or even a Voodoo1/2, considering I usually played at 640x480 or 800x600.
But after that, the Voodoo3 should never have been released that late under that name at all, IMHO. The Voodoo3 "had to be" a next-generation video chip, with higher core clocks and all the features other chips already had - 32-bit color for example, but not only that; think of Motion Compensation for MPEG-2 decoding, hardware texture compression, real AGP usage, and the kind of features that showed off R&D leadership the way EMBM did for Matrox, S3TC for S3 and, later, T&L for Nvidia.
Instead, the Avenger chip felt like a boosted, polished, optimized, die-shrunk Banshee. And if speed was the main target, the 3000 model made no sense: the 183 MHz part should have been the only version, maybe with a lower-end 125-143 MHz model at a lower price. The chip was already running hot at lower clocks; my card could not pass an absolute limit of 181 MHz, maybe because of the memory fitted, but in any case far from a factory-clocked target. Still, a 16-bit, 256x256-texture chip in 1999 would not have made much sense from a marketing point of view anyway.

Reply 107 of 183, by rasz_pl

User metadata
Rank l33t
bloodem wrote on 2021-05-07, 10:48:

For example, in 1999, me and most of my friends had very slow CPUs (Pentium MMX or, at best, AMD K6-2), and I remember that, during the Summer of 1999, one friend in particular upgraded his PC from a Pentium MMX 166 to an AMD K6-2 450 & Riva TNT2

He got suckered by AMD 🙁. The Celeron 300A was available from middle/late 1998 and sold slightly cheaper than the AMD K6-2 300MHz. At the end of 1998 it was already $90. The cheapest motherboards with 440BX chipsets started around $140 in 1998 and came down to $120 in 1999 - comparable in price to a "good" SS7 motherboard, "good" as in not freezing/glitching as often as the "bad" ones, but still crap when it comes to stability and AGP. Upgrading to a 300A + 440BX motherboard + SDRAM while selling the old CPU/motherboard/RAM combo was equal money if not cheaper and resulted in a better computer, and that's before we even think about overclocking the 300A with a few strips of insulating tape.

386SX wrote on 2021-05-07, 11:43:

with all the features other chips already had 32bit for example

Let's not forget 3dfx's 16-bit wasn't like other vendors' 16-bit, it was "22-bit like"
Nvidia's 16-bit looked really bad, even on the TNT2 - Re: Struggling to see difference between 16 bit and 32 bit colours
The push for 32-bit was a typical Nvidia scam, like tessellation going down to many triangles per pixel https://techreport.com/review/21404/crysis-2- … f-a-good-thing/, or Nvidia "helping" developers drop DirectX features it didn't like, like Assassin's Creed's removal of DX10.1 after joining the Nvidia "meant to be played" program https://techreport.com/news/14707/ubisoft-com … oversy-updated/, or GameWorks slowing down ATI/AMD users by up to 50% https://arstechnica.com/gaming/2015/05/amd-sa … -3-performance/ https://blogs.nvidia.com/blog/2015/03/10/the-witcher-3/ etc. etc. 32-bit was just a ruse Nvidia used in its war against 3dfx.
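To illustrate what "22-bit like" means: the framebuffer is ordered-dithered 16-bit, and (as far as I understand it) a small filter on the output path averages neighbouring pixels at scanout, so the dither pattern resolves back into in-between shades. The sketch below is only the general idea with an assumed 2x2 kernel - the real chips used their own kernel shapes and weights depending on chip and mode:

```c
#include <stdint.h>

/* Expand one RGB565 pixel to 8-bit-per-channel values. */
static void rgb565_to_rgb888(uint16_t p, int *r, int *g, int *b)
{
    *r = ((p >> 11) & 0x1F) * 255 / 31;
    *g = ((p >>  5) & 0x3F) * 255 / 63;
    *b = ( p        & 0x1F) * 255 / 31;
}

/* Illustrative "post filter": average a 2x2 neighbourhood of a dithered
   16-bit framebuffer at scanout. Because ordered dithering spreads the
   rounding error across adjacent pixels, the average recovers shades a
   plain 16-bit framebuffer cannot represent - hence "22-bit like".
   (Hypothetical kernel; real 3dfx hardware differed per chip.) */
static void filter_pixel(const uint16_t *fb, int w, int h, int x, int y,
                         int *out_r, int *out_g, int *out_b)
{
    int sr = 0, sg = 0, sb = 0, n = 0;
    for (int dy = 0; dy < 2; dy++) {
        for (int dx = 0; dx < 2; dx++) {
            int xx = x + dx, yy = y + dy;
            if (xx < w && yy < h) {
                int r, g, b;
                rgb565_to_rgb888(fb[yy * w + xx], &r, &g, &b);
                sr += r; sg += g; sb += b; n++;
            }
        }
    }
    *out_r = sr / n; *out_g = sg / n; *out_b = sb / n;
}
```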

386SX wrote on 2021-05-07, 11:43:

Motion Compensation for MPEG2 decoding

An Intel 450 MHz was enough to decode MPEG-2 fully in software, while I remember seeing a 300 MHz P2 with a RAGE Pro still dropping frames. I would love to see a performance comparison between different chip vendors' video decode acceleration solutions from the mid/late nineties, like this Video for CPUs - The Software Decode Reference Thread but expanded to concentrate on GPUs.
Imo it was marketing wank at this point, unless you made the mistake of buying AMD and a DVD drive (most likely costing more than your CPU and motherboard combined) in 1998 with the expectation of watching movies.

386SX wrote on 2021-05-07, 11:43:

hardware texture compression

It did: the SST-1 aka Voodoo1 supported "8-bit narrow channel YIQ (4-2-2)", and the Voodoo3 added FXT1 https://en.wikipedia.org/wiki/FXT1, slightly uglier at the same compression level as S3TC. Afaik nobody ever used FXT1 in a commercial game, and I also never heard about games shipping with YIQ support. It would be an interesting project to retro-backport YIQ support into GLQuake 😉
Btw, Nvidia didn't support any compression until the GeForce, and even that had ugly 16-bit accuracy interpolation - quality comparable to FXT1.
It was simply too early for compression: there were no standards, everyone was trying to promote their own patented crap, S3TC pretty much stole Apple's Road Pizza video codec algorithm, etc. It was finally Microsoft's decision to license S3TC for DirectX 6 that cemented a standard.
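For reference, here is roughly what a DXT1/S3TC block (the format Microsoft licensed for DirectX 6) actually stores - a minimal decoder sketch of my own, covering the opaque path plus the punch-through mode:

```c
#include <stdint.h>

/* Expand RGB565 to 8-bit-per-channel RGB. */
static void c565(uint16_t c, uint8_t rgb[3])
{
    rgb[0] = ((c >> 11) & 0x1F) * 255 / 31;
    rgb[1] = ((c >>  5) & 0x3F) * 255 / 63;
    rgb[2] = ( c        & 0x1F) * 255 / 31;
}

/* Decode one 8-byte DXT1 (S3TC) block into a 4x4 array of RGB texels.
   Layout: two RGB565 endpoints followed by sixteen 2-bit palette indices. */
static void dxt1_decode_block(const uint8_t block[8], uint8_t out[4][4][3])
{
    uint16_t c0   = block[0] | (block[1] << 8);
    uint16_t c1   = block[2] | (block[3] << 8);
    uint32_t bits = block[4] | (block[5] << 8) | (block[6] << 16) |
                    ((uint32_t)block[7] << 24);
    uint8_t pal[4][3];
    int i, x, y;

    c565(c0, pal[0]);
    c565(c1, pal[1]);
    if (c0 > c1) {                       /* opaque mode: two interpolated colours */
        for (i = 0; i < 3; i++) {
            pal[2][i] = (2 * pal[0][i] + pal[1][i]) / 3;
            pal[3][i] = (pal[0][i] + 2 * pal[1][i]) / 3;
        }
    } else {                             /* punch-through mode: midpoint + transparent */
        for (i = 0; i < 3; i++) {
            pal[2][i] = (pal[0][i] + pal[1][i]) / 2;
            pal[3][i] = 0;               /* index 3 marks a transparent texel */
        }
    }

    for (y = 0; y < 4; y++)
        for (x = 0; x < 4; x++) {
            int idx = (bits >> (2 * (y * 4 + x))) & 0x3;
            out[y][x][0] = pal[idx][0];
            out[y][x][1] = pal[idx][1];
            out[y][x][2] = pal[idx][2];
        }
}
```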

386SX wrote on 2021-05-07, 11:43:

AGP usage

Why tho? The i740 showed it was too early and the interface was too slow: as soon as you started fetching textures over AGP, performance tanked.
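Rough numbers make the point (theoretical peaks only; real AGP texturing also paid latency and arbitration costs, and the 128-bit/166 MHz local-memory figure is just an assumed Voodoo3 3000-class example):

```c
#include <stdio.h>

/* Back-of-the-envelope peak bandwidth comparison: AGP texturing vs. local
   video memory of the era. Nominal peaks only - real AGP texture fetches
   paid latency and bus arbitration on top of this. */
int main(void)
{
    /* AGP: 32-bit bus at 66 MHz, times the transfer multiplier. */
    double agp1x = 4.0 * 66e6 * 1;          /* ~266 MB/s nominal */
    double agp2x = 4.0 * 66e6 * 2;          /* ~533 MB/s nominal */
    double agp4x = 4.0 * 66e6 * 4;          /* ~1 GB/s nominal   */

    /* Example local memory: 128-bit SDRAM/SGRAM at 166 MHz. */
    double local = 16.0 * 166e6;            /* ~2.6 GB/s */

    printf("AGP 1x : %6.0f MB/s\n", agp1x / 1e6);
    printf("AGP 2x : %6.0f MB/s\n", agp2x / 1e6);
    printf("AGP 4x : %6.0f MB/s\n", agp4x / 1e6);
    printf("Local  : %6.0f MB/s (128-bit @ 166 MHz)\n", local / 1e6);
    return 0;
}
```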

386SX wrote on 2021-05-07, 11:43:

and later the T&L for Nvidia

That's what killed 3dfx: Nvidia committed to a 6-month design cycle and steamrolled the competition.

Open Source AT&T Globalyst/NCR/FIC 486-GAC-2 proprietary Cache Module reproduction

Reply 108 of 183, by bloodem

User metadata
Rank Oldbie
rasz_pl wrote on 2021-05-09, 11:23:

He got suckered by AMD 🙁. The Celeron 300A was available from middle/late 1998 and sold slightly cheaper than the AMD K6-2 300MHz. At the end of 1998 it was already $90. The cheapest motherboards with 440BX chipsets started around $140 in 1998 and came down to $120 in 1999 - comparable in price to a "good" SS7 motherboard, "good" as in not freezing/glitching as often as the "bad" ones, but still crap when it comes to stability and AGP. Upgrading to a 300A + 440BX motherboard + SDRAM while selling the old CPU/motherboard/RAM combo was equal money if not cheaper and resulted in a better computer, and that's before we even think about overclocking the 300A with a few strips of insulating tape.

Well, not quite... In my country (and in my city in particular) things were a "little" bit different. Intel parts/combos (even Celerons) were EXTREMELY expensive and completely out of reach for most people, not to mention that the Celeron 300A was probably unavailable most of the time - we only had a few small local retailers and the PC parts they had for sale were very limited.

I myself upgraded to an AMD K6-2 500 MHz in the Spring of 2000 (yes, I was poor), and I distinctly remember a few very specific things:
- the cheapest Intel combo alternative was almost twice the price that I paid for the K6-2 500 combo.
- all the AMD options were at the top of the A4 page (being cheaper/sorted by price in ascending order), while all Intel parts were at the bottom (there were no Athlon parts in my neck of the woods yet).
- the most expensive Intel combo on that price list was based on a 750 MHz Coppermine and had an absolutely INSANE price (I think somewhere around 10x - 15x the price I paid for my K6-2 combo).

Other than that, yeah, I agree. The Celeron 300A would've been the deal of the century...
Still, I loved my K6-2 CPU, and even though I had a very cheap motherboard (Fastfame with the Ali Aladdin V chipset), I never had AGP issues... at least not with my Riva TNT2 M64. I did have some RAM compatibility issues when I bought a new 128MB SDRAM PC133 module, which I eventually had to replace with a PC100 module that finally worked properly.

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 109 of 183, by 386SX

User metadata
Rank l33t
rasz_pl wrote on 2021-05-09, 11:23:
Lets not forger 3dfx 16 bit wasnt like other vendors 16bit, it was "22-bit like" Nvidia 16bit looked really bad, even TNT2 Re: S […]

It wasn't easy to get on the "22-bit train" if people could actually only "see" 16-bit or 32-bit in games; I also remember testing the "22-bit option" while running Quake 3, and the differences were really far from visible. The 32-bit internal rendering of the Kyro II, compared to that, was like another planet. I'd agree that most of these features, AGP included, were tech advances that didn't really make a difference, at least not in the short term. But marketing, strategy and vision are still things that companies at that level should anticipate, especially considering how many players, for better or worse, were out there in the second half of the 90's.
I agree that AGP at the time didn't make much sense, beyond perhaps some initial expectation that game complexity, texture sizes and so on would grow faster than they actually did; at that point, relying on local video memory was certainly the more obvious choice, considering how fast it became and the whole bandwidth race once DDR and the rest arrived.
About Motion Compensation: it's true that a high-end Pentium II would be enough, but not entirely, imho, if you also wanted a functional multitasking OS - and we're still talking about a low-requirement OS like W98. I tried many solutions, even PCI hardware decoder cards I still own; in the end the best GPU-side tech was ATi's, which from the Rage 128 Pro and Rage Mobility onwards offered both Motion Compensation and IDCT, and together with PowerDVD, WinDVD or the original ATi DVD Player it gave great results even on low-clocked CPUs, with CPU usage going down to 30%, whereas Motion Compensation alone usually only got you down to 60-70% CPU usage, if not more.
Hardware DVD decoders were the best - at least the only ones that struck a good balance between compatibility, video decoding quality and features - and I'm talking about the Sigma Hollywood+ cards. Unfortunately LCD screens were about to become a thing, and together with ever more powerful CPUs/GPUs the dedicated hardware lost its appeal compared to its downsides (the VGA pass-through cable degrading the signal, and/or the card's own VGA output, which needed a strong VGA card behind it, like the Matrox ones, imho). But in 2000, not having at least Motion Compensation really felt like being on the cheap side. In the end, even more powerful CPUs benefited from offloading those tasks on the desktop, because a DVD wasn't always watched fullscreen - often it sat in a window in a corner - and that was heavy on the CPU if it weren't for those early GPU decode accelerators.
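For context, what the chip is offloading there is essentially "copy a block from the reference frame at a motion-vector offset, then add the decoded residual". A toy full-pel sketch of the luma path (my own illustration - real MPEG-2 also does half-pel interpolation and field prediction, and the IDCT stage is the extra part ATi moved onto the chip):

```c
#include <stdint.h>

/* Toy full-pel motion compensation for one 16x16 luma macroblock.
   ref: previous (reference) frame, cur: frame being reconstructed,
   both stored as width*height 8-bit luma planes.
   (mvx, mvy): motion vector in whole pixels; residual: decoded IDCT output.
   Real MPEG-2 adds half-pel interpolation, field/frame prediction and
   vector clipping - omitted here for clarity. */
static void mc_macroblock(const uint8_t *ref, uint8_t *cur,
                          int width, int mbx, int mby,
                          int mvx, int mvy, const int16_t residual[16][16])
{
    int x0 = mbx * 16, y0 = mby * 16;
    for (int y = 0; y < 16; y++) {
        for (int x = 0; x < 16; x++) {
            int pred = ref[(y0 + y + mvy) * width + (x0 + x + mvx)];
            int val  = pred + residual[y][x];          /* prediction + residual */
            if (val < 0)   val = 0;                    /* clamp to 8-bit range */
            if (val > 255) val = 255;
            cur[(y0 + y) * width + (x0 + x)] = (uint8_t)val;
        }
    }
}
```

Doing that, plus the IDCT that produces the residuals, for every macroblock of every frame is what ate the CPU; chips that also took over the IDCT stage, like the ATi parts mentioned above, naturally dropped CPU usage further.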

But going back to 3dfx: the decision to build their own cards would have made sense if taken early, when competitors were still working out how to move into that 3D fight - not later. When the Voodoo3 came out and I read the reviews (and bought one), it was clear that the point of no return had probably already been passed unless a new chip was released as soon as possible. When the VSA came out, I felt it was the end.

Reply 110 of 183, by Carrera

User metadata
Rank Member

As a former 3Dfx beta tester, the whole 32-bit thing was a pain in the butt public-relations wise.
Nvidia had 32-bit and people were taking shots at 3Dfx for it.
At the time, though, it didn't make much difference visually.
Which was a problem, because 3Dfx didn't address it fast enough (there were other factors, such as STB not getting up to speed fast enough in general).

Is it me or did the 16/22/24/32 bit color thing happen about the time when there was a huge take up on benchmarking in general?

Personally, I didn't see a big difference between my Voodoo2 SLI and Voodoo 3, and when I finally went Nvidia I left the Voodoo2s in and switched between 32-bit and 16-bit... I personally didn't see much difference, but then again I love people purely for their inner values and personality and not their looks....

Oh just a second, Pam Andersen is calling...
😁

Reply 111 of 183, by Putas

User metadata
Rank Oldbie
Carrera wrote on 2021-05-10, 12:38:

Is it me or did the 16/22/24/32 bit color thing happen about the time when there was a huge take up on benchmarking in general?

Yes, that time when screenshots were rare and explaining why 3dfx 16-bit screenshots do not capture real quality was for the top 1 % geeks.

Reply 112 of 183, by 386SX

User metadata
Rank l33t

Imho it wasn't only the 32-bit color "problem"... most of the cards of that era were good for gaming - we could even say that of the VSA-100 and its SLI solution. The point, I suppose, is that if a company is big and healthy enough to let a hypothetical R&D department design a new-generation GPU, maybe it can skip one generation of the race and focus on the next one. It reminds me of the differences between the NV30 and NV40 GPUs.
But I always imagined the time factor had a huge impact on missing that last train.

Then we can all agree that 16-bit vs 32-bit was never really the problem, just as nowadays the whole FullHD vs 4K resolution or 120fps discussion is something nobody should care that much about... only a few games, and a few in-game moments, could show the benefit, and it wasn't that big. Sure, there were games where 32-bit had more impact, like very dark games with many light effects. But then add to that the lack of full MPEG-2 decoding stages, of full AGP 1x/2x/4x usage, the texture size limits, the lower core and memory clocks, the missing DX7 features... in the end I suppose people began to look around and saw that others were already ahead, or had taken other roads such as the professional markets.
In the end there were plenty of companies with good solutions that, one way or another, closed down their video card business over the years. It was a difficult time, I suppose; look at what happened in the game console market with the Saturn vs PlayStation situation, for example. One wrong design step and things became difficult in such competitive markets.

Reply 113 of 183, by douglar

User metadata
Rank Oldbie

The Voodoo 4 was late and underperformed. If it had come out in Q1 2000, things might not have been too bad, but instead it came out 6 months after ATI's Rage, a full year after the GeForce 256, and the target market was getting salted by the GeForce2 MX. 3dfx acquired the overhead of manufacturing plants when they acquired STB. Those factories needed to keep running for 3dfx to survive, and they didn't have the OEM contracts that ATI had to keep factories going through a lean year. So when the Voodoo 4 turned out late and low-end, the STB manufacturing ground to a halt and ended up being an anchor that pulled them underwater instead of increasing efficiency.

https://www.anandtech.com/show/641/13

With a street price of around $150, the Voodoo4 4500 is going to be a very hard sell. As a $100 or sub-$100 card, the Voodoo4 4500 does have some justification behind it, unfortunately in many cases it is simply too little too late for 3dfx.

The Voodoo4 4500 would have been perfect around the introduction of the Voodoo3 3500TV or before NVIDIA brought the GeForce2 MX to market, however now that you can pick up a GeForce2 MX with TwinView support for around the same price, or one without the feature for a little less, the Voodoo4 4500 definitely loses its appeal.

While we criticized ATI's introduction of the Radeon SDR at the $150 price point, it definitely offers more bang for your buck than the Voodoo4 4500 does.

The problem with the Voodoo4 4500 is mainly that it's lacking in the fill rate department, and while overclocking would be able to help solve that we weren't able to push the card far enough as a 5% overclock was the highest we could achieve. It would take much more to get the Voodoo4 4500 to the point where it could compete in the fill rate department.

While 3dfx could theoretically outfit the Voodoo4 with hand picked 183MHz VSA-100 chips, it doesn't make sense for them to do that now. Their concentration should most definitely be on getting the next product out while keeping their head above water for now, it's definitely been a very bumpy ride for the company that at one point was the undisputed king of the 3D accelerator world.

Reply 114 of 183, by 386SX

User metadata
Rank l33t

Interesting article quote, and I'd agree with it, imho. Also, comparing the 4500 with the Radeon SDR isn't fair to the latter. The R100, even in the SDR version, was still a great-quality PCB with a huge next-gen GPU, a passive heatsink, a great DVD decoding engine, ATi's own TV-out chip, and it was great for older computers too. In fact I think it's the best card to put in a K6-2/3 build.
The Voodoo4 4500, as we knew it, might have worked as a "corrected" Voodoo3 had it been released exactly when the Voodoo3 was meant to reach the market, but with that 183 MHz target I mentioned, and maybe even some way to decouple the chip clock from the memory clock - a sync that was absurd at that point.
At that point they could still have sold a reloaded Voodoo2, or some boosted Banshee (the V3 2000) under another name (Banshee II, for example) at low prices. Building their own cards no longer had any point (there was no time left, imho); the focus should have gone entirely to a new-generation GPU, leaving the multi-chip ideas in the 90's where they belonged.

Reply 115 of 183, by 386SX

User metadata
Rank l33t

And anyway, I keep coming back to what is said in this good interview, linked below, after the 1:49:32 mark, with direct questions and clear answers about what happened..

3dfx Oral History Panel @ Computer History Museum YouTube channel:

https://www.youtube.com/watch?v=3MghYhf-GhU

The question at 2:24:39 is also interesting.

Reply 116 of 183, by bloodem

User metadata
Rank Oldbie

Awesome interview, indeed. I've seen it multiple times over the past years.
It always makes me sad, though, to hear Gordy Campbell's words at 1:54:01. You can really tell that he's still very upset about 3dfx's untimely demise, especially because he clearly states that he was against the decision to acquire STB. If only the other board members had listened to him...

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 117 of 183, by 386SX

User metadata
Rank l33t
bloodem wrote on 2021-05-10, 19:02:

Awesome interview, indeed. I've seen it multiple times in the past years.
It always makes me sad, though, when hearing Gordy Campbell's words at 1:54:01. You can really tell that he's still very upset about 3dfx's untimely demise. Especially because he clearly states that he was against this decision to acquire STB. If only the other board members would've listened to him...

Yeah, it seems they agree that becoming a PCB manufacturer wasn't the best idea. In my opinion it might have worked for a company like ATi, which had been doing it since the early 90's and kept doing it into the early 2000's and beyond on both fronts, with the "Built By" and "Powered By" approach - and that would have made a good question (along with other questions I would certainly have asked with that much time available; it was interesting to hear the history of the company, but the best parts, the final years, deserved more focus).
Listening to the interview puts things in the right perspective, and it's easy to understand how a single decision, which might take months to complete, can turn out to be either the worst ever or the best ever. I think it's clear the 90's were a fast, enthusiastic era for new companies entering the market - not just the 3D market, but many others too - in a way that probably will never happen again.
I think I heard a part where it was said that the Voodoo3 was known internally as Banshee. I might have heard wrong, but even if that's a false memory it wouldn't surprise me, and it would only reinforce the point above: the Voodoo3 was the quickest card to build after the PCB acquisition, and it never should have been called Voodoo3 at all.
But imho the important words come after 1:49:32, where the "time factor" is more or less spelled out - they found themselves late with the next-generation product: "..but ultimately we ran out of the time". What I still don't understand is how it was possible to invest in a project like the V5 6000, which must have taken many people and resources to engineer and yet never reached a final reference card, instead of trying to survive however possible, even by pushing their architecture with some modern "Turbo" concept, basically overclocking.

Reply 118 of 183, by cyclone3d

User metadata
Rank l33t++
rasz_pl wrote on 2021-05-09, 11:23:
bloodem wrote on 2021-05-07, 10:48:

For example, in 1999, me and most of my friends had very slow CPUs (Pentium MMX or, at best, AMD K6-2), and I remember that, during the Summer of 1999, one friend in particular upgraded his PC from a Pentium MMX 166 to an AMD K6-2 450 & Riva TNT2

He got suckered by AMD 🙁. The Celeron 300A was available from middle/late 1998 and sold slightly cheaper than the AMD K6-2 300MHz. At the end of 1998 it was already $90. The cheapest motherboards with 440BX chipsets started around $140 in 1998 and came down to $120 in 1999 - comparable in price to a "good" SS7 motherboard, "good" as in not freezing/glitching as often as the "bad" ones, but still crap when it comes to stability and AGP. Upgrading to a 300A + 440BX motherboard + SDRAM while selling the old CPU/motherboard/RAM combo was equal money if not cheaper and resulted in a better computer, and that's before we even think about overclocking the 300A with a few strips of insulating tape.

Ehhhh... I went from an AMD 5x86-133 that I had overclocked to 160, to an AMD K6 something, and then to a K6-2, eventually topping out with a K6-2 550 that I overclocked to 660 MHz on an ASUS P5A - though I am pretty sure I didn't have it overclocked like that all the time due to it running so hot.

There weren't any games I wanted to play that didn't run great on that system. I actually never had an Intel system between a 486 DX2-66 and a C2Q 6600. It was all AMD and it was all fun.

Yamaha modified setupds and drivers
Yamaha XG repository
YMF7x4 Guide
Aopen AW744L II SB-LINK

Reply 119 of 183, by dr.zeissler

User metadata
Rank l33t

Did some testing with MAGESLAYER.

PIII 500 / RageProLT and PCX1
K6 450 / RageIIC and PCX2
P1-233 MMX / MGA + Voodoo1

Man, the Voodoo1 runs so smooth with the Glide-supported game drivers of Mageslayer.
But as shown in various comparisons on YouTube... the deep colors are mostly gone when using 3dfx.

Last edited by dr.zeissler on 2021-05-11, 07:48. Edited 1 time in total.

Retro-Gamer 😀 ...on different machines