VOGONS


Are Voodoo graphics cards THAT good?


Reply 160 of 183, by 386SX

Rank l33t

It's just one user's opinion of course, but I still think that in that specific period the real beginning of the end was the Avenger chip; not necessarily because of the 16-bit color, but because of the whole line-up. 😉
I understand they needed to sell some cards, probably to finish the next chips, but there was still no reason for the VSA-100 to come out feeling like the Voodoo3 we never got. And why, oh why, call the Avenger "Voodoo3"? The only other similar situation I remember is the GeForce FX release story, but that lasted only one generation, and the number of FX 5200s sold plus the better-optimized high-end cards that came later probably compensated for it (after a long list of previous impressive successes); without the NV40, though, the cards on the table might have looked really similar to those 3dfx days. Sure, 32-bit color wasn't such a needed feature until it simply became the ordinary thing to have and the old 16-bit was forgotten; even fixed-function T&L wasn't, considering how powerful CPUs became later. But it was still the road everyone had to take to build a new chip, or else change market sector, as many did. The Voodoo3 meant time: time to wait for stable 183 MHz chips (at 250 nm forever and ever...), time to wait for cheaper 183 MHz SDR RAM modules... and the Voodoo3 3500 arriving that late, with the TV tuner; if versions without the TV tuner existed, I never saw them in stores, or even in local newspaper reviews.
There was a point of no return, and even the interview mentioned above talks about that... what came later is just proof of what had already happened before, imho.

Reply 161 of 183, by bloodem

Rank Oldbie
appiah4 wrote on 2021-07-30, 10:51:

Nobody's defending 3dfx's complacency; however, the feasibility of 32-bit color gaming was a non-issue until the GeForce 2 onwards. The fact is, 3dfx did not lose because the Voodoo 3 lacked 32-bit rendering. On the contrary, the Voodoo 3 sold by the boatload against the TNT2/Rage128. They lost because they showed up with nothing against the GeForce/Radeon, and the Voodoo 4/5 were too late and too slow compared to the GeForce 2. Lacking 32-bit on the Voodoo 3 had nothing to do with it whatsoever.

Again, this is not just about 32-bit support. It's about lack of innovation, it's about not giving your users the choice between higher performance and higher image quality... and, yeah, as you said, overall complacency. 😀
Also, in a previous post I mentioned multiple games where 32-bit was in fact possible and 100% playable even with Riva TNT2 cards. Try Tomb Raider 2, 3, The Last Revelation, Chronicles and many other adventure games of the time which had a 30 FPS cap, and you'll see what I mean. There was literally NO performance hit at 1024 x 768 x 32 (which was a HUGE resolution), while the games looked much better compared to 16-bit (or, ehm, "22-bit" 😁 ).
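
To put some rough numbers behind that (my own back-of-the-envelope math, not something from the old reviews): the extra framebuffer traffic that 32-bit color adds at 1024x768 is modest in absolute terms, and a 30 FPS cap keeps it far below what a TNT2-class memory bus could sustain. The little C program below only counts one color write plus a 16-bit Z read/write per pixel per frame, ignoring overdraw, blending and texture fetches, so treat the figures as illustrative only.

#include <stdio.h>

/* Rough framebuffer-traffic estimate at 1024x768: one color write per pixel
 * per frame plus a 16-bit Z read + write. Overdraw, blending and texture
 * fetches are ignored, so these are lower bounds for illustration only. */
int main(void)
{
    const double pixels  = 1024.0 * 768.0;
    const double z_bytes = pixels * 2.0 * 2.0;          /* Z read + write, 2 bytes each */

    for (int bpp = 16; bpp <= 32; bpp += 16) {
        double frame = pixels * (bpp / 8.0) + z_bytes;  /* bytes per frame */
        printf("%2d-bit color: %4.1f MB/frame -> %4.0f MB/s at 30 fps, %4.0f MB/s at 90 fps\n",
               bpp, frame / 1e6, frame * 30.0 / 1e6, frame * 90.0 / 1e6);
    }
    return 0;
}

With a card pushing a couple of GB/s of memory bandwidth, the 30 FPS rows are nowhere near the limit, which is why the capped games I listed simply don't care about color depth; it's the uncapped, overdraw-heavy games where the 16-bit advantage shows up in benchmarks.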

I remember very well what my friends and I were thinking at the time, and literally NONE of us were thinking about FPS. As a matter of fact, FPS was never mentioned in our conversations. It was merely a matter of "does that game run well or not? does it stutter or not?". And here's the kicker: "stuttering" to us meant that the game would drop below 10 - 15 FPS. Anything above 20 FPS was generally very playable to us.
However, what I distinctly remember is us gasping in awe at the latest game screenshots - like the leaked Max Payne screenshots from 2000 - and saying that "it looks so insanely good that one day, in future games, WE will be Max Payne". 🤣 Crazy high school idiots...
Seriously, I myself was willing to sacrifice as much performance as possible, just so that I could play a game on very high details. And to this day, I do the same. I'm solely interested in single player games, don't really care for multiplayer, battle royale and all that crap. So to me performance comes second, quality first. When I was young, 20 - 30 FPS was more than enough performance for me. Now I do prefer at least 40 - 45 FPS... so there's that. 😀

Last edited by bloodem on 2021-07-30, 17:11. Edited 1 time in total.

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 162 of 183, by Scooby D

Rank Newbie

3Dfx was in the right place at the right time.
The Glide API was quickly adopted by developers, and it got a lot of publicity from Tomb Raider, Quake and the like.
For those reasons 3dfx Glide has the advantage of compatibility with the games of that era. Other than that, it's just another API.

Reply 163 of 183, by The Serpent Rider

Rank l33t++

Voodoo 3 had 16 MB of VRAM when all competitors had 32 MB, 16-bit color only vs 32-bit for all competitors, no trilinear filtering, support for 256x256 textures only, no stencil buffer, no solid OpenGL driver at launch, and crappy passive cooling on all cards or none at all. To add the "cherry on top", the Voodoo 3 was really overpriced compared to the feature-rich offerings of the competitors. Makes you wonder why 3dfx didn't flop before the VSA-100, with such a "killer product". But oh well, the power of Glide was still a thing, I suppose.
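
Just to put that 16 MB into perspective, a quick memory-budget sketch (my own rough numbers, counting only double-buffered color plus a 16-bit Z buffer at 1024x768, no AGP texturing):

#include <stdio.h>

/* Quick video-memory budget at 1024x768: front + back color buffers plus a
 * 16-bit Z buffer. AGP texturing and driver overhead are ignored; the point
 * is only how much local memory is left over for textures. */
int main(void)
{
    const double mib    = 1024.0 * 1024.0;
    const double pixels = 1024.0 * 768.0;
    const double z      = pixels * 2.0;                /* 16-bit Z buffer */

    double buffers16 = pixels * 2.0 * 2.0 + z;         /* 2 x 16-bit color + Z */
    double buffers32 = pixels * 4.0 * 2.0 + z;         /* 2 x 32-bit color + Z */

    printf("16 MB card, 16-bit: %.1f MiB for buffers, %.1f MiB left for textures\n",
           buffers16 / mib, 16.0 - buffers16 / mib);
    printf("32 MB card, 32-bit: %.1f MiB for buffers, %.1f MiB left for textures\n",
           buffers32 / mib, 32.0 - buffers32 / mib);
    return 0;
}

So even after paying for 32-bit buffers, a 32 MB card keeps roughly twice the local room for textures.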

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 164 of 183, by silikone

Rank Member

I always saw Voodoo's strength in the software. Compatibility in an age of API immaturity, and low overhead on systems that still had crippled floating-point capability, were crucial. Old benchmarks really show how little some cards did to take load off the CPU.
Glide was fortunately very similar to OpenGL, so that would have helped them keep things consistent.
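
For anyone who never touched it, this is roughly what a Glide 2.x program looked like. I'm writing it from memory rather than against an actual SDK header, so treat the constants and the grSstWinOpen() parameter list as approximate; the point is how thin the layer is, since the application hands the rasterizer pre-transformed screen-space vertices and there is almost nothing for the driver to do in between.

/* A minimal Glide 2.x-style triangle, sketched from memory -- constants and
 * the grSstWinOpen() signature may not match a given SDK revision exactly. */
#include <glide.h>

int main(void)
{
    GrHwConfiguration hwconfig;
    GrVertex a = { 0 }, b = { 0 }, c = { 0 };

    grGlideInit();
    grSstQueryHardware(&hwconfig);
    grSstSelect(0);
    grSstWinOpen(0, GR_RESOLUTION_640x480, GR_REFRESH_60Hz,
                 GR_COLORFORMAT_ARGB, GR_ORIGIN_UPPER_LEFT, 2, 1);

    /* Screen-space coordinates and per-vertex colors; no transform pipeline,
     * no state validation layers between the game and the chip. */
    a.x = 320.0f; a.y = 100.0f; a.r = 255.0f; a.g = 0.0f;   a.b = 0.0f;
    b.x = 480.0f; b.y = 380.0f; b.r = 0.0f;   b.g = 255.0f; b.b = 0.0f;
    c.x = 160.0f; c.y = 380.0f; c.r = 0.0f;   c.g = 0.0f;   c.b = 255.0f;

    grBufferClear(0x00000000, 0, GR_WDEPTHVALUE_FARTHEST);
    grDrawTriangle(&a, &b, &c);
    grBufferSwap(1);

    grGlideShutdown();
    return 0;
}

On a Pentium-class CPU with weak floating point, having that little work between the game and the rasterizer mattered a lot.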

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 165 of 183, by Putas

Rank Oldbie
The Serpent Rider wrote on 2021-07-30, 14:04:

Voodoo 3 had 16 MB of VRAM when all competitors had 32 MB, 16-bit color only vs 32-bit for all competitors, no trilinear filtering, support for 256x256 textures only, no stencil buffer, no solid OpenGL driver at launch, and crappy passive cooling on all cards or none at all. To add the "cherry on top", the Voodoo 3 was really overpriced compared to the feature-rich offerings of the competitors. Makes you wonder why 3dfx didn't flop before the VSA-100, with such a "killer product". But oh well, the power of Glide was still a thing, I suppose.

What? 16 MB is rarely limiting. Trilinear filtering is of course supported and passive cooling is an advantage.

Reply 166 of 183, by leileilol

Rank l33t++
appiah4 wrote on 2021-07-30, 07:57:

I keep pointing this out and nobody listens. You really need perspective on how 32-bit performed until, say, the GeForce 2. It was basically a bullet-point feature and a bragging right. Yes, it looked better, but it performed like ass.

The G400, Savage4/2000 and TNT2 Ultra weren't that bad at 32-bit, and then there's the no-32bit-frame-loss-at-all PowerVR Neon250.

Last edited by leileilol on 2021-07-30, 18:41. Edited 1 time in total.

long live PCem

Reply 167 of 183, by The Serpent Rider

Rank l33t++

What? 16 MB is rarely limiting.

Yeah, we all know how it ended for 3dfx with their marketing gimmicks.

Trilinear filtering is of course supported

Trilinear filtering is supported only on paper. In reality, only a dithered approximation is available to the user. The same goes for the TNT cards, but they had other things going for them.
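
To make the distinction concrete, here is a rough sketch of the two approaches in general terms (a simplified pseudo-implementation, not 3dfx's or NVIDIA's actual hardware logic; bilinear_sample() is a made-up stand-in):

#include <math.h>
#include <stdio.h>

/* Stand-in for a real bilinear fetch from one mip level -- here it just
 * returns a value that varies with the level so the blend is visible. */
static float bilinear_sample(int mip_level, float u, float v)
{
    (void)u; (void)v;
    return 1.0f / (float)(mip_level + 1);
}

/* Real per-pixel trilinear: blend the bilinear results of the two nearest
 * mip levels by the fractional part of the level of detail. Costs a second
 * set of texture samples for every pixel. */
static float trilinear_sample(float lod, float u, float v)
{
    int   base = (int)floorf(lod);
    float frac = lod - (float)base;
    float lo = bilinear_sample(base,     u, v);
    float hi = bilinear_sample(base + 1, u, v);
    return lo + (hi - lo) * frac;
}

/* Dithered approximation: pick a single mip level per pixel, nudged by a
 * screen-space pattern so the level transition is masked. Only one set of
 * samples per pixel, which is why it is the cheap option. */
static float dithered_sample(float lod, float u, float v, int px, int py)
{
    float nudge = ((px ^ py) & 1) ? 0.5f : 0.0f;   /* crude checkerboard */
    return bilinear_sample((int)(lod + nudge), u, v);
}

int main(void)
{
    printf("trilinear at lod 1.7: %.3f\n", trilinear_sample(1.7f, 0.5f, 0.5f));
    printf("dithered  at lod 1.7: %.3f / %.3f (adjacent pixels)\n",
           dithered_sample(1.7f, 0.5f, 0.5f, 0, 0),
           dithered_sample(1.7f, 0.5f, 0.5f, 1, 0));
    return 0;
}

The real blend needs a second set of texture samples per pixel; if both texture units are already busy with multitexturing, that capacity isn't there, which is where the dithered fallback comes in.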

and passive cooling is an advantage.

Not if your chip heats up like the devil's armpits. It was justified for the ATi and Matrox chips, but not for Avenger.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 168 of 183, by leileilol

Rank l33t++
The Serpent Rider wrote on 2021-07-30, 18:41:

Trilinear filtering is supported only on paper.

Are you sure about that?

Attachments

long live PCem

Reply 169 of 183, by The Serpent Rider

Rank l33t++

Yes, I am aware that you can force trilinear without multitexturing.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 170 of 183, by appiah4

Rank l33t++
leileilol wrote on 2021-07-30, 18:40:
appiah4 wrote on 2021-07-30, 07:57:

I keep pointing this out and nobody listens. You really need perspective on how 32-bit performed until, say, the GeForce 2. It was basically a bullet-point feature and a bragging right. Yes, it looked better, but it performed like ass.

The G400, Savage4/2000 and TNT2 Ultra weren't that bad at 32-bit, and then there's the no-32bit-frame-loss-at-all PowerVR Neon250.

I have a soft spot for the G400, but its OpenGL performance STANK for most of its viable lifetime. As for the TNT2 Ultra, I kindly disagree that 32-bit was feasible for gaming on that card. I know nothing about the Neon250, but I guess that must be the same tech as what was in the Kyro? The Kyro was very special, but that was in 2000. If PowerVR had a similar chip in 1999, I did not know about it.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 171 of 183, by Joseph_Joestar

Rank l33t
leileilol wrote on 2021-07-30, 18:40:

The G400, Savage4/2000 and TNT2 Ultra weren't that bad at 32-bit, and then there's the no-32bit-frame-loss-at-all PowerVR Neon250.

I had a TNT2 (not Ultra) back in the day, and using 32-bit color in any resolution above 640x480 incurred a non-trivial performance hit.

Contemporary benchmarks like this one showcase it quite well.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 172 of 183, by Putas

Rank Oldbie
The Serpent Rider wrote on 2021-07-30, 18:41:

What? 16 MB is rarely limiting.

Yeah, we all know how it ended for 3dfx with their marketing gimmicks.

Trilinear filtering is of course supported

Trilinear filtering is supported only on paper. In reality, only a dithered approximation is available to the user. The same goes for the TNT cards, but they had other things going for them.

and passive cooling is an advantage.

Not if your chip heats up like the devil's armpits. It was justified for the ATi and Matrox chips, but not for Avenger.

You are simply wrong on all these points.

leileilol wrote on 2021-07-30, 18:40:

The G400, Savage4/2000 and TNT2 Ultra weren't that bad at 32-bit, and then there's the no-32bit-frame-loss-at-all PowerVR Neon250.

Neon 250 does not have the bandwidth for "free" 32-bit. Perhaps you mean the chip itself does not suffer slowdowns, but that was common for the time; the Voodoo3 chip does everything at true color internally as well.
Rage 128 should have an honourable place as a 32-bit enabler.

Reply 173 of 183, by silikone

Rank Member
Joseph_Joestar wrote on 2021-07-30, 20:21:
leileilol wrote on 2021-07-30, 18:40:

The G400, Savage4/2000 and TNT2 Ultra weren't that bad at 32-bit, and then there's the no-32bit-frame-loss-at-all PowerVR Neon250.

I had a TNT2 (not Ultra) back in the day, and using 32-bit color in any resolution above 640x480 incurred a non-trivial performance hit.

Contemporary benchmarks like this one showcase it quite well.

How did ATI keep doing it?

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 174 of 183, by kjliew

Rank Oldbie

The relevance of 32-bit color seems to be taking over the topic. 😉

Voodoo1 was great at its time of launch, no doubt, even with all its quirks, like requiring a companion 2D card. Its 3D quality and performance were unmatched in the gaming market it addressed. The game industry responded overwhelmingly: many games supported it, and Glide became the proprietary API of choice, highly sought after. I would bet that every game that supported Glide sold in no time. It was the era when SVGA 640x480 256-color inspired awe next to VGA 320x200 256-color. Wasn't VGA 256-color awesome compared to CGA/EGA? 16-bit color (i.e. 65536 colors, wow!) was so great.

IMHO, Voodoo2 wasn't all that great. It was hyped, and for many it was simply an easy upgrade from Voodoo1, a no-brainer. Outside of sheer performance and heat, there was nothing impressive feature-wise; it ended up as a desperate catch-up to the single-chip Riva TNT and its multi-texturing capability.

With Voodoo3/Banshee, it seemed 3Dfx had been focusing on the 2D core to compete with NVIDIA/ATI/Matrox in the most financially lucrative PC OEM market, a market dominated by NVIDIA and ATI. S3 used to dominate that market, but their failed vision of the PC 3D future and NVIDIA's aggressiveness pushed them out of the scene without mercy. Voodoo3/Banshee clung to the legacy of Voodoo, but the competition had caught up -- Rage128, TNT2, G400, Savage4. Voodoo3/Banshee remained great at what it did best: 16-bit color, 256x256 textures and Glide compatibility. But with 3Dfx clinging to their legacy for too long, the competition knew very well where to push the market against them.

Voodoo4 and beyond were utter crap, outside of being the pricey collectibles they are now.

Reply 175 of 183, by kjliew

Rank Oldbie
Putas wrote on 2021-07-30, 20:30:

Rage 128 should have a honourable place as 32-bit enabler.

I would give that honor to the Matrox G400. The Matrox G400 MAX outperformed the Riva TNT2 Ultra at 32-bit color. For a short period of time, it was the highest-performance 3D gaming card money could buy for 32-bit color. Matrox marketing cleverly maneuvered out of the Voodoo 3 competition simply by ignoring 16-bit performance; they probably furnished hardware review sites with materials and game demos focused on 32-bit color only.

NVIDIA wasn't all that strong a believer in 32-bit color either, as their TNT2/GeForce competed well against the Voodoo3 in 16-bit. Even later, when ATI claimed 32-bit color was free on the Radeon series, NVIDIA countered with the claim that the ATI Radeon wasn't doing 16-bit color right. Market leader though NVIDIA may have been, they did stumble at times in their vision of 3D technology's future, but they were able to make a comeback against the competition; unified shaders were one example.

Reply 176 of 183, by Joakim

Rank Oldbie

I agree Voodoo 1 was amazing, a real game changer. Voodoo 2... I know my family made the upgrade (because I still have the card), but I can't remember it being anything special at the time, or what computer it was paired with. We never got another Voodoo after that; the next card was a GeForce, from what I can remember.

Tbh, and with all due respect, I think the question in the topic is strange. A card that is over 20 years old is not "good". It was good, and at best it is very compatible with games from that era. Hunting 3DMark scores might be fun if that floats your boat, but the point for me is game compatibility.

Reply 177 of 183, by bloodem

Rank Oldbie
Joakim wrote on 2021-07-30, 21:39:

Tbh, and with all due respect, I think the question in the topic is strange. A card that is over 20 years old is not "good". It was good, and at best it is very compatible with games from that era.

Uhm... the question is not 'strange', this is Marvin after all. 😀
All discussions and all questions are obviously about old hardware. Also, any adjectives that might be used revolve around the same paradigm.

Joseph_Joestar wrote on 2021-07-30, 20:21:

I had a TNT2 (not Ultra) back in the day, and using 32-bit color in any resolution above 640x480 incurred a non-trivial performance hit.
Contemporary benchmarks like this one showcase it quite well.

That is correct, but what people forget is that, even with the performance hit, most games were still extremely playable for the time (30+ FPS) even at 1024 x 768 x 32.
And then there were the games that were inherently capped at 30 FPS. So with those you got the better quality without the performance hit. Of course, those games were never used in any benchmarks... In fact, IMO, the benchmarks/reviews of the time were extremely lackluster, they mostly tested one or two popular games and... that was it.

And now, the real question: what made you buy a Riva TNT2 instead of a Voodoo 3? 😁
I myself upgraded my ATI Rage IIC to a Riva TNT2 M64 in the year 2000. Even though that card was... pretty bad, for me that still counts as the biggest jump in visual quality and performance that I've ever had with an upgrade. 😀

All in all, as mentioned before, 16-bit vs 32-bit is just a small part of the problem. I myself think that the lack of high-resolution texture support was the much bigger issue with the Voodoo 3 cards. In fact, all these things combined add up to the actual fundamental problem: lack of innovation and trying to milk the cow based on past success.

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 178 of 183, by Joseph_Joestar

Rank l33t
bloodem wrote on 2021-07-31, 05:43:

That is correct, but what people forget is that, even with the performance hit, most games were still extremely playable for the time (30+ FPS) even at 1024 x 768 x 32.

Yup. I'm pretty sure I played Tomb Raider 2 at 800x600 with 32-bit colors back then. Quake 2 as well, but at 640x480. It was mostly in newer, more demanding titles like UT '99 and Star Trek Voyager: Elite Force that using 32-bit color depth became troublesome for me. But those were the games I was most interested in at the time.

And now, the real question: what made you buy a Riva TNT2 instead of a Voodoo 3? 😁

It was early 2000 and the Voodoo3 was already a year old. Also, my local computer magazines praised the TNT2 to high heavens, stating it was a "moderately priced but future proof solution". But after seeing how UT ran, I knew I should have gotten a Voodoo3 instead. Heck, my buddy with an S3 Savage2000 had frame rates comparable to my TNT2, but the game looked much nicer for him due to S3 Metal.

However, I did get a Voodoo3 sometime in 2002 for like 30 EUR. Mostly out of curiosity since I never had a 3DFX card before that. I didn't use it much back then, but I am getting a lot of mileage out of it now. Glad I kept it all these years.

I myself upgraded my ATI Rage IIC to a Riva TNT2 M64 in the year 2000.

I remember being blown away by the performance difference between my old Trident Blade 3D and the TNT2. In hindsight, that wasn't much of an achievement, since the Blade 3D was a super cheap, entry-level card.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi