VOGONS


Are Voodoo graphics cards THAT good?


Reply 140 of 183, by 386SX

Rank l33t

I remember newspaper component prices, and anyway they differed a lot from real store prices most of the time. Obviously each store sets its own price in the end.
About review quality, oh well, I don't think things are any better nowadays. Some reviews are ultra simplified and give so little detail that it reads like the writer didn't know the subject, while others are so complicated they're barely readable, without really giving you a chance to "understand" the new features, tech, etc. OK, nowadays GPUs, for example, are more complicated than in the late 90's, but back then you could almost understand how they worked without being an engineer working on the chip, just by striking the right balance between in-depth detail and simplified explanation. I still prefer the old style of hardware reviews.

Reply 142 of 183, by Shishkebarbarian

Rank Newbie
386SX wrote on 2021-05-11, 13:33:
appiah4 wrote on 2021-05-11, 13:27:

GeForce2 MX was my card after the Voodoo 3 as well, although mine was the MX400. One year later I upgraded to a GeForce 2 Ti and passed my MX to my sister. She played many hours of Dungeon Keeper 2 on it.

After the Geforce2 MX I remember reading a lot about the Kyro II and what it meant as a new player on the market, so I had to have one once they got a bit cheaper, and I still own that one too. Old 16-bit games really could look like 32-bit ones, and the best example was Thief II. In that game's dark environments it was immediately visible how every light kept all its colors, with none of the classic 16-bit "problems". Not that it was that important at the time, but it was still impressive, along with their tech itself. I had already gone for the Athlon 1100 too, and then the Athlon XP 1800+. Later cards were the Radeon 8500 LE and the Radeon 9500 Pro "L" memory. 😁
But my upgrade passion was about to end with the switch to x64 tech. The last modern PC I upgraded was an Athlon 64 3500 with a Radeon X1800 XL. It was a fast mid-range machine when I built it, and yet many games had trouble running smoothly at high detail... I soon sold that config and switched to old notebooks, then netbooks, then collecting retro hardware, and still nowadays I use retro hardware as my daily machine. 😀

I liked reading your history of upgrades here. It follows mine pretty closely (386, P133, P450, XP1700+, Athlon 64 3500). I have since had a Phenom II and an i5, and am now building a frankenstein i7. So did you completely abandon modern games? I agree that new tech is expensive and 'middle-end' isn't what it used to be. I'm building around a buddy's old GTX970, which came out in 2014 and is still $250ish second hand. I mainly function in the "retro" 1993-2002 sphere, but modern games are pretty great too and worth a try here and there.

My GPU history is Voodoo 3 3000, GFIII Ti 200, GF4 Ti 4200, <some forgettable Radeon Card>, Geforce 7800 GT, Geforce 9600 GT, and then basically nothing, as I more or less quit modern PC upgrading. I'm now working on a "new" PC with parts from my various friends' old builds (i7, GTX970).

appiah4 wrote on 2021-05-11, 13:36:

From the GF2 Ultra I also went on to the Radeon 8500LE, and from that I moved on to 9600PRO, the 9800SE (softmodded to PRO), X1600PRO, HD3850, HD4850, HD7700 and now an RX480 8GB.

Go #TeamRed

Go #TeamGreen! =D

Reply 143 of 183, by Shishkebarbarian

Rank Newbie
rasz_pl wrote on 2021-05-11, 17:06:
bloodem wrote on 2021-05-11, 11:49:

$10 for a pizza before the turn of the century, in Eastern Europe (I presume Poland by your nickname)? Damn, that must've been quite a large Pizza. 😁

Now that you said it, it sure looked weird, so I googled 😀
Pizza Hut 1994 flyer
Exchange rate 1:23000, average monthly salary in 1994 was 5-6 million, i.e. $220-270.
large supreme + Coleslaw + bottle of cola = $10, and where is my garlic bread? 🙁

Oh my, that was one expensive pizza, 4% of the average monthly income!!! Hard to imagine it was that popular.

Reply 144 of 183, by Shishkebarbarian

Rank Newbie
bloodem wrote on 2021-05-11, 18:10:

No, my hometown was Galați, Romania.

I've been reading through this whole thread, and the last couple of pages have been so much fun! They took me back to the 90s and all the fun conversations I had in the school yard. I actually guessed you were from Romania based on your earlier hint. Odessa, Ukraine reporting in, though by that time we had already emigrated to "Little Odessa" in the USA.

Reply 145 of 183, by 386SX

Rank l33t
Shishkebarbarian wrote on 2021-07-29, 14:02:

I liked reading your history of upgrades here. It follows mine pretty closely (386, P133, P450, XP1700+, Athlon 64 3500). I have since had a Phenom II and an i5, and am now building a frankenstein i7. So did you completely abandon modern games? I agree that new tech is expensive and 'middle-end' isn't what it used to be. I'm building around a buddy's old GTX970, which came out in 2014 and is still $250ish second hand. I mainly function in the "retro" 1993-2002 sphere, but modern games are pretty great too and worth a try here and there.

My GPU history is Voodoo 3 3000, GFIII Ti 200, GF4 Ti 4200, <some forgettable Radeon Card>, Geforce 7800 GT, Geforce 9600 GT, and then basically nothing, as I more or less quit modern PC upgrading. I'm now working on a "new" PC with parts from my various friends' old builds (i7, GTX970).

Lately I've built this Core2 E8600 dual-core 3.3 GHz machine with 8GB DDR3 and an SSD. Considering how old the config is, I still find it more than usable for most tasks, so I was thinking: what if I upgrade the GPU (right now a GT610) to something modern, plus a Q9650 CPU, to make it run sort of like a low-end modern machine? More for tech demos and benchmarks than games, but maybe some cheaper games might still be interesting to run.
So I was looking at prices for older modern GPUs, but obviously, after decades, the moment I decide to upgrade something turns out to be probably the worst period in history to buy a GPU (?). Even R7/R9 GPUs have prices that should make someone laugh considering how old they are, and the same happens with older GTX cards. So I'd like to push this config to the maximum with Win 8, considering I have low gaming expectations and 1080p@30fps is already a great result for me. I finished old games at 20 fps, and Doom on the Am386DX-40 at even lower fps... so it might make some sense, even if most people suggest upgrading the other components first.
But at that point I'd spend a lot and still have an integrated GPU that I suppose wouldn't be much faster.

Reply 146 of 183, by silikone

Rank Member

No matter how good Voodoo's color depth filter is, it will never make up for its lack of quality texture formats. In fact, the better the output quality, the more quantized textures are going to stick out.
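
To put that in concrete terms, here is a minimal toy sketch (my own illustration, not anything from actual 3dfx hardware or drivers) of what storing textures in a 16-bit format like RGB565 does to a smooth gradient: nearby shades collapse into the same bucket, and no amount of clean output filtering can pull them apart again.

```c
#include <stdio.h>
#include <stdint.h>

/* Pack a 24-bit color into RGB565, the kind of 16-bit texture
 * format the early Voodoos were limited to. */
static uint16_t pack565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Expand a 5-bit channel back to 8 bits for display. */
static uint8_t expand5(uint16_t v) { return (uint8_t)(v * 255 / 31); }

int main(void)
{
    /* Two adjacent shades from a smooth 24-bit red gradient... */
    uint8_t r1 = 100, r2 = 103;
    uint16_t t1 = pack565(r1, 0, 0), t2 = pack565(r2, 0, 0);
    /* ...land in the same 5-bit bucket: a visible band. */
    printf("%u -> %u, %u -> %u\n",
           r1, expand5((t1 >> 11) & 0x1F),
           r2, expand5((t2 >> 11) & 0x1F));
    return 0;
}
```

That prints "100 -> 98, 103 -> 98": the distinction between the two texels is gone before rasterization even starts.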

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 147 of 183, by leileilol

Rank l33t++

"32-bit is a scam" are some famous last words.

One could post 3dm01 clouds, JK2 logos/JA broken videos, unreadable Max Payne panels, any UE2 game, many lith/idtech3 games, etc. All fun stuff to compare with those "32-bit scam" cards of 1999. 😀

You know what IS a scam? "22-bit" 😉 filtering the dithering matrix down with visual loss doesn't make more bits of visual information. If you really want the best 16-bit image output, you'd get a PowerVR card 😀
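
A toy illustration of the point (just the general idea, not the actual hardware filter): dither a flat shade into coarse 5-bit steps, then box-filter it. The flat area averages back toward the original, but the same blur would smear any genuine single-pixel detail just as happily, so no bits of information are recovered.

```c
#include <stdio.h>

int main(void)
{
    /* A flat 8-bit shade of 100, dithered into 5-bit steps of 8:
     * alternating 96 and 104 approximates it on average. */
    int dithered[8] = {96, 104, 96, 104, 96, 104, 96, 104};

    /* A 3-tap box filter, the general spirit of a post-filter. */
    for (int i = 1; i < 7; i++) {
        int filtered = (dithered[i - 1] + dithered[i] + dithered[i + 1]) / 3;
        printf("pixel %d: raw %d, filtered %d\n", i, dithered[i], filtered);
    }
    return 0;
}
```

The filtered values hover around 98-101 instead of 96/104, which looks smoother, but a real one-pixel bright detail fed through the same kernel would be averaged away exactly the same.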

long live PCem

Reply 149 of 183, by bloodem

Rank Oldbie
leileilol wrote on 2021-07-29, 19:43:

"32-bit is a scam" are some famous last words.

One could post 3dm01 clouds, JK2 logos/JA broken videos, unreadable Max Payne panels, any UE2 game, many lith/idtech3 games, etc. All fun stuff to compare with those "32-bit scam" cards of 1999. 😀

You know what IS a scam? "22-bit" 😉 filtering the dithering matrix down with visual loss doesn't make more bits of visual information. If you really want the best 16-bit image output, you'd get a PowerVR card 😀

Yeah, 32 bit was definitely not a scam. Just like ray tracing now, it wasn't really usable with the first generations of cards that supported it. However, in a matter of just a few years (especially after the GeForce 3 / Radeon 8500 were released), 32bit would become the de-facto standard. Before that, the performance hit when going from 16 bit to 32 bit had been pretty high, but once these cards launched... the performance drop became negligible.

And the lack of 32 bit support was not even the worst thing: the lack of high resolution texture support (before the VSA-100) was an even bigger issue for 3dfx, IMO.
By early 2000, there were already a few popular games that had high resolution texture support (Quake 3, Need for Speed 5 Porsche, Expendable), and I remember seeing side by side image quality comparisons between the GeForce 256 and the Voodoo 3... it was just a joke! It didn't even seem fair to compare framerates between the two, given how bad the image quality looked on the Voodoo 3 - and yet, still the GeForce 256 was the much faster card. 😀

3dfx committed suicide because of very bad management (STB acquisition) + lack of true innovation (well, I guess this also counts as bad management). The Voodoo 3 is basically a souped-up Banshee, the Banshee is a more integrated/cheaper Voodoo 2 with higher clocks, and the Voodoo 2 is nothing more than a direct evolution from Voodoo 1 (mostly benefiting from faster memory, the process node shrink which enabled higher clocks + the addition of a secondary TMU).
The first few generations of Voodoo cards were truly impressive, but relying on the same technology for far too long (especially in a time of fierce competition) can't be good for (future) business.
Somehow 3dfx only thought about the present, not the future. So we do need to give credit where credit is due: nVIDIA and ATI were a lot smarter in this regard.

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 150 of 183, by Joseph_Joestar

Rank l33t
leileilol wrote on 2021-07-29, 19:43:

If you really want the best 16-bit image output, you'd get a PowerVR card 😀

Do Kyro cards support paletted textures and table fog?

You got me kinda curious about using them for 16-bit only Win9x games.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 151 of 183, by Putas

Rank Oldbie
bloodem wrote on 2021-07-30, 06:21:

Somehow 3dfx only thought about the present, not the future.

I wouldn't say so. Rather, they cared so much about having the fastest chips that it wrecked their execution.

Reply 152 of 183, by rasz_pl

Rank l33t
bloodem wrote on 2021-07-30, 06:21:

Yeah, 32 bit was definitely not a scam. Just like ray tracing now, it wasn't really usable with the first generations of cards that supported it. However, in a matter of just a few years (especially after the GeForce 3 / Radeon 8500 were released), 32bit would become the de-facto standard. Before that, the performance hit when going from 16 bit to 32 bit had been pretty high, but once these cards launched... the performance drop became negligible.

so it "only" took nvidia 4 generations to deliver practical 32bit 😉

bloodem wrote on 2021-07-30, 06:21:

And the lack of 32 bit support was not even the worst thing: the lack of high resolution texture support (before the VSA-100) was an even bigger issue for 3dfx, IMO.
By early 2000, there were already a few popular games that had high resolution texture support (Quake 3, Need for Speed 5 Porsche, Expendable), and I remember seeing side by side image quality comparisons between the GeForce 256 and the Voodoo 3... it was just a joke! It didn't even seem fair to compare framerates between the two, given how bad the image quality looked on the Voodoo 3 - and yet, still the GeForce 256 was the much faster card. 😀

https://images.anandtech.com/old/video/voodoo … extures-ati.jpg
https://images.anandtech.com/old/video/voodoo … textures-v3.jpg

Undeniable. Still, this seems kind of weird considering those images are below the 256x256 3dfx limit.

I seem to recall reading about hacked drivers (Amigamerlin?) emulating bigger texture support by subdividing geometry in flight, akin to what some PlayStation 1 games did to mask affine texture distortion. It seems it was just an idea thrown around, but never implemented.
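
Sketched out, it would look roughly like this (hypothetical code with illustrative names, since by all accounts it never got past being an idea): any primitive whose texture footprint exceeds the hardware limit gets split into sub-primitives, each sampling its own 256x256 tile with renormalized coordinates.

```c
#include <stdio.h>

#define TILE 256      /* largest texture dimension Voodoo 1-3 accept */
#define TEXSIZE 512   /* size of the "oversized" source texture      */

typedef struct { float x, y, u, v; } Vertex; /* screen pos + texcoords */

/* Emit one screen-space sub-quad that samples tile (tx, ty) of the big
 * texture, with tile-local UVs renormalized into the 0..TILE range. */
static void emit_subquad(float x, float y, float w, float h, int tx, int ty)
{
    Vertex quad[4] = {
        { x,     y,     0.0f, 0.0f },
        { x + w, y,     TILE, 0.0f },
        { x + w, y + h, TILE, TILE },
        { x,     y + h, 0.0f, TILE },
    };
    printf("tile (%d,%d): sub-quad at (%.0f,%.0f), UVs 0..%d\n",
           tx, ty, quad[0].x, quad[0].y, TILE);
}

int main(void)
{
    const float qw = 400.0f, qh = 400.0f;  /* full quad on screen */
    const int n = TEXSIZE / TILE;          /* 2x2 subdivision     */

    /* One draw becomes n*n smaller ones, each within hardware limits. */
    for (int ty = 0; ty < n; ty++)
        for (int tx = 0; tx < n; tx++)
            emit_subquad(tx * qw / n, ty * qh / n, qw / n, qh / n, tx, ty);
    return 0;
}
```

The cost is obvious: every oversized surface multiplies its primitive count, plus the driver has to keep the tiles resident, which is probably why it stayed an idea.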
In many ways 3dfx is similar to Commodore: both kept innovation to the bare minimum possible, if not less.

Open Source AT&T Globalyst/NCR/FIC 486-GAC-2 proprietary Cache Module reproduction

Reply 153 of 183, by appiah4

Rank l33t++
rasz_pl wrote on 2021-07-30, 07:30:
bloodem wrote on 2021-07-30, 06:21:

Yeah, 32 bit was definitely not a scam. Just like ray tracing now, it wasn't really usable with the first generations of cards that supported it. However, in a matter of just a few years (especially after the GeForce 3 / Radeon 8500 were released), 32bit would become the de-facto standard. Before that, the performance hit when going from 16 bit to 32 bit had been pretty high, but once these cards launched... the performance drop became negligible.

so it "only" took nvidia 4 generations to deliver practical 32bit 😉

I keep pointing this out and nobody listens. You really need perspective on how 32-bit performed until, say, the GeForce2. It was basically a bullet-point feature and a bragging right. Yes, it looked better, but it performed like ass.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 154 of 183, by Carrera

Rank Member

I beta tested for 3Dfx and had regular contact with the engineers.
I can pretty much confirm most of what is being said here.
3Dfx missed the 32 bit train, though to be honest it was a little silly in the beginning, because while some cards could do 32 bit, a lot of people stuck to 16 bit as it had a better frame rate. Posting screenshots in 32 bit but only getting 20-30 fps was a bit odd in my mind.
I also failed to see why a lower bit depth would harm your enjoyment of the game. I mean, a red Ferrari is cooler, but is it really a deal breaker when it is yellow instead?

At some point though it did become a standard, and 3Dfx not being able to keep up was a death knell.

They were hemorrhaging money like crazy. I know for a fact that lunches costing 5 figures were not uncommon.

The STB thing could have worked if they had had the tech side of things cornered, but like all things driven by greedy board members, they thought it would work out quickly and they could catch up. But innovation is a sly mistress...
I was mostly working on the multimedia stuff: Voodoo TV (remember that?) and improving the 3500 drivers.
In some ways they were a little ahead of the curve, because their capture capabilities were not too bad and people would have liked sharing recorded stuff *cough* legally *cough*, but when they refused to acknowledge a major bug in the drivers, I knew something was up...

In many ways, 3Dfx was just a precursor to what happened in tech at large. It just seems to hurt more when it's the pioneer that goes down first...

Reply 155 of 183, by bloodem

Rank Oldbie
rasz_pl wrote on 2021-07-30, 07:30:

so it "only" took nvidia 4 generations to deliver practical 32bit 😉

Yeah. I would say that GeForce 2 GTS (or even the GeForce 256) was the turning point.
Although these cards still suffered from a 30 - 40% performance hit when going from 16 bit to 32 bit, their raw power still allowed for extremely playable performance at very high resolutions (1024 x 768 x 32 or even 1600 x 1200 x 32).
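
Some napkin math on that hit, with assumed numbers (2x overdraw, a 60 fps target, 16-bit Z in both cases, texture traffic ignored), so treat it as illustrative rather than measured: color buffer traffic doubles while Z traffic stays the same, which lands right around that 30 - 40% figure.

```c
#include <stdio.h>

int main(void)
{
    const double pixels   = 1024.0 * 768.0; /* frame resolution    */
    const double fps      = 60.0;           /* target frame rate   */
    const double overdraw = 2.0;            /* assumed scene depth */

    /* Bytes touched per pixel: color write + Z read + Z write.
     * Z stays 16-bit in both cases; only the color size changes. */
    const double bytes16 = 2 + 2 + 2;
    const double bytes32 = 4 + 2 + 2;

    double mpix = pixels * overdraw * fps / (1024.0 * 1024.0);
    printf("16-bit: %.0f MB/s, 32-bit: %.0f MB/s (+%.0f%%)\n",
           mpix * bytes16, mpix * bytes32,
           (bytes32 / bytes16 - 1.0) * 100.0);
    return 0;
}
```

On those assumptions that works out to roughly 540 MB/s vs 720 MB/s, i.e. about a third more memory traffic for the same frame when the card is bandwidth-bound.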

However, even with cards from previous generations, like the TNT2, there were still quite a few games where 32 bit was perfectly usable (like Quake 2 and games based on the Q2 engine, which were perfectly playable at 800 x 600 x 32 or even 1024 x 768 x 32, not to mention games like the Tomb Raider series and many others like them, which were capped at 30 FPS anyway, so at 1024 x 768 x 32 you would get the exact same performance but with a much better image quality).
Also, let's not forget that, overall, "playable performance" in the late 90s is not comparable to "playable performance" in 2021. I would have killed to play games even at 20+ FPS in 1999, but my ATI Rage IIC... kind of disagreed with me on that 😀.
The whole idea was that, if someone wanted to sacrifice performance for better image quality, they had the option to do so with an nVIDIA/ATI card. And in certain cases, they didn't even have to sacrifice any performance, they just got the better image quality out of the box.

Voodoo cards, on the other hand, had their own advantages (which I think I already mentioned in this thread), so that many people with weaker CPUs were able to enjoy games that they would have never been able to (decently) play with an nVIDIA or ATI card. That's why, to this day, people love them so much (including me).

Carrera wrote on 2021-07-30, 08:17:

3Dfx missed the 32 bit train, though to be honest it was a little silly in the beginning, because while some cards could do 32 bit, a lot of people stuck to 16 bit as it had a better frame rate. Posting screenshots in 32 bit but only getting 20-30 fps was a bit odd in my mind.

Everything seems silly in the beginning; that's how technological evolution has always worked. Bikes & cars seemed silly when they were invented. Even movies seemed silly. 😀 And the list can go on...

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 156 of 183, by appiah4

Rank l33t++
bloodem wrote on 2021-07-30, 08:21:

Everything seems silly in the beginning; that's how technological evolution has always worked. Bikes & cars seemed silly when they were invented. Even movies seemed silly. 😀 And the list can go on...

Some things seem silly until the end though. Like motion controls and VR goggles.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 157 of 183, by bloodem

Rank Oldbie
appiah4 wrote on 2021-07-30, 08:40:

Some things seem silly until the end though. Like motion controls and VR goggles.

Also true 🤣

I remember all the negative reviews regarding the first generation of nVIDIA RTX cards, and how everyone called Ray Tracing "just a gimmick that kills performance"...
Now, even though I didn't care about Ray Tracing in any way, I did end up buying an RTX 2060 (simply because my GTX 760 was long overdue for an upgrade). And boy, was I lucky to do that and avoid all the inflated pricing that we have today.

Well, as it turns out, even with the performance penalty, Ray Tracing has been perfectly usable for me with this card, especially with DLSS 2.0. Finishing Metro Exodus Enhanced Edition on Ultra quality with RT & DLSS has been thoroughly enjoyable (at 50 - 60 FPS). Same with Doom Eternal, Control, and a few others. It's not worth enabling in every game, it's not usable in every game (*cough* Cyberpunk *cough*), but it's still great in some.

Anyway, I digress... 😀

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 158 of 183, by 386SX

Rank l33t
bloodem wrote on 2021-07-30, 06:21:
leileilol wrote on 2021-07-29, 19:43:

"32-bit is a scam" are some famous last words.

One could post 3dm01 clouds, JK2 logos/JA broken videos, unreadable Max Payne panels, any UE2 game, many lith/idtech3 games, etc. All fun stuff to compare with those "32-bit scam" cards of 1999. 😀

You know what IS a scam? "22-bit" 😉 filtering the dithering matrix down with visual loss doesn't make more bits of visual information. If you really want the best 16-bit image output, you'd get a PowerVR card 😀

Yeah, 32 bit was definitely not a scam. Just like ray tracing now, it wasn't really usable with the first generations of cards that supported it. However, in a matter of just a few years (especially after the GeForce 3 / Radeon 8500 were released), 32bit would become the de-facto standard. Before that, the performance hit when going from 16 bit to 32 bit had been pretty high, but once these cards launched... the performance drop became negligible.

And the lack of 32 bit support was not even the worst thing: the lack of high resolution texture support (before the VSA-100) was an even bigger issue for 3dfx, IMO.
By early 2000, there were already a few popular games that had high resolution texture support (Quake 3, Need for Speed 5 Porsche, Expendable), and I remember seeing side by side image quality comparisons between the GeForce 256 and the Voodoo 3... it was just a joke! It didn't even seem fair to compare framerates between the two, given how bad the image quality looked on the Voodoo 3 - and yet, still the GeForce 256 was the much faster card. 😀

3dfx committed suicide because of very bad management (STB acquisition) + lack of true innovation (well, I guess this also counts as bad management). The Voodoo 3 is basically a souped-up Banshee, the Banshee is a more integrated/cheaper Voodoo 2 with higher clocks, and the Voodoo 2 is nothing more than a direct evolution from Voodoo 1 (mostly benefiting from faster memory, the process node shrink which enabled higher clocks + the addition of a secondary TMU).
The first few generations of Voodoo cards were truly impressive, but relying on the same technology for far too long (especially in a time of fierce competition) can't be good for (future) business.
Somehow 3dfx only thought about the present, not the future. So we do need to give credit where credit is due: nVIDIA and ATI were a lot smarter in this regard.

Similar to what I think might have happened in those times. About the whole 32-bit frame rate problem: IMHO that wasn't the point; the point was that the feature had to be there at that time. I understand that for a while the company might have been convinced that their 16-bit rendering was better than everyone else's, but the whole tech world was going in THAT direction, game developers too, no matter how useless it was at the beginning. Even in the review graphs above we can see a speed hit of more or less 10-20 fps on the 32-bit cards, hardly something impossible to accept.
Also, people were used to playing games even @ 20 fps. More importantly, it was just like T&L: something that showed the world how fast and technologically impressive video cards were becoming on paper, following a standard API, and at that point it was all about DirectX, thanks also to the impact 3DMark had, IMHO. If they had "a reason" not to go for 32-bit rendering, they should have given consumers something more than just frame rate and Glide compatibility, which was already at the end of its time. But by that point competitors had already filled their chips with those other features, and I can't imagine how, from a marketing point of view, an answer like "we still have a similar Banshee chip, just much, much faster, and not much more" could have worked on consumers. ATi, for its part, designed the best solution for video decoding, something it had been developing for quite some time, since the MPEG1 days, and at the same time it didn't focus only on that, but tried, chip after chip, driver after driver, to release better chips every time. The Rage 128 Pro was quite a good one, and the Radeon was a great chip with all their knowledge in one huge package.
The VSA-100 partially solved what the V3 was missing, but when? As has already been said, a single VSA-100 arriving late to compete against the GeForce2 MX or an R100 SDR was painful even to read about in reviews...

Last edited by 386SX on 2021-07-30, 10:52. Edited 1 time in total.

Reply 159 of 183, by appiah4

Rank l33t++

Nobody's defending 3dfx's complacency; however, the feasibility of 32-bit color gaming was a non-issue until the GeForce 2 onwards. The fact is, 3dfx did not lose because the Voodoo 3 lacked 32-bit rendering. No, on the contrary, the Voodoo 3 sold by the boatload against the TNT2/Rage128. They lost because they showed up with nothing against the GeForce/Radeon, and the Voodoo 4/5 were too late and too slow compared to the GeForce 2. Lacking 32-bit on the Voodoo 3 had nothing to do with it whatsoever.

Retronautics: A digital gallery of my retro computers, hardware and projects.