VOGONS

First post, by athlon-power

Rank: Member

Not joking about this: I was looking up whether my specs checked out against the kind of computer I was building (for the most part, they do), and I ran into Gateway's ads for whatever computers they had at the time.

Multiple times, in different ads from different months, Gateway sells the TNT2 in higher-end setups and the Voodoo 3 in lower-end setups, and offers an "upgrade" from a Voodoo 3 to the TNT2.

https://books.google.com/books?id=SLTy_ ... mb&f=false

https://books.google.com/books?id=cr7PR ... mb&f=false

https://books.google.com/books?id=FWcSP ... mb&f=false

This is all Gateway, so maybe this was some idiotic concoction by their marketing team to sell the 32MB card as better because the number was bigger than 16MB. The problem I have with this is that, at least in games that support Glide, performance is much greater on the Voodoo 3 than on the TNT2. I have a TNT2 32MB, the real one with a 128-bit memory bus, and I want a Voodoo 3 3000 because of Glide. Glide reduces the CPU load significantly, freeing the CPU to do whatever the hell it wants, so boom, more performance. I'd hate to be the guy who was going to get one of the models with a Voodoo 3 in it and "upgraded" to the TNT2.

I get that the Voodoo 3 didn't have 32-bit color capability, but paying that performance cost just for true 32-bit color seems a bit unnecessary. I don't know, maybe I'm talking about things I know nothing about, but I just found this odd and thought I'd share, and maybe get some answers and/or figure out why I'm wrong on this.

Reply 1 of 26, by texterted

Rank: Newbie

There's a lot of "rose tinted" hype with 3dfx. Sure, they were good for a while but it was a short while!

For me the Voodoo 2 was their pinnacle. Later cards on the AGP format that didn't actually use the AGP bus kind of shows the desperation as the competition caught up and quickly surpassed them.

Just my 2p's worth.

Reply 2 of 26, by athlon-power

Rank: Member
texterted wrote on 2020-01-12, 23:59:

There's a lot of "rose tinted" hype with 3dfx. Sure, they were good for a while but it was a short while!

For me the Voodoo 2 was their pinnacle. Later cards on the AGP format that didn't actually use the AGP bus kind of shows the desperation as the competition caught up and quickly surpassed them.

Just my 2p's worth.

I can agree with that. From what I've seen, a PCI Voodoo 3 is no different in performance from its AGP counterpart, save for a small margin. What sounds ideal to me about the Voodoo 3 vs. the TNT2 is the lower FSB/CPU load, which has to count for something. The 32MB TNT2 is great, don't get me wrong (it's faster than even the TNT2 Pro 16MB I have), but I feel like I could really get some frames out of my current '99 build if I had a Voodoo 3 3000 or so slapped in there.

As far as the Voodoo 2s go, they are quite good, but for what I'd be looking for (at least right now), they wouldn't cut it. A single good Voodoo 3 will kick Voodoo 2 SLI ass if given the chance. I've never really wanted one, but I do want a Voodoo for my Pentium MMX rig and a Voodoo 3 for the Pentium III.

I may want one at some point if I start a build that could use one, but I don't think I'd ever go with SLI. The Voodoo 2 probably pairs really well with some of the earlier Pentium IIs, but once you get into 300MHz+ territory, the bottleneck becomes the Voodoo 2, not the CPU. I've seen a few videos on this, and it seems that, for the most part, the performance improvement with Voodoo 2 SLI stops at around a Pentium II 266.

Reply 3 of 26, by leileilol

Rank: l33t++

It probably won't be better in an MVP3 system, as VIA doesn't like real AGP parts; that's generally where 3dfx gets a pass for stability 😀

SLI's overrated and there's a lot of misinformation about its advantages, like the assumption that it grants 24MB of texture memory or so, when it's still effectively the same amount of texture memory as before, with more of a bandwidth problem, especially when buffers are read/written and textures swap/thrash. A Voodoo3 doesn't have that issue.
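The point above about SLI texture memory can be made concrete with some back-of-envelope arithmetic. This is a sketch with assumed, illustrative numbers (a 12MB Voodoo2 split into roughly 4MB of frame/depth buffers and 8MB of texture memory); the key fact is that every texture must be resident on every card in SLI, so texture capacity does not grow with a second card:

```python
# Illustrative sketch: SLI duplicates textures on each card, so adding a
# second card adds fill-rate/bandwidth but no texture capacity.
# The 12MB/4MB split below is an assumption for illustration only.
def sli_texture_capacity_mb(num_cards, vram_per_card_mb=12, buffers_mb=4):
    """Texture memory usable by the application, in MB.

    Frame/depth buffer work is divided across cards, but every texture
    must be duplicated on every card, so capacity does not scale.
    """
    per_card_texture = vram_per_card_mb - buffers_mb
    return per_card_texture  # identical whether num_cards is 1 or 2

print(sli_texture_capacity_mb(1))  # 8
print(sli_texture_capacity_mb(2))  # still 8, not 16 or 24
```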

by the way, DOSBox is not for running Windows 9x

Reply 5 of 26, by athlon-power

Rank: Member

From what I'm seeing, Voodoo 2 SLI only holds up against a Voodoo 3 2000; anything higher than that, and it starts falling behind. I can't imagine what a 3500 would do to it, but it wouldn't be pretty for the Voodoo 2s.

I think the moment 3DFX no longer had the market lead in 3D acceleration, they were doomed. A big reason they used to kick so much ass in the Voodoo and early Voodoo 2 eras is that nobody else could hold a candle to their performance, image quality, etc. (Voodoo vs. Riva 128, for example).

When nVidia got wise and started the TNT line, and ATi started gaining traction with the Rage 128, 3DFX was essentially dealing with 3D accelerators that were rapidly catching up in performance and then started showing superior image quality. That's not even considering S3, which eventually got slightly wise, or Matrox, or the tonnes of other video cards released, all of them kicking Voodoo and eventually Voodoo 2 ass, all while using non-proprietary APIs (or at least being compatible with non-proprietary APIs).

However, Glide still manages to squeeze out more performance with less VRAM than a TNT2 32MB, albeit with higher core and memory speeds. I'm not sure what would happen if a Voodoo 3 3500 went up against a TNT2 Ultra.

Like I've said before, though, even the mid-range TNT2 32MB I have can move, and it's the most powerful "'90s" video card I've ever owned. Of course it has a '90s architecture, but it was made in 2000, like every TNT2 I own except the M64, which was made in 2001. Bleh.

Reply 6 of 26, by kolderman

Rank: Member

IIRC it was occasionally faster than a 2000, slightly slower than a 3000, and not that far behind a 3500. I don't think there was a huge difference in V3 performance; the only difference was clock speed, after all, and they were basically a Voodoo 2 SLI on a single card.

Reply 7 of 26, by foil_fresh

Rank: Newbie

i think it's zoltan csar on youtube who has a tnt2/voodoo3/g400 (200?) comparison video covering a bunch of different games. some are better on the voodoo and some are better on the nvidia.

the tnt2 is definitely a better card overall due to 32-bit color, and in 1999 were there really many glide-exclusive games still being made? the extra 16mb of ram is also a clincher; sometimes high res is nice even at low fps.

im not fanboying nvidia or scolding 3dfx (i run a v3000 in my p3 system) im just looking at the facts.

if your business was selling a pc, and the main goal of selling that pc is to make the customer think theyve made the right choice, then you give them the prettiest looking graphics possible rather than the uglier option with better performance. in those days 30fps was acceptable, mostly. thats probably the reason for the tnt2 being the "top dog".

K6-2+ 550 / Riva 128 / HOT591-p / AWE64 / YMF744
P3 667 MHz / SY-7VBA / Voodoo 3000 / Diamond Monster MX300
Athlon XP 2200+ / SL-75FRN2-L / Radeon 9600 XT / Audigy Platinum eX
P4 3.2GHz / RIP MOBO / Geforce 6800 GT / Audigy

Reply 8 of 26, by DracoNihil

Rank: Oldbie

Didn't you need a TNT2 to even properly use D3DDrv in Unreal 1 at the time? Or was it some other card I'm thinking of? When it comes to renderer subsystems, GlideDrv was the only truly superb choice for Unreal 1, but SoftDrv could run in 32-bit colour mode if your video card (not necessarily the GPU) supported it, which made designing maps a tad strange given how differently GlideDrv renders things compared to SoftDrv.

Steam Profile
YouTube Channel
Seal of Nehahra

Reply 9 of 26, by leileilol

Rank: l33t++

There's also the different lighting/texturing behavior GlideDrv has between single- and multi-TMU Voodoo cards (Banshee and V1 are brighter), and the fact that D3DDrv adds more detail texture stages than GlideDrv. No stock renderer achieves Unreal graphics feature parity, so they could never be fairly compared anyway (despite what prideful 3dfx fans want you to believe).

by the way, DOSBox is not for running Windows 9x

Reply 10 of 26, by appiah4

Rank: l33t

The TNT2 offered 32-bit color depth at a snail's pace; the Voodoo 3 and TNT2 were on par at 16-bit color depth, but the Voodoo 3's selective filtering on the front buffer meant it had a much better picture. It was also compatible with a lot of Glide games the TNT2 had no chance of letting you play. Moreover, the Glide API allowed a lot of games, like Unreal, to run at respectable framerates on slower CPUs. Overall, the Voodoo 3 was the winner IMO; I bought it and never regretted it.

A500:Rev6|+512K|ACA500+|C1084S
i386:Am386SX25|4M|GD5402|ES688
i486:U5S33|8M|GD5428|YMF719
i586:P133|32M|T64V+/MX2|V1|CT3980/32M
i686:K6-2/500|256M|i740|V2/SLI|CT4520/32M
S370:P3-1200|384M|GF4-4200|MX300
S754:A3700+|2G|X800XTPE|SB0350

Reply 11 of 26, by Garrett W

Rank: Member

The TNT2 can be faster when paired with faster CPUs, especially parts like the Ultra. 32-bit rendering, more VRAM, and support for larger textures meant the card was more future-proof, although the lack of Glide support was obviously seen as a negative.

It really depends on the ads you saw, if the Voodoo 3 mentioned was the 2000 model and the TNT2 card was the Ultra model, then it sort of makes sense, doesn't it?

Reply 12 of 26, by appiah4

Rank: l33t
Garrett W wrote on 2020-01-13, 09:53:

TNT2 can be faster when used with faster CPUs, especially on parts like the Ultra. 32bit rendering, more VRAM, as well as support for larger textures meant that the card was more future-proof, although the lack of Glide support was obviously seen as a negative.

It really depends on the ads you saw, if the Voodoo 3 mentioned was the 2000 model and the TNT2 card was the Ultra model, then it sort of makes sense, doesn't it?

By the time CPUs that actually made 32-bit rendering mainstream were on the market, the GeForce 2 MX was out and the TNT2 was very, very old news. The TNT2 was almost never used as a 32-bit graphics card by anybody; it was a marketing tick box for nVidia that kind of worked in swaying public opinion. I had friends who used the TNT and TNT2 back in their heyday. On their miserable Celeron 333/366 systems, 32-bit was not even a possibility; they just bragged about it but never used it. On my PII-300, the Voodoo 3 flew.

A500:Rev6|+512K|ACA500+|C1084S
i386:Am386SX25|4M|GD5402|ES688
i486:U5S33|8M|GD5428|YMF719
i586:P133|32M|T64V+/MX2|V1|CT3980/32M
i686:K6-2/500|256M|i740|V2/SLI|CT4520/32M
S370:P3-1200|384M|GF4-4200|MX300
S754:A3700+|2G|X800XTPE|SB0350

Reply 13 of 26, by Garrett W

Rank: Member

32-bit rendering has little to do with CPU performance; it mostly comes down to the GPU itself. It is not merely a marketing tick as you suggest: other companies such as ATi and Matrox were also featuring 32-bit rendering on their cards. In fact, ATi's Rage 128 was pretty bad at 16-bit initially, and the Rage 128 Pro had hardware fixes AFAIR to solve this, although those cards took far less of a hit at 32-bit than competitors. 32-bit could be used in less demanding games and/or resolutions, and it did provide an improvement, although in most cases it was indeed not worth it IMO.

However, features such as this and larger texture support did make the card a little more future-proof. Games like MDK2 and Max Payne looked far better on hardware that actually supported larger textures; I mention these two because they went as far as to use large textures on loading screens, or on the comic strip cutscenes in the case of Max Payne, which look pretty bad on cards without this feature. Also, didn't Neverwinter Nights demand an OpenGL-compliant GPU with 32-bit rendering? I seem to remember being unable to play it at the time 🙁.

Sure, this was a few years down the line, and like you say, budget-oriented cards like the GF2 MX (and probably even faster cards like the GF4 MX) were out by then, but a lot of people must have gotten a little more mileage out of their cards this way.
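The performance hit of 32-bit rendering discussed above can be sketched with a crude bandwidth model (my own illustrative numbers, not from the thread): doubling bytes per pixel roughly doubles the framebuffer traffic per frame, which is why memory-bandwidth-limited cards of that era struggled at 32-bit:

```python
# Crude model of framebuffer traffic: color write + z read + z write
# per pixel per frame (accesses=3). Overdraw and texture fetches are
# ignored, so real numbers are higher; this only illustrates the
# 16-bit vs 32-bit ratio.
def framebuffer_traffic_mb_s(width, height, fps, bytes_per_pixel, accesses=3):
    return width * height * fps * bytes_per_pixel * accesses / 1e6

mb16 = framebuffer_traffic_mb_s(1024, 768, 60, 2)  # 16-bit color
mb32 = framebuffer_traffic_mb_s(1024, 768, 60, 4)  # 32-bit color
print(round(mb16), round(mb32))  # ~283 vs ~566 MB/s
```

Even this simplified model shows the color-depth traffic doubling, before counting the wider textures a 32-bit pipeline tends to pull in as well.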

Reply 14 of 26, by BinaryDemon

Rank: Member

It’s probably just a simple 32MB > 16MB marketing trick.

I’m not sure how close the TNT2 and Voodoo3 actually were, but I did go from a Voodoo3 to a GeForce back in the day, and that was a small but noticeable improvement in all the games I played.

Check out DOSBox Distro:

https://sites.google.com/site/dosboxdistro/

a lightweight Linux distro (tinycore) which boots off a usb flash drive and goes straight to DOSBox.

Make your dos retrogaming experience portable!

Reply 15 of 26, by cyclone3d

Rank: l33t
texterted wrote on 2020-01-12, 23:59:

There's a lot of "rose tinted" hype with 3dfx. Sure, they were good for a while but it was a short while!

For me the Voodoo 2 was their pinnacle. Later cards on the AGP format that didn't actually use the AGP bus kind of shows the desperation as the competition caught up and quickly surpassed them.

Just my 2p's worth.

The Voodoo 3 isn't the one that bombed; it was more the Voodoo 4/5 that wasn't up to the task. But the real reason 3dfx went down the tubes, from what I gather, was mismanagement.

As far as the AGP vs PCI thing goes, I did some testing a while back, for DOS anyway: with DOOM, the AGP version of the V3-2000 was way, way, way faster than the PCI V3-2000. It might be a different story if the PCI bus had been running at 66MHz like the AGP bus was.
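For context on the AGP vs PCI point, peak theoretical bus bandwidth is simple arithmetic (nominal spec figures; real sustained throughput on either bus is lower):

```python
# Peak theoretical bus bandwidth: clock (MHz) * bus width (bytes)
# * transfers per clock. Nominal figures only.
def bus_peak_mb_s(clock_mhz, bus_bytes=4, transfers_per_clock=1):
    return clock_mhz * bus_bytes * transfers_per_clock

pci_33 = bus_peak_mb_s(33.33)                         # ~133 MB/s
agp_1x = bus_peak_mb_s(66.66)                         # ~267 MB/s
agp_2x = bus_peak_mb_s(66.66, transfers_per_clock=2)  # ~533 MB/s
print(round(pci_33), round(agp_1x), round(agp_2x))
```

So a 66MHz PCI bus would indeed have matched AGP 1x on paper, which fits the "IF the PCI bus had been running at 66MHz" speculation above.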

Yamaha YMF modified setupds and drivers
Yamaha XG resource repository - updated November 27, 2018
Yamaha YMF7x4 Guide
AW744L II - YMF744 - AOpen Cobra Sound Card - Install SB-Link Header
Epstein didn't kill himself

Reply 16 of 26, by Unknown_K

Rank: Oldbie

3dfx went under because it was short on cash, which led it to buy a plant in Mexico so it could build its own cards (and then nobody else was using their chips anymore). The Voodoo 4/5 was late and expensive (64MB of RAM, but only 32MB usable per chip). The GeForce series is what killed 3dfx, along with game developers ditching Glide-only games. Gamers only care about speed and pretty graphics, so fast 32-bit video with bigger textures, no longer tied to Glide, sank 3dfx.

People loved Voodoo2 SLI because it was fast and you got 1024x768 instead of 800x600 with a single card.

Collector of old computers, hardware, and software

Reply 17 of 26, by chinny22

Rank: l33t

The Voodoo's main advantage was the Glide API, but that was starting to lose against D3D around, say, 1998.
Plus, Nvidia cards were more competitive, with just about every board company producing their own card vs 3dfx being limited to in-house production.

In 1998, when I was upgrading from our 486 to a Gateway P2 400, my friend who was up to date on tech recommended upgrading the basic video to a 16MB TNT. It wasn't stupidly expensive, and it meant I could play every game of the era.
He went with a Voodoo2, which meant he also needed a decent 2D card for the D3D titles anyway. He ended up selling the Voodoo on for an Nvidia card (can't remember which).

It's only now that I'm enjoying reliving games with 3dfx cards. It's like playing games on real hardware vs DOSBox; it just feels right, more than being the best way to do it.

Reply 18 of 26, by sliderider

Rank: l33t++

As soon as the original GeForce dropped, the writing was on the wall for many other card makers. The Voodoo 4/5 arriving late, and without the features and performance nVidia cards had, was the end for 3DFX. ATi was the only one that upped their game enough to remain competitive, and even they had a few stumbles along the way.

32-bit color wouldn't really be a thing until then; it was largely irrelevant when it first started to appear, before devs supported it more, and the performance hit to early cards that supported it was just too great. 3DFX would have still been a viable choice.

Reply 19 of 26, by 386SX

Rank: Oldbie

As far as I remember, in many discussions and reviews the TNT2 was considered a "better" card than the usual Voodoo3 2000/3000. The whole "16-bit color is enough" argument helped a bit, along with the Glide API already being in its last moments.
As others said, I'd say the Voodoo2 with the SLI option was their best solution; after that, everything felt like just buying time, though for what, nobody could say.
The Voodoo3 chip could have been the solution instead of the Voodoo2 itself, but there was not much more to it beside higher frequencies, nothing that could not have been done years before, like a Banshee chip. The VSA-100 should have been what the Voodoo3 was, and even that would not have been enough considering what was coming, if we think of the GeForce and Radeon chips. Maybe the big error was continuing to think the Glide API would be there forever, when it had already been argued for years that DirectX/OpenGL could be the future.
The competition was high, and the 3dfx brand alone wasn't enough to compete.

Last edited by 386SX on 2020-01-18, 12:58. Edited 1 time in total.