VOGONS


Worst video card ever, again


Reply 80 of 102, by Skyscraper

Rank: l33t
havli wrote:

Indeed, new drivers make very little difference in performance terms. I did the testing a few years ago and the result is that the 8800 GTX is 55% faster at 1600x1200 with 4xAA, 16xAF, or 39% faster at 1600x1200 with no AA, 16xAF.

This is an average of 17 games with pretty much zero CPU limitation - i5 2500K @ 4.5 GHz. More info here: http://hw-museum.cz/benchmark-3-1.php

Power consumption, on the other hand, is not that bad: 8800 GTX = 147W idle / 302W load, 2900 XT = 145W idle / 290W load.

We will see; I have never really trusted any benchmark results I haven't confirmed myself, I just need to find a cheap card. I have passed up on cheap HD 2900 XT cards before, so there is some psychological resistance to paying much at all in play. 😁

I skimmed through some reviews after sunaiac claimed that the HD 2900 XT wasn't that slow at release, and indeed, depending on the titles benchmarked and the resolution used, different reviewers managed to get very different results.

If it indeed turns out that, regardless of drivers and CPU, the HD 2900 XT generally gets beaten even by the HD 3870, then I would think the talk of broken MSAA hardware and other defects could be true, as the HD 2900 XT doesn't look bad at all on paper.

Many reviewers benched at 2560x1600, so I guess that's where AMD thought the card started showing its strength, even if it didn't always turn out that way, at least not back then.

The only AMD card from that generation I bought new was an HD 3850 for my spare computer used by friends for LAN play, and there was of course a reason for that. The GeForce 8800 series (all of them) was for sure the better performer in the game I played 99% of the time, World of Warcraft.

Last edited by Skyscraper on 2016-03-29, 11:56. Edited 2 times in total.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 81 of 102, by Putas

Rank: Oldbie
sunaiac wrote:
candle_86 wrote:

The HD 2900 XT consumed twice as much power as an 8800 GTX, you could cook a steak on it in 5 minutes, and it performed worse than the 8800 GTS 320.

Except in real life, it was 346W (2900) vs 314W (8800), and the 2900 was 26% faster than the 8800 GTS 320.
But why tell the truth when FUD is so much more interesting.

(http://www.hardware.fr/articles/671-1/ati-rad … hd-2900-xt.html)

Testing graphics card power consumption by running the pixel shader test from 3DMark is quite far from real life.

Reply 82 of 102, by candle_86

Rank: l33t
Skyscraper wrote:
havli wrote:

Indeed, new drivers make very little difference in performance terms. I did the testing a few years ago and the result is that the 8800 GTX is 55% faster at 1600x1200 with 4xAA, 16xAF, or 39% faster at 1600x1200 with no AA, 16xAF.

This is an average of 17 games with pretty much zero CPU limitation - i5 2500K @ 4.5 GHz. More info here: http://hw-museum.cz/benchmark-3-1.php

Power consumption, on the other hand, is not that bad: 8800 GTX = 147W idle / 302W load, 2900 XT = 145W idle / 290W load.

We will see; I have never really trusted any benchmark results I haven't confirmed myself, I just need to find a cheap card. I have passed up on cheap HD 2900 XT cards before, so there is some psychological resistance to paying much at all in play. 😁

I skimmed through some reviews after sunaiac claimed that the HD 2900 XT wasn't that slow at release, and indeed, depending on the titles benchmarked and the resolution used, different reviewers managed to get very different results.

If it indeed turns out that, regardless of drivers and CPU, the HD 2900 XT generally gets beaten even by the HD 3870, then I would think the talk of broken MSAA hardware and other defects could be true, as the HD 2900 XT doesn't look bad at all on paper.

Many reviewers benched at 2560x1600, so I guess that's where AMD thought the card started showing its strength, even if it didn't always turn out that way, at least not back then.

The only AMD card from that generation I bought new was an HD 3850 for my spare computer used by friends for LAN play, and there was of course a reason for that. The GeForce 8800 series (all of them) was for sure the better performer in the game I played 99% of the time, World of Warcraft.

R600 and all its derivatives had broken AA/AF; the hardware unit was defective, and instead of redesigning it, they offloaded that work to the shaders. Also, yes, 320 shaders vs 128 looks impressive, but it was a 5:1 ratio of simple to complex units. So in truth, for complex shading, which included AA and AF, they only had 64 available at any time, meaning they actually had fewer resources than the G80 to work with. But ATI did not expect the 8800 GTX to be what it was, and likely assumed that 64 complex units were enough.
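To put rough numbers on that claim, here is a back-of-the-envelope Python sketch. It only restates the 5:1 figure from the post (one complex-capable ALU per group of five) and the commonly quoted unit counts; it is not an architectural reference.

# Sketch of the shader-count argument above: assuming one complex-capable
# ALU per group of five R600 ALUs (the 5:1 ratio mentioned in the post).
r600_total_alus = 320
alus_per_group = 5
r600_complex_capable = r600_total_alus // alus_per_group   # 320 / 5 = 64
g80_scalar_units = 128
print(r600_complex_capable, g80_scalar_units)              # 64 vs 128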

Reply 83 of 102, by Skyscraper

Rank: l33t
candle_86 wrote:

R600 and all its derivatives had broken AA/AF; the hardware unit was defective, and instead of redesigning it, they offloaded that work to the shaders. Also, yes, 320 shaders vs 128 looks impressive, but it was a 5:1 ratio of simple to complex units. So in truth, for complex shading, which included AA and AF, they only had 64 available at any time, meaning they actually had fewer resources than the G80 to work with. But ATI did not expect the 8800 GTX to be what it was, and likely assumed that 64 complex units were enough.

That at least explains why the R600 seems to perform well at really high resolutions but loses more performance than the G80 when AA is enabled.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 84 of 102, by W.x.

Rank: Member

Hello, I would like to respond to the original "Worst video card ever" thread, because it is locked. I haven't read that whole thread, though, but I have to stand in defense of the FX 5200. It is not a bad card and definitely doesn't deserve the "worst card ever" title. At worst it is "below average".

I've figured out why there are so many mixed opinions, and it is not the fault of the FX 5200 itself but of the card vendors. They masked the fact that they had crippled it; Nvidia should never have allowed that, and should have required it to be stated that a card is 64-bit and not 128-bit. The vendors could exploit this and sell crippled cards that cost less to make. The worst offender at the time was definitely Gainward. They masked their products with professional-looking designs, boxes and names, like PowerPack! Pro/660 TV-DVI, but most of them were crippled, and you needed to know what to pick. I wasn't caught out by this marketing on an FX 5200, but it happened to me with an MX440: I just picked one of the GeForce4 MX440s, not the cheapest (MX420) but not too expensive either, as I didn't have much money in 2002. All those names looked so professional, so I just chose one of the cheaper MX440s, and of course it was the very thin 64-bit version, which I only discovered 15 years later, when I started to care about retro components, building old systems and remembering what cards I had. It was a nice red, thin card; I was surprised that I was buying a GeForce4 and such a small, tiny, lightweight card appeared out of the box. Well... I was scammed. But I didn't know much about hardware at the time and fell into this trap.

And that's exactly what happened with the FX 5200! Some cards are not only 64-bit instead of 128-bit, but even come with 333 MHz or 266 MHz DDR memory instead of the stock 400 MHz! Nvidia should definitely have allowed these only under a separate name, like FX 5200 LE or something. This is why many people, mostly young teenagers back in 2003 who didn't know much about hardware, just bought an FX 5200, were disappointed, bashed it hard, and claimed it was slower and worse than a GeForce4 MX440 or a GeForce2 Ti. Which is not true.

Now look at the table of FX 5200 cards on GPUZoo and find the Gainward ones. You can definitely see that they made most of the crippled versions, with a 64-bit bus and memory below 400 MHz.
https://www.gpuzoo.com/GPU-NVIDIA/GeForce_FX_5200.html

What is worse, those cards are often "thick" variants, so they look as large and complete as the 128-bit versions (usually the 128-bit versions were thick and the 64-bit versions were thin, PCB-wise). But Gainward masked them so they looked very good: often solid-state capacitors, a thick PCB, a nice design, and a pro-sounding name.
Like, this card doesn't look like the slowest, crappiest FX 5200 ever made, right?
https://www.cnet.com/products/gainward-fx-pow … fx-5200-128-mb/
But it is.
When we look at the specification, no matter that it is thick, looks so pro, and has such a name, according to GPUZoo it has 266 MHz memory.
And yes, I have one. I bought it as my first FX 5200, just to have one; I paid even twice as much as I later paid for other FX 5200s, and it is the worst FX 5200 ever made.
I didn't understand how it could have -75B on the memory chips when an FX 5200 should run at 400 MHz... but now I understand. -75B on the memory means 7.5 ns, which is rated for 266 MHz... 🤣 So I got caught by Gainward again, and again. It is so dangerous to buy cards from this company blindly. Out of so many companies, I blindly chose to buy a Gainward GeForce4 MX440 in 2003... and of course it felt so slow for its price. It shouldn't have; a non-crippled MX440 was decent.
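For anyone wondering how the speed grade on the chips maps to a clock, here is a quick Python sketch of the conversion described above; the -75B and -4 grades are the ones mentioned in this post, and the numbers are just the standard cycle-time arithmetic.

# Converting a DRAM speed grade (cycle time in ns) to its rated clock
# and effective DDR data rate.
def ddr_rating(cycle_time_ns):
    clock_mhz = 1000.0 / cycle_time_ns    # e.g. 7.5 ns -> ~133 MHz
    return clock_mhz, 2 * clock_mhz       # DDR transfers data twice per clock

print(ddr_rating(7.5))   # -75B chips: ~133 MHz clock, ~266 MHz effective DDR
print(ddr_rating(4.0))   # -4 chips:    250 MHz clock,  500 MHz effective DDR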

And it is the same story with the FX 5200.

Now look at the best version of the FX 5200: some versions had -4 ns memory, and although they are clocked at 400 MHz, you can be sure you'll get well over 500 MHz out of them. That was the way to go: buy a 128-bit version with -4 ns memory. They were usually only a tiny bit more expensive than the other versions.
The FX 5200 also had DX 8.1 support and was a good overclocker, as low-end cards usually can be overclocked quite a bit. So with the best FX 5200s you can usually get an overclock in the range of 300-315 MHz core and 550-580 MHz memory, which is just behind the FX 5200 Ultra. By the way, the FX 5200 Ultra was not such a good overclocker and often could not be raised by much. So with one of these FX 5200s overclocked, you were maybe 5-10% behind a stock FX 5200 Ultra, which is not bad for a budget card.

So this is what created so much controversy about the FX 5200 and why it was labelled by some people as the worst card ever. But it isn't; it was the cheating vendors and buyers' ignorance... also a lack of knowledge about overclocking. Someone who knows the details and picks a 128-bit FX 5200 with -4 ns memory could expect performance close to an FX 5200 Ultra, which is not bad for a budget, bottom-of-the-line card. I would say it was a good card. Not the best, but not bad or merely average. For the price, it was decent.

Now you see, companies like Gainward were definitely hiding the negative stuff, making the card and box look so professional while masking that most of their cards were 64-bit (and had 266 MHz memory instead of 400 MHz). I'm really sorry for the people who fell into this trap, buying a pro-looking, wide version of the FX 5200 that was actually the slowest possible variant, like this one:

"Gainward FX PowerPack! Pro/660 TV-DVI
Barcode: 471846200-8057
core: 250 MHz, 128 MB DDR memory at 266 MHz, 64-bit bus."

So I've finally solved the mystery of why I couldn't identify the memory on the first FX 5200 I got, the one with -75B at the end of the chip marking, which made no sense to me at first since I expected it to be 400 MHz... It is a Gainward. Problem solved. It looked so professional that I took it without knowing better; look at those solid-state capacitors. It looks like high-quality stuff, while you're actually getting the worst possible FX 5200. I think this is what's behind the group of people who bashed the FX 5200 so badly in its time and gave it its reputation as the "worst gaming card ever" and "slower than an MX440".

Now, I've found the second FX 5200 card that I have, an unimpressive-looking green PCB from Jetway. It is this card:
https://www.svethardware.cz/recenze-srovnani- … eon-9200/8991-2
except mine has -4 on the memory, so I hit the jackpot and have 4 ns memory on that card. Not sure why; probably a later revision.
But even with an overclock of only 275 MHz core / 500 MHz memory, we can already see how close to the FX 5200 Ultra this card gets (note: the FX 5200 Ultra is also overclocked):
https://www.svethardware.cz/recenze-srovnani- … eon-9200/8991-9

With 4 ns memory, if you get it to 550+ MHz and you also get lucky with the core, reaching 300 MHz, I would say the FX 5200 is a budget beast, maybe 5% behind the FX 5200 Ultra. This card will smoke an MX440, not to mention it supports DX 8.1. A bad card? For that price? Definitely not. When you compare it to what you got just a year before with the much pricier MX440, this budget beast FX 5200, if picked right, could outperform even an MX460 for half the money while offering DX 8.1 support. That's not bad for the card at the very bottom of the series (the FX 5100 was an OEM-only card).

Now look at both cards. Could someone without proper knowledge tell that we are looking at the slowest and fastest variants of the FX 5200 series? That the one that is a beast is the generic, plain-looking green-PCB one with a common name (Jetway FX5200) and a cheaper-looking box, while the more badass-looking red one, with the badass name and nicer box, is the slowest one possible? Probably not. Most people would blindly pick the Gainward version and thereby choose the slowest FX 5200 possible. So this is, in my opinion, the origin of the "FX 5200 is among the worst cards ever" badge, which I think was unfairly earned.

Attachments: 03_big.jpg, 71iBZIr+8ZL._AC_SL1000_.jpg (public domain)

Reply 85 of 102, by dionb

Rank: l33t++

IMHO the FX 5200 (or the rest of the FX series) shouldn't be here. No, they are not great; they were a disappointment in performance terms, and ATI's alternatives at the time were a better buy. But they still did what they were supposed to do pretty reliably. A "worst" card should fail badly at that, either lacking promised features, support, or stability.

In that sense the S3 Savage 2000 (or if we're talking cards: the Diamond Viper II Z200, just about the only implementation of this chipset) would make more sense, contending with the GeForce256 for the crown of first PC card with T&L. Just one slight issue: no T&L with the default drivers, and if you hacked the drivers to enable it, it was utterly broken.

Then again, with default drivers the Savage 2000 was stable and not a terrible performer, even if it did lack its key feature. There must be worse out there...

Reply 86 of 102, by The Serpent Rider

Rank: l33t++

dionb wrote:

just about the only implementation of this chipset

Inno3D also made those, and I think one other Taiwan/China-based company did too. But they probably just bought unsold chip stock from VIA.

candle_86 wrote:

R600 and all its derivatives had broken AA/AF

First of all, AF wasn't really broken on the R600 series; it just wasn't very spectacular due to the low number of texture units. And secondly, was AA really broken? I doubt that. More likely it was just designed that way to be more flexible with upcoming technologies (deferred rendering, etc.). R600 was a very forward-thinking chip which failed due to a lack of raw processing power.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 87 of 102, by Joseph_Joestar

Rank: l33t
dionb wrote on 2021-08-14, 17:35:

Then again, with default drivers the Savage 2000 was stable and not a terrible performer, even if it did lack its key feature. There must be worse out there...

I never had this card, but one of my buddies did. Or maybe it was a Savage4? I can't remember exactly. Anyhow, back in the day, he had higher performance and better visual quality in UT '99 with his Savage than I did with my TNT2. And his card was a lot cheaper. Granted, this is likely an edge case, but seeing how Unreal Tournament was our favorite game at the time, that memory stayed with me to this day. The S3 Metal API and S3 Texture Compression were both ahead of their time.

On topic, the worst cards for me are the ones that have a completely crippled memory bus (i.e. 64-bit), paired with lower than average core and memory speeds and no indication anywhere that those cost saving measures were applied. Even though some of those cards can be decent performers for retro gaming purposes, the brazen dishonesty of the manufacturers makes my blood boil.
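To show why that combination hurts so much, here is a rough Python sketch of theoretical memory bandwidth; the 128-bit/400 MHz and 64-bit/266 MHz figures are just the FX 5200 variants discussed earlier in this thread, used as an example rather than official specs.

# Theoretical memory bandwidth = bus width (in bytes) x effective DDR rate.
def bandwidth_gb_s(bus_width_bits, ddr_mhz):
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * ddr_mhz * 1e6 / 1e9   # GB/s

print(bandwidth_gb_s(128, 400))   # full-spec card:   ~6.4 GB/s
print(bandwidth_gb_s(64, 266))    # crippled variant: ~2.1 GB/s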

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 88 of 102, by dionb

Rank: l33t++
Joseph_Joestar wrote on 2021-08-14, 18:33:
dionb wrote on 2021-08-14, 17:35:

Then again, with default drivers the Savage 2000 was stable and not a terrible performer, even if it did lack its key feature. There must be worse out there...

I never had this card, but one of my buddies did. Or maybe it was a Savage4? I can't remember exactly. Anyhow, back in the day, he had higher performance and better visual quality in UT '99 with his Savage than I did with my TNT2. And his card was a lot cheaper. Granted, this is likely an edge case, but seeing how Unreal Tournament was our favorite game at the time, that memory stayed with me to this day. The S3 Metal API and S3 Texture Compression were both ahead of their time.

S3 Metal was great - but you could get that with a Savage 4; the 2000 didn't add much there apart from marginally higher bandwidth. What it was supposed to add was T&L, and it didn't.

On topic, the worst cards for me are the ones that have a completely crippled memory bus (i.e. 64-bit), paired with lower than average core and memory speeds and no indication anywhere that those cost saving measures were applied. Even though some of those cards can be decent performers for retro gaming purposes, the brazen dishonesty of the manufacturers makes my blood boil.

Yes, that kind of shenanigans is awful. Nothing inherently wrong with low-spec, unless someone sells it as high-spec. Still, that doesn't make the card itself bad, just the people who sold it.

Another set of video cards I'd nominate: almost all high-end cards (both major vendors) from ~2010 that had a tendency to self-desolder, failing prematurely. Great that you could fix stuff by putting it in the oven, but essentially they were all flawed products, regardless of how great their specs and performance were when they did work.

Reply 89 of 102, by Ydee

Rank: Oldbie
W.x. wrote on 2021-08-14, 13:26:

What is worse, those cards are often "thick" variants, so they look as large and complete as the 128-bit versions (usually the 128-bit versions were thick and the 64-bit versions were thin, PCB-wise). But Gainward masked them so they looked very good: often solid-state capacitors, a thick PCB, a nice design, and a pro-sounding name.

I think they just used full-size PCBs, which they had a lot of; I don't see the intent. Other manufacturers did likewise - PCBs were full-size, with unpopulated positions. It makes no sense to use expensive PCBs with cheap components (slow DDR, etc.). Maybe it was when the FX 5200 (non-Ultra) had a bad reputation and poor sales and they needed to get rid of stock at the lowest possible production cost?

Reply 90 of 102, by W.x.

Rank: Member
Ydee wrote on 2021-08-15, 09:41:

I think they just used full-size PCBs, which they had a lot of; I don't see the intent. Other manufacturers did likewise - PCBs were full-size, with unpopulated positions. It makes no sense to use expensive PCBs with cheap components (slow DDR, etc.). Maybe it was when the FX 5200 (non-Ultra) had a bad reputation and poor sales and they needed to get rid of stock at the lowest possible production cost?

Well, another reason could be the three connectors... if they used a thin PCB design, the connector on the side would need to be attached through a cable, and I've seen that design many times. But maybe that came later. Maybe in the beginning, before the 6600 series, having more connectors was solved by using a large PCB. Because it's true, I didn't see it on Nvidia cards before the FX 5200.

Reply 91 of 102, by 386SX

Rank: l33t

I would not say the Savage2000 cards (I think the Diamond Viper II wasn't the only one; I seem to remember a 64MB variant of the card, maybe built later, I don't know if anyone else remembers it) should be listed here. It probably couldn't deliver what was promised, and something went seriously wrong in the design of its "famous" T&L engine, but at least with the latest drivers I remember it having very good VGA output quality and being quite a fast DX6 card for games of that time.
Besides the Metal API and S3TC, in the end it wasn't that bad, and it deserves to be remembered as the only one that tried to compete "in time" with the GeForce on a "similar" feature level. The real competitors came later, but at that time S3 at least fought one last battle in the discrete graphics card market. It reminds me of an old sports movie quote, after the main character lost the final round and the other player said, "I gotta hand it to you, Roy. When you go down, you go down in flames." That was the way the Savage2000 lost. 😁

Reply 92 of 102, by Gmlb256

Rank: l33t
386SX wrote on 2021-08-19, 17:04:

I would not say the Savage2000 cards (I think the Diamond Viper II wasn't the only one; I seem to remember a 64MB variant of the card, maybe built later, I don't know if anyone else remembers it) should be listed here. It probably couldn't deliver what was promised, and something went seriously wrong in the design of its "famous" T&L engine, but at least with the latest drivers I remember it having very good VGA output quality and being quite a fast DX6 card for games of that time.
Besides the Metal API and S3TC, in the end it wasn't that bad, and it deserves to be remembered as the only one that tried to compete "in time" with the GeForce on a "similar" feature level. The real competitors came later, but at that time S3 at least fought one last battle in the discrete graphics card market. It reminds me of an old sports movie quote, after the main character lost the final round and the other player said, "I gotta hand it to you, Roy. When you go down, you go down in flames." That was the way the Savage2000 lost. 😁

The only redeeming thing about the S3 Savage was the S3TC texture compression format, which later became widely adopted by hardware vendors for use with Direct3D and OpenGL.

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce3 Ti 200 64 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 93 of 102, by 386SX

Rank: l33t
Gmlb256 wrote on 2021-08-19, 17:34:

The only redeeming thing about the S3 Savage was the S3TC texture compression format, which later became widely adopted by hardware vendors for use with Direct3D and OpenGL.

The S3 Savage video chips weren't a bad solution when those times are seen from a modern point of view. They had most of the features even 3dfx chips still didn't have, besides being around the speed of a Voodoo II, and they were also cheaper cards. They had low power requirements, some well-built brands used good PCB designs, and at a time when DVD/MPEG-2 decoding was an expected feature they had Motion Compensation for MPEG-2 decoding. I think they never tried to be the "best" cards, whereas the Savage2000 maybe wanted to, but it was still too late, or too soon, for such a move.
Thinking back to those times, I wish I had bought a Savage4 card instead of the expensive Voodoo3 2000 I bought at release for my slow K6-II computer.

Reply 94 of 102, by Gmlb256

Rank: l33t
386SX wrote on 2021-08-19, 17:49:

The S3 Savage video chips weren't a bad solution when those times are seen from a modern point of view. They had most of the features even 3dfx chips still didn't have, besides being around the speed of a Voodoo II, and they were also cheaper cards. They had low power requirements, some well-built brands used good PCB designs, and at a time when DVD/MPEG-2 decoding was an expected feature they had Motion Compensation for MPEG-2 decoding. I think they never tried to be the "best" cards, whereas the Savage2000 maybe wanted to, but it was still too late, or too soon, for such a move.
Thinking back to those times, I wish I had bought a Savage4 card instead of the expensive Voodoo3 2000 I bought at release for my slow K6-II computer.

For me, the Voodoo cards (especially the later ones) were pure brute force in comparison to what other hardware vendors were doing (MPEG-2 decoding, hardware T&L, tile-based rendering, etc.), and this led to their downfall in the long run. I believe it was a lack of vision and poor management at 3Dfx when dealing with the future of graphics hardware.

Their earlier products were great, but you can already notice elements of brute force in the Voodoo2.

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce3 Ti 200 64 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 95 of 102, by BitWrangler

Rank: l33t++
386SX wrote on 2021-08-19, 17:49:

Thinking back to those times, I wish I had bought a Savage4 card instead of the expensive Voodoo3 2000 I bought at release for my slow K6-II computer.

While I respect that the Savage is technologically interesting (heck, I was eager for its release back in the day after all the teasers about it), I can't say I regret buying my V3 2000. I was a few months late to the party on that, some while after release, and got it for $50 cash with stacking coupons. It ran great in my K6-2 450 machine. I was playing coverdisk demos into 2002-ish; I think Battlefield 1942 was the first thing I couldn't run, though by then I already had plans to get an Athlon XP system. But for a number of years after, it was still good for budget/bargain-bin games (I wish those were still a thing, on physical media, in stores, no activation). Given I've still got the box for sure and could probably round up the CD and everything, I could probably make a profit selling it, even considering how much compound interest I'd have gotten if I'd stuck $50 in the bank.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 96 of 102, by 386SX

Rank: l33t
Gmlb256 wrote on 2021-08-19, 18:54:

For me, the Voodoo cards (especially the later ones) were pure brute force in comparison to what other hardware vendors were doing (MPEG-2 decoding, hardware T&L, tile-based rendering, etc.), and this led to their downfall in the long run. I believe it was a lack of vision and poor management at 3Dfx when dealing with the future of graphics hardware.

Their earlier products were great, but you can already notice elements of brute force in the Voodoo2.

IMHO, just as fast as the Voodoo brand gained so much deserved popularity, it was lost once the STB move and the Avenger release exposed the weak point every competitor was probably waiting for.
The Riva 128 chip should have been a serious alarm for the future, but it seems even the TNT chip and the other brands' chips weren't taken seriously enough to push a completely new design. IMHO the Avenger should never have been called Voodoo3; that brand was too powerful and too useful to spend on such a "boosted Banshee". And after that, a very late card named Voodoo"5" and an even later Voodoo"4" as a GeForce2 MX competitor? At what release price?
On the marketing side there was no way the Voodoo3 could be convincing as a next-generation card. The only real selling point was having Voodoo2 SLI speed on a single chip, but single-chip cards already existed, even from the same company! I remember every review underlined the 16-bit "problem"; even if it was far from a real problem, it was one on the marketing side, and nothing could solve that. The Avenger might have made sense if it had been called Banshee II, taking the time to release a real GPU. No other moves should have been made, and there was really little time left to do something.

Reply 97 of 102, by 386SX

Rank: l33t
BitWrangler wrote on 2021-08-19, 19:13:

While I respect that the Savage is technologically interesting (heck, I was eager for its release back in the day after all the teasers about it), I can't say I regret buying my V3 2000. I was a few months late to the party on that, some while after release, and got it for $50 cash with stacking coupons. It ran great in my K6-2 450 machine. I was playing coverdisk demos into 2002-ish; I think Battlefield 1942 was the first thing I couldn't run, though by then I already had plans to get an Athlon XP system. But for a number of years after, it was still good for budget/bargain-bin games (I wish those were still a thing, on physical media, in stores, no activation). Given I've still got the box for sure and could probably round up the CD and everything, I could probably make a profit selling it, even considering how much compound interest I'd have gotten if I'd stuck $50 in the bank.

I also still have the original box, plus another Voodoo3 2000 and a 3000 I collected. I have good memories of running Half-Life and other earlier games on it, but still, the K6-2 350 MHz couldn't push it nearly enough, and even with a K6-2 550 MHz things weren't much better. I really wanted to see 3dfx come back with high-end products, but I remember feeling like things were never advanced enough for the very fast pace of that period. I remember the time it took for the first VSA-100 release was quite painful to follow in the tech papers that covered it. I don't remember how many months it took, but I seem to remember it was quite a lot.
The Voodoo3 was indeed a fast chip, but technically obsolete; it might have been fine if released earlier, only as a low- and high-end solution at 183 MHz, for a short time (and under another name). No 3500 with or without TV, no other moves, and then the whole Voodoo architecture should, IMHO, have been left in the past. The Voodoo2 was already a risk, saved IMHO by the SLI concept, which really was the main (good) point there, and which ironically probably made the future Avenger chip look slower than it would have appeared if V2 SLI hadn't existed.

Reply 98 of 102, by Gmlb256

Rank: l33t
386SX wrote on 2021-08-19, 20:23:

I remember every review underlined the 16-bit "problem"; even if it was far from a real problem, it was one on the marketing side, and nothing could solve that.

The thing with the 16-bit color mode on 3Dfx cards was that they used dithering techniques so it wouldn't look as ugly as plain 16-bit output, and this led people to say the image quality was blurred and smeared.
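For anyone curious what dithering down to 16-bit looks like in code, here is a small, generic ordered-dither sketch in Python/NumPy that quantizes an RGB image to 5-6-5; it only illustrates the general technique and is not 3Dfx's actual hardware filter.

import numpy as np

# 4x4 Bayer threshold matrix, normalized to [0, 1).
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def dither_to_rgb565(img):
    # img: float RGB image in [0, 1], shape (H, W, 3)
    h, w, _ = img.shape
    thresh = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w, None]
    levels = np.array([31, 63, 31])         # 5/6/5 bits per channel
    q = np.floor(img * levels + thresh)     # threshold noise breaks up banding
    return np.clip(q, 0, levels) / levels   # back to [0, 1] floats for display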

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce3 Ti 200 64 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 99 of 102, by 386SX

Rank: l33t
Gmlb256 wrote on 2021-08-19, 20:43:

The thing with the 16-bit color mode on 3Dfx cards was that they used dithering techniques so it wouldn't look as ugly as plain 16-bit output, and this led people to say the image quality was blurred and smeared.

But in the end the point wasn't the almost invisible difference between 16-bit and 32-bit (nowhere near the jump from 256 colors to 65,536 colors); it sure was a problem on the important marketing side to explain why a new card/chip called Voodoo3, after the great success of both previous solutions, had basically the same features and almost nothing really new. Even something not really useful, at least a Motion Compensation engine or whatever, might have helped. Instead, things like the 3000's "DVD playback assist" feature were really difficult to accept when even a cheap Savage4 had almost everything the market needed.
Anyway, I liked 3dfx cards; I wanted a Voodoo and a Voodoo II so much, but I could not afford a new system until 1998, so I went for the Voodoo3 and was still hoping for the VSA-100, even if I do remember the feelings that came from the various reviews.