VOGONS


Worst video card ever


First post, by sliderider

So I found this ancient thread over at Arstechnica today

http://arstechnica.com/civis/viewtopic.php?f= … f83689116f6001d

And thought I'd bring the discussion here. A lot more video cards have been released since then, so: what is the worst video card ever? It's not really fair to compare a card to one that came out much later and is obviously faster or better, so try to limit your "bad" video card comparisons to other cards of the same time period. I'd also define worst as a card that utterly failed to live up to its manufacturer's claims, was broken in some way, or tried and failed to move the industry in a new direction by introducing some new proprietary technology that was never widely adopted and died out from lack of support.

So, I'm sure most will agree that just about any card with the Trident name on it is a piece of crap. I also put the Intel i740 on the list along with the Matrox Parhelia.

Reply 1 of 43, by PowerPie5000

3DFX Voodoo Rush was a mistake in my opinion... And any S3 ViRGE card was abysmal when it came to 3D games.

The i740 was not too bad and better than most people think. It was better than the ATI Rage II/Pro and 3DLabs Permedia 2 for gaming (back in its day).

Reply 4 of 43, by Old Thrashbarg

PowerPie5000 wrote:

The i740 was not too bad and better than most people think. It was better than the ATI Rage II/Pro and 3DLabs Permedia 2 for gaming (back in its day).

Yeah, on some cards you have to consider context. Sure, the i740 was no match for the Voodoo2; it only competed with older/lower-end chips like the Rage and Riva 128. But it was also a $60 card.

However, I do consider the i740 a huge failure, because it didn't do what it was supposed to. It was originally released mainly to demo the new AGP interface. Well, having a slow card for a tech demo is bad enough, but what made it even worse was the fact that the PCI version of the card was faster. Oops.

Reply 6 of 43, by sliderider

F2bnp wrote:

I pick Nvidia Geforce FX 5200. It boasted DirectX 9 features and was slower than the GeForce 4 MX. Great plan!

If you look at the FX5500, it is the same chipset as the FX5200 (NV34) but with faster clocks. I think nVidia made up the FX5200 as an outlet for NV34 chips that couldn't pass quality control at FX5500 speeds. Even the FX5500, though, still sucked compared to the Radeon 9500/9700. Those Radeons completely changed the graphics card game when they were released.

Reply 7 of 43, by PowerPie5000

sliderider wrote:
F2bnp wrote:

I pick Nvidia Geforce FX 5200. It boasted DirectX 9 features and was slower than the GeForce 4 MX. Great plan!

If you look at the FX5500, it is the same chipset as the FX5200 (NV34) but with faster clocks. I think nVidia made up the FX5200 as an outlet for NV34 chips that couldn't pass quality control at FX5500 speeds. Even the FX5500, though, still sucked compared to the Radeon 9500/9700. Those Radeons completely changed the graphics card game when they were released.

I pretty much bypassed the whole dreadful "FX" range and stuck with ATI during that time 😀

The GeForce 4 MX was also very poor for its time of release... and misleading too! I think it's just a tweaked GF2.

Reply 8 of 43, by swaaye

sliderider wrote:

If you look at the FX5500, it is the same chipset as the FX5200 (NV34) but with faster clocks. I think nVidia made up the FX5200 as an outlet for NV34 chips that couldn't pass quality control at FX5500 speeds.

The GeForce FX 5200 Ultra came out with the 5200 launch and has the highest clocks of any of those NV34 cards. I think that the 5500 is just a plain 5200 with a slight clock bump and a name change to get the clueless folks at Best Buy excited. 😀

Reply 9 of 43, by Old Thrashbarg


Yeah, the original "real" 5200 and 5200 Ultras really weren't bad cards considering their market position. It was mainly the cut-down versions that gave the series a bad name... the low-clocked ones, and especially the 64-bit versions.

Reply 11 of 43, by PowerPie5000


I almost forgot about the horrible Number Nine "Revolution IV", which uses the "Ticket to Ride IV" chipset.

It was released around the time of the Voodoo 3, Nvidia TNT2, S3 Savage 4, Matrox G400 and ATI Rage 128... it could not compete with any of them! It was slow, expensive, had iffy 3D quality and came with the worst drivers I've ever had the misfortune of using! It even shipped without OpenGL support 😒

Reply 12 of 43, by sliderider

swaaye wrote:
sliderider wrote:

If you look at the FX5500, it is the same chipset as the FX5200 (NV34) but with faster clocks. I think nVidia made up the FX5200 as an outlet for NV34 chips that couldn't pass quality control at FX5500 speeds.

The GeForce FX 5200 Ultra came out with the 5200 launch and has the highest clocks of any of those NV34 cards. I think that the 5500 is just a plain 5200 with a slight clock bump and a name change to get the clueless folks at Best Buy excited. 😀

I thought the Ultra came out after and replaced the original 5200 and 5500.

I wouldn't mind finding a Gainward Golden Sample version of the 5200 Ultra, though. That one had the highest clocks of any NV34, period.

Last edited by sliderider on 2010-08-17, 12:41. Edited 1 time in total.

Reply 14 of 43, by sliderider

leileilol wrote:
swaaye wrote:

Consider this and think about 2002-3 gaming.

NV NV3x : D3D9 😵 / D3D8 🙁 / OpenGL 😵
ATI R3x0 : D3D9 😁 / D3D8 😁 / OpenGL 😀

Corrected. I'd also nominate the FX series.

And let's not forget the hair dryer they called a fan on the FX5800 Ultra.

http://www.youtube.com/watch?v=G3Auqto8FpE

Reply 15 of 43, by Zup


My vote is for SiS 6326 PCI video cards. Most of the cards you've mentioned actually work, but as far as I remember those SiS cards were incompatible with some Intel chipsets (I don't remember if it was HX, TX or VX).


Reply 16 of 43, by Tetrium

eL_PuSHeR wrote:

I had one SiS 6326 at one time and it had the best VESA 2.0 implementation I have seen.

Well, that would make at least the PCI version of some use then 😀

The noise the FX5800 made was totally terrible. I'd never want a card like that in any of my systems. It's noisy and puts out a lot of heat.

Ok, I got another one...anyone ever tried the Blade 3D?

Btw, to put a twist on the original topic: is there any graphics card in existence that is basically useless?

There's no worse graphics card than one that's completely useless.
This might include graphics cards that had horrible drivers, horrible OS and game support, and graphics cards that were very prone to hardware failure.

If we were to look at it that way, then one card that would be almost useless would be...the Rampage!!! 😁 (I'd still want one though hehe 😁 )

Reply 17 of 43, by bushwack


Any card using the Cirrus Logic Laguna 3D (CL-GD5464).

I bought a Techworks Ultimate 3D back in 1997 thinking it was going to kick ass because it used Rambus memory like the Nintendo 64. Techworks was nice enough to give me a full refund. The customer service guy asked me why I wanted to return the card, and I told him it was because of its horrible performance. He agreed.

Unfortunately I bought the card before this issue of Boot came out (Aug 1997 issue 12).

laguana3d_boot12.jpg

Reply 19 of 43, by SquallStrife


I'd have to nominate the Ark PCI card I had in my first Pentium system. It was utterly useless.

2MB of VRAM, no VBE support, unstable drivers for the first year we had it, got very hot for no reason.

It was replaced with a Tseng ET6000 before long.