VOGONS


Worst video card ever, again


Reply 60 of 102, by BSA Starfire

Rank Oldbie
vlask wrote:

Wanted to test the Matrox G100 too, but my cards won't boot into Windows 98 SE (they do into Windows XP). Dunno why. So I skipped testing them...

I've only run my G100 under Windows ME & Windows 2000. It works, but looks horrid; it's pretty much a turbo Mystique 220, but the 2D is not as crisp and sharp as the Mystique either.

286 20MHz,1MB RAM,Trident 8900B 1MB, Conner CFA-170A.SB 1350B
386SX 33MHz,ULSI 387,4MB Ram,OAK OTI077 1MB. Seagate ST1144A, MS WSS audio
Amstrad PC 9486i, DX/2 66, 16 MB RAM, Cirrus SVGA,Win 95,SB 16
Cyrix MII 333,128MB,SiS 6326 H0 rev,ESS 1869,Win ME

Reply 61 of 102, by nforce4max

Rank l33t
swaaye wrote:

I only think of AT3D as that chip used on some Voodoo Rush boards.

Pretty much, and I've got two of them. It runs hot in 2D games, and just like in the videos the image quality sucks. It works OK in some games on the 2D side, but it is what it is. There is only one other chip out there that was actually worse, a pretty rare one by Macronix; I had a chance to buy one at a fair price and let it pass. The 3D side of the Voodoo Rush isn't bad at all, actually rather decent, with no complaints other than getting hot like everything else from that era.

On a far away planet reading your posts in the year 10,191.

Reply 62 of 102, by BSA Starfire

Rank Oldbie
FaSMaN wrote:

That's precisely the problem: the market here was flooded with the cheap Chinese-manufactured variants of it, which is why it caused a problem, especially in the 1998-2001 era when some games required a 3D card.

The 6326 was OK in 1998-2000, and the fact that SiS carried on updating the drivers later was a bonus. But compared to an Nvidia TNT (or even a RIVA 128), a Voodoo3 or a Kyro it struggled.
A lot of boards had it on board, though: the SiS 530 chipset had it embedded with shared memory, and others like ECS and the PCChips M590 had 8MB variants on board.
I can only recall one game of the era that refused to run on the 6326, and that was Driver by Criterion, and it even stated it on the box.

286 20MHz,1MB RAM,Trident 8900B 1MB, Conner CFA-170A.SB 1350B
386SX 33MHz,ULSI 387,4MB Ram,OAK OTI077 1MB. Seagate ST1144A, MS WSS audio
Amstrad PC 9486i, DX/2 66, 16 MB RAM, Cirrus SVGA,Win 95,SB 16
Cyrix MII 333,128MB,SiS 6326 H0 rev,ESS 1869,Win ME

Reply 63 of 102, by vlask

Rank Member

Done with the AT3D; videos from Unreal, Forsaken, the X demo and Expendable are now online as well.

https://www.youtube.com/playlist?list=PLOeoPV … G20fBVYn2fynUaa

Now I've moved on to the Matrox MGA Millennium 4MB PCI. Luckily for Matrox, they never sold it as a 3D gaming card, because this one is missing texture support. Still, some games have no hardware checks, so if you hate modern fancy stuff like textures 😎 or fog/flare effects, this beauty would be your first choice... Only videos from Turok, Incoming and the X demo were made; the other applications had checks, so they wouldn't run.

https://www.youtube.com/playlist?list=PLOeoPV … oqZWl07k9-21GV8

Not only my own graphics card collection at http://www.vgamuseum.info

Reply 64 of 102, by vlask

Rank Member

New videos from another two cards you really hated to have back then... the 3Dlabs Permedia

https://www.youtube.com/playlist?list=PLOeoPV … M_7BrH0xz_IsNxD

and Trident 3DImàge 9750

https://www.youtube.com/playlist?list=PLOeoPV … lPyzglCKMdJbxKU

Not only my own graphics card collection at http://www.vgamuseum.info

Reply 65 of 102, by BSA Starfire

Rank Oldbie

Totally bizarre-looking transparency in Final Reality with the Permedia! I remember running that on a Diamond Fire GL Pro AGP (Permedia 2) and it was OK; they must have made a lot of changes for the "2", they obviously fixed it in a big way!

I ran a Trident 3D Image 4MB AGP recently too, and I'm fairly sure my performance was much better than that; I think the visuals were better too (I know I used the drivers from vintage3d.org). What system are you running these cards in? I used a Duron 650 MHz on a DFI AK74, which frankly was cheating, as it's much faster than machines from the day of the cards' release.
Anyhow, thanks for sharing, very interesting; you have a subscriber here 😀
Best,
Chris

286 20MHz,1MB RAM,Trident 8900B 1MB, Conner CFA-170A.SB 1350B
386SX 33MHz,ULSI 387,4MB Ram,OAK OTI077 1MB. Seagate ST1144A, MS WSS audio
Amstrad PC 9486i, DX/2 66, 16 MB RAM, Cirrus SVGA,Win 95,SB 16
Cyrix MII 333,128MB,SiS 6326 H0 rev,ESS 1869,Win ME

Reply 66 of 102, by BSA Starfire

Rank Oldbie

So is there any game that Alliance AT3D can handle acceptably?

286 20MHz,1MB RAM,Trident 8900B 1MB, Conner CFA-170A.SB 1350B
386SX 33MHz,ULSI 387,4MB Ram,OAK OTI077 1MB. Seagate ST1144A, MS WSS audio
Amstrad PC 9486i, DX/2 66, 16 MB RAM, Cirrus SVGA,Win 95,SB 16
Cyrix MII 333,128MB,SiS 6326 H0 rev,ESS 1869,Win ME

Reply 67 of 102, by vlask

Rank Member
BSA Starfire wrote:

So is there any game that Alliance AT3D can handle acceptably?

Only Forsaken, if you don't mind the blinking textures. But I don't have many games for testing; the screenshots at vintage3d may tell you more...

http://vintage3d.org/images/AT3D/gallery.php

Not only my own graphics card collection at http://www.vgamuseum.info

Reply 68 of 102, by vlask

Rank Member
BSA Starfire wrote:

I ran a Trident 3D Image 4MB AGP recently too, and I'm fairly sure my performance was much better than that; I think the visuals were better too (I know I used the drivers from vintage3d.org). What system are you running these cards in? I used a Duron 650 MHz on a DFI AK74, which frankly was cheating, as it's much faster than machines from the day of the cards' release.
Anyhow, thanks for sharing, very interesting; you have a subscriber here 😀
Best,
Chris

Pentium 3 1GHz Coppermine, Gigabyte GA-60XET rev. 1, Intel 815EP, 512MB SDR, Win98SE. The videos were done on a faster card I got recently (DFI, 91MHz memory clock); the benchmarks I did on a slower OEM version with 50MHz memory. Not a big change, though.

Not only my own graphics card collection at http://www.vgamuseum.info

Reply 69 of 102, by BSA Starfire

Rank Oldbie
vlask wrote:
BSA Starfire wrote:

So is there any game that Alliance AT3D can handle acceptably?

Only Forsaken, if you don't mind the blinking textures. But I don't have many games for testing; the screenshots at vintage3d may tell you more...

http://vintage3d.org/images/AT3D/gallery.php

I just read through the vintage3d review again and looked at the screenshots; it doesn't look like it can run anything right. This could well be the "worst" 3D card.

286 20MHz,1MB RAM,Trident 8900B 1MB, Conner CFA-170A.SB 1350B
386SX 33MHz,ULSI 387,4MB Ram,OAK OTI077 1MB. Seagate ST1144A, MS WSS audio
Amstrad PC 9486i, DX/2 66, 16 MB RAM, Cirrus SVGA,Win 95,SB 16
Cyrix MII 333,128MB,SiS 6326 H0 rev,ESS 1869,Win ME

Reply 70 of 102, by BSA Starfire

Rank Oldbie
vlask wrote:
BSA Starfire wrote:

I ran a Trident 3D Image 4MB AGP recently too, and I'm fairly sure my performance was much better than that; I think the visuals were better too (I know I used the drivers from vintage3d.org). What system are you running these cards in? I used a Duron 650 MHz on a DFI AK74, which frankly was cheating, as it's much faster than machines from the day of the cards' release.
Anyhow, thanks for sharing, very interesting; you have a subscriber here 😀
Best,
Chris

Pentium 3 1GHz Coppermine, Gigabyte GA-60XET rev. 1, Intel 815EP, 512MB SDR, Win98SE. The videos were done on a faster card I got recently (DFI, 91MHz memory clock); the benchmarks I did on a slower OEM version with 50MHz memory. Not a big change, though.

So a little bit quicker than my Duron, maybe. I will have to re-run some tests with my Trident 3D Image; I'm sure it was better than that. I know I ditched it because it was dreadful at NFS 3, so I installed an STB Velocity RIVA 128 instead.

286 20MHz,1MB RAM,Trident 8900B 1MB, Conner CFA-170A.SB 1350B
386SX 33MHz,ULSI 387,4MB Ram,OAK OTI077 1MB. Seagate ST1144A, MS WSS audio
Amstrad PC 9486i, DX/2 66, 16 MB RAM, Cirrus SVGA,Win 95,SB 16
Cyrix MII 333,128MB,SiS 6326 H0 rev,ESS 1869,Win ME

Reply 71 of 102, by nforce4max

Rank l33t
BSA Starfire wrote:

So is there any game that Alliance AT3D can handle acceptably?

There are some: Blade Runner and Age of Empires work fine, and there are others where the image quality wasn't great, but nowhere near as bad as in the videos.

On a far away planet reading your posts in the year 10,191.

Reply 72 of 102, by Putas

Rank Oldbie
nforce4max wrote:
BSA Starfire wrote:

So is there any game that Alliance AT3D can handle acceptably?

There are some: Blade Runner and Age of Empires work fine, and there are others where the image quality wasn't great, but nowhere near as bad as in the videos.

Software renderers are fine; 3D-accelerated games look horrible. For me it is also the worst 3D card. However, we have yet to see the first Mpact in action, if that counts as a 3D card.

Reply 73 of 102, by candle_86

Rank l33t

The HD 2900 XT consumed twice as much power as an 8800 GTX, you could cook a steak on it in 5 minutes, and it performed worse than the 8800 GTS 320.

Reply 74 of 102, by Skyscraper

Rank l33t
candle_86 wrote:

The HD 2900 XT consumed twice as much power as an 8800 GTX, you could cook a steak on it in 5 minutes, and it performed worse than the 8800 GTS 320.

While that was the case back then, making your point valid, I doubt the HD 2900 XT would be slower than even the 8800 GTX if benched with newer drivers and a really fast CPU. ATI/AMD had huge issues with CPU overhead back then, and for some reason they still do today. This is why AMD GPUs tend to be more competitive at the end of their retail lifecycle than at the start; the Radeon HD 7970 ---> Radeon R9 280X is a good example.

When I happen to stumble on an HD 2900 XT I will look into this.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 75 of 102, by nforce4max

Rank l33t
candle_86 wrote:

The HD 2900 XT consumed twice as much power as an 8800 GTX, you could cook a steak on it in 5 minutes, and it performed worse than the 8800 GTS 320.

Period drivers were absolutely horrendous and even caused issues with VLC (like, wtf), but most of us already know that ATI/AMD drivers always start out like this. At least they get better with age, like booze. 😀

On a far away planet reading your posts in the year 10,191.

Reply 76 of 102, by sunaiac

Rank Oldbie
candle_86 wrote:

The HD 2900 XT consumed twice as much power as an 8800 GTX, you could cook a steak on it in 5 minutes, and it performed worse than the 8800 GTS 320.

Except in real life it was 346W (2900) vs 314W (8800), and the 2900 was 26% faster than the 8800 GTS 320.
But why tell the truth when FUD is so much more interesting?

(http://www.hardware.fr/articles/671-1/ati-rad … hd-2900-xt.html)

R9 3900X/X470 Taichi/32GB 3600CL15/5700XT AE/Marantz PM7005
i7 980X/R9 290X/X-Fi titanium | FX-57/X1950XTX/Audigy 2ZS
Athlon 1000T Slot A/GeForce 3/AWE64G | K5 PR 200/ET6000/AWE32
Ppro 200 1M/Voodoo 3 2000/AWE 32 | iDX4 100/S3 864 VLB/SB16

Reply 77 of 102, by vlask

Rank Member

Added the Matrox Mystique 220; not a good-looking image, but at least it's a bit faster... (sometimes).

https://www.youtube.com/playlist?list=PLOeoPV … 4GJloQ2a0GmNh7f

Not only my own graphics card collection at http://www.vgamuseum.info

Reply 78 of 102, by candle_86

Rank l33t
Skyscraper wrote:
candle_86 wrote:

The HD 2900 XT consumed twice as much power as an 8800 GTX, you could cook a steak on it in 5 minutes, and it performed worse than the 8800 GTS 320.

While that was the case back then, making your point valid, I doubt the HD 2900 XT would be slower than even the 8800 GTX if benched with newer drivers and a really fast CPU. ATI/AMD had huge issues with CPU overhead back then, and for some reason they still do today. This is why AMD GPUs tend to be more competitive at the end of their retail lifecycle than at the start; the Radeon HD 7970 ---> Radeon R9 280X is a good example.

When I happen to stumble on an HD 2900 XT I will look into this.

Nah, the HD 2900 XT stayed slower than the HD 3870 until the end; it never caught up to the 3870, so it was never even as fast as an 8800 GT.

Reply 79 of 102, by havli

Rank Oldbie

Indeed, new drivers make very little difference in performance terms. I did the testing a few years ago, and the result is that the 8800 GTX is 55% faster at 1600x1200 with 4xAA/16xAF, or 39% faster at 1600x1200 with no AA/16xAF.

This is the average of 17 games, with pretty much zero CPU limitation (i5 2500K @ 4.5 GHz). More info here: http://hw-museum.cz/benchmark-3-1.php

Power consumption on the other hand is not that bad. 8800 GTX = 147W idle / 302W load. 2900 XT = 145W idle / 290W load.

HW museum.cz - my collection of PC hardware