VOGONS


First post, by noshutdown

User metadata
Rank Oldbie

By "worst" I mean incomplete 3D features, slow performance, buggy drivers and broken graphics; cards that can run D3D and OpenGL properly don't count.
some candidates are:
s3virge
cirrus5464/5465
alliance at3d
trident9750
ati rage
chromatic mpact
matrox mystique
and some others.
The NV1 is an exception: it did perform badly, but it was designed to map quadrilaterals rather than render triangles, so it belongs to another forgotten world and shouldn't compete in this lane at all.

Reply 1 of 35, by Kruton 9000

User metadata
Rank Newbie

The most broken card by far is the Alliance ProMotion AT3D.
But you're also asking about OpenGL support. Video cards with official Direct3D and OpenGL support generally support them properly. Early D3D cards mostly lacked OpenGL drivers and relied on miniGL libraries or OpenGL-to-[some 3D API] wrappers.

Reply 3 of 35, by 386SX

User metadata
Rank l33t

The AT3D wasn't that broken, and it was a 1997 accelerator, not really among the first ones in the end. Maybe the early ATI Rage was closer to that.

Reply 4 of 35, by bakemono

User metadata
Rank Oldbie

Do mobile chips count? Trident 9525 was a bad one. Some of the driver releases had D3D support disabled by default. Also, S3 Virge MX.

GBAJAM 2024 submission on itch: https://90soft90.itch.io/wreckage

Reply 5 of 35, by Postman5

User metadata
Rank Member

Hi,
the Savage4 LT is also a contender.
The Unreal menu is not displayed correctly (DirectX mode).
The game itself looks good; other games, for example Quake 2, work without problems.
Number Nine SR9 SD
Win98, driver ChroMetal45_9xme

Reply 6 of 35, by PD2JK

User metadata
Rank Oldbie

Disclaimer: I was 8 years old when this happened.

The worst experience I had with 3D was Forsaken running on an unknown Soltek mainboard with a K6-2 500 and a Riva 128ZX (or TNT?). Objects were stretched and displayed incorrectly.

So the PC went back to the store, my parents contributed more money, and we traded it for an Athlon 700 + TNT2 M64. The mainboard is a Gigabyte GA-7IXE.

Years later, I realized it could have been just a driver issue, since Forsaken does support the card.

Still have that system, running stable.

i386 16 ⇒ i486 DX4 100 ⇒ Pentium MMX 200 ⇒ Athlon Orion 700 | TB 1000 ⇒ AthlonXP 1700+ ⇒ Opteron 165 ⇒ Dual Opteron 856

Reply 7 of 35, by wierd_w

User metadata
Rank Oldbie

(Knows this is about discrete cards/graphics. Cannot contain the emotional outrage. Posts OT anyway.)

At one point, I had an otherwise fantastic Thinkpad, that had good dos sound, and a nice LCD, but the fucking thing had a Neomagic Magicgraph video chipset, which had directx support, BUT NO 3D FEATURES AT ALL.

I had to cheese anything that wanted to render to a D3D interface by using SwiftShader.

It was awful.

As for grunky 'actually 3d featured' cards, some of the cards from the 90s rendered with quads instead of triangles, and did real strange stuff when their preferred API was not called.

Reply 8 of 35, by noshutdown

User metadata
Rank Oldbie
Postman5 wrote on 2024-11-15, 14:20:

Hi,
the Savage4 LT is also a contender.
The Unreal menu is not displayed correctly (DirectX mode).
The game itself looks good; other games, for example Quake 2, work without problems.
Number Nine SR9 SD
Win98, driver ChroMetal45_9xme

It's entirely the drivers' fault that the Savage4 didn't do well; S3 was never able to develop proper 3D drivers, from the start to the end. Other than that it's a decent card, at least much better than the previous Savage3D, which in turn is far better than the ViRGE, and even the ViRGE is probably still not at the bottom.

Reply 9 of 35, by leileilol

User metadata
Rank l33t++

Probably the original S3 Virge 325

- perspective texturing is super slow
- no blending functions
- no alpha modulation
- very strange depth precision issues
- no stencil/depth ops (that i know of)
- no alphatests
- tries to dither alpha'd out texels 🤣
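
To put that list in concrete terms, here's roughly the per-fragment state a game asks for through plain OpenGL 1.1 - purely an illustration of what blending, alpha modulation and alpha testing mean, not the S3D/D3D path the ViRGE was actually driven through:

    #include <GL/gl.h>

    /* Illustration only: the standard GL 1.1 per-fragment state that trips up
       ViRGE-class hardware. Assumes a current GL context already exists. */
    static void setup_transparent_pass(void)
    {
        glEnable(GL_TEXTURE_2D);
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); /* alpha/colour modulation */
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);           /* a blending function */
        glEnable(GL_ALPHA_TEST);
        glAlphaFunc(GL_GREATER, 0.5f);                                /* alpha test, instead of dithering texels away */
        glEnable(GL_DEPTH_TEST);
        glDepthFunc(GL_LEQUAL);                                       /* depth op */
    }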

long live PCem

Reply 10 of 35, by noshutdown

User metadata
Rank Oldbie
PD2JK wrote on 2024-11-15, 14:37:

Disclaimer: I was 8 years old when this happened.

The worst experience I had with 3D was Forsaken running on an unknown Soltek mainboard with a K6-2 500 and a Riva 128ZX (or TNT?). Objects were stretched and displayed incorrectly.

So the PC went back to the store, my parents contributed more money, and we traded it for an Athlon 700 + TNT2 M64. The mainboard is a Gigabyte GA-7IXE.

Years later, I realized it could have been just a driver issue, since Forsaken does support the card.

Still have that system, running stable.

C'mon, the Riva 128 is a cult card for me. Not only is it NVIDIA's first successful card, it was also the fastest card for a short while, until the Voodoo2's release.
The Riva 128 is also the second fastest of all first-generation 3D cards (those with a single rendering pipeline), behind only the Matrox G200, which came several months later and cost more.

Reply 11 of 35, by PD2JK

User metadata
Rank Oldbie
noshutdown wrote on 2024-11-15, 15:11:
PD2JK wrote on 2024-11-15, 14:37:

Disclaimer: I was 8 years old when this happened.

The worst experience I had with 3D was Forsaken running on an unknown Soltek mainboard with a K6-2 500 and a Riva 128ZX (or TNT?). Objects were stretched and displayed incorrectly.

So the PC went back to the store, my parents contributed more money, and we traded it for an Athlon 700 + TNT2 M64. The mainboard is a Gigabyte GA-7IXE.

Years later, I realized it could have been just a driver issue, since Forsaken does support the card.

Still have that system, running stable.

C'mon, the Riva 128 is a cult card for me. Not only is it NVIDIA's first successful card, it was also the fastest card for a short while, until the Voodoo2's release.
The Riva 128 is also the second fastest of all first-generation 3D cards (those with a single rendering pipeline), behind only the Matrox G200, which came several months later and cost more.

Yeah sorry, maybe I was a bit off topic then. It wasn't the card. Just a bad experience which could've been solved with a newer driver.

i386 16 ⇒ i486 DX4 100 ⇒ Pentium MMX 200 ⇒ Athlon Orion 700 | TB 1000 ⇒ AthlonXP 1700+ ⇒ Opteron 165 ⇒ Dual Opteron 856

Reply 12 of 35, by DEAT

User metadata
Rank Member

Direct3D is easy, choose any of the following: ATI Rage, Number Nine Imagine 128-II, Matrox Millennium. They all claim Direct3D support, but are missing critical features (no Z-buffering for the ATI Rage, no texturing for the Imagine 128-II and Millennium). The ATI Rage at least has about 30 games, out of the ~200 or so I tested from 1996-1998, that function to some extent and are borderline playable, while the other two can run 25 and 19 games respectively and have a unique exception: Independence Day and Moto Racer 2 fall back to a software HAL emulation that I cannot replicate on 2D accelerators or any other 3D accelerators. I've been meaning to post a thread about these cards, but other projects are getting in the way. S3 Virge 325/Alliance AT3D/Trident 3dImage975/Cirrus GD5464 are not even in the same league.

3DFX has one of the most disappointing OpenGL ICDs that I've ever seen for anything up to Voodoo 3. You'd think ex-SGI devs would know what they're doing for OpenGL 1.1 compliance, but the amount of corners they cut is absurd when you go outside the boundaries of the same four benchmarks of Quake 1/2/3/Unreal that everyone does and actually do proper unit testing (protip: FOSS games with Win98 builds are excellent for this). The Voodoo 1/Rush/2 "ICD" is a complete joke and always requires copying the ICD as opengl32.dll to game folders to get it to even do something, if it doesn't shit the bed. I don't own a Voodoo 4/5 so I can't comment on those.

The SiS6326 ICDs (the "Java" beta and AOpen driver) are unique for two reasons - The Java beta reports itself as OpenGL 1.0 according to Minetest, while the AOpen driver reports itself as being OpenGL 1270-compliant, I never knew there were that many revisions of the OpenGL spec! Compatibility is marginally worse compared to Voodoo Banshee/3 with the AOpen driver, I haven't fully tested the Java ICD in detail.
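
(For anyone curious where a number like that comes from: it's just whatever the driver returns for its version string. A minimal dump, assuming a GL context has already been created - the calls are plain GL 1.1:)

    #include <stdio.h>
    #include <GL/gl.h>

    /* Print what the ICD claims about itself; needs a current GL context. */
    void dump_gl_strings(void)
    {
        printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
        /* Apps usually sscanf a "major.minor" out of GL_VERSION; a driver that
           stuffs garbage in there is how you end up "OpenGL 1270-compliant". */
    }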

3DLabs Wildcat cards are the only OpenGL cards with >1.1 compliance that have overall terrible support when it comes to games - it falls outside the scope of being primitive, but it's an amusing thing to note, as the Matrox G200 ICD runs rings around them.

Reply 13 of 35, by auron

User metadata
Rank Oldbie
DEAT wrote on 2024-11-29, 04:33:

3DFX has one of the most disappointing OpenGL ICDs that I've ever seen for anything up to Voodoo 3. You'd think ex-SGI devs would know what they're doing for OpenGL 1.1 compliance, but the amount of corners they cut is absurd when you go outside the boundaries of the same four benchmarks of Quake 1/2/3/Unreal that everyone does and actually do proper unit testing (protip: FOSS games with Win98 builds are excellent for this). The Voodoo 1/Rush/2 "ICD" is a complete joke and always requires copying the ICD as opengl32.dll to game folders to get it to even do something, if it doesn't shit the bed. I don't own a Voodoo 4/5 so I can't comment on those.

according to brian hook, the situation with V1/V2 is because these are add-on boards and shipping with an ICD would "hose the system's primary display device's ICD", as he put it. when your main card might also come with its own ICD, copying over the file to use the voodoo seems like a sane solution. and Q3 lets you switch in the settings menu anyway.
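
for what it's worth, that "copy it over" step is literally one file copy per game directory. a quick win32 sketch (the source and destination paths here are made up - point them at wherever the driver actually put 3dfxVGL.dll and at the real game folder):

    #include <stdio.h>
    #include <windows.h>

    /* Copy the standalone Voodoo ICD into a game directory as opengl32.dll so
       the game loads it instead of the system library. Paths are examples only. */
    int main(void)
    {
        const char *icd  = "C:\\WINDOWS\\SYSTEM\\3dfxVGL.dll";  /* assumed install location */
        const char *dest = "C:\\GAMES\\QUAKE3\\opengl32.dll";   /* assumed game folder */

        if (!CopyFileA(icd, dest, FALSE))  /* FALSE = overwrite any existing copy */
            printf("copy failed, error %lu\n", GetLastError());
        return 0;
    }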

then supposedly, their ICD is a wrapper to glide3, so maybe they had to cut those corners to get overhead to a reasonable level. i don't know of any other game before Q3 that used OGL but had no miniGL support, and CAD type stuff was out of the question for V1/V2 because they didn't support rendering in a window. the miniGL reduced overhead so that's what you wanted anyway for games, which is easy to forget now when everyone uses these cards with 1ghz+ setups.

finally, the FOSS games you refer to are probably from the mid-2000s. with something ~5 years later like that you might have better luck with using the mesafx ICD. with games of its time, i've never noticed any issues with the V3-V5 ICD, besides obviously higher CPU overhead. IMO, what's much more worthy of calling out are the completely broken 9x drivers that nvidia put out across their product range - for instance, i once went through every major gf2 gts driver and none of them is able to render ut2004 to reference, despite that being essentially a d3d7-class game. or the benchmark cheating stunts that ati pulled with the 8500.

Reply 14 of 35, by leileilol

User metadata
Rank l33t++

The only terribly broken thing I've noticed about 3dfx's ICD is in the handling of clipping areas. Some versions of the ICD had busted clamping. There's also the required behavior of forcing the window to the top so it doesn't render partially.

FOSS games tend to ignore the v1/v2 icd because SDL doesn't care!!!! (SDL actually has the functionality, but no one ever uses it.) A custom SDL build that goes to 3dfxVGL.dll and forces the window to 0,0 would probably solve a bunch of issues (though not the issues of the lack of optimization and tech art mistakes in contributor free-for-all matters of videogame - the ABA games are probably the safest bets to try as far as the hardware this thread is concerned.)
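
A rough sketch of what that would look like with stock SDL 1.2, assuming the game cooperates - SDL_GL_LoadLibrary and the SDL_VIDEO_WINDOW_POS variable are standard SDL 1.2, the DLL name is the one mentioned above, and every GL entry point then has to come from SDL_GL_GetProcAddress (which is the part nobody bothers with):

    #include <stdio.h>
    #include <SDL/SDL.h>

    int main(int argc, char *argv[])
    {
        /* Pin the window to the top-left corner (the "0,0" trick). */
        SDL_putenv("SDL_VIDEO_WINDOW_POS=0,0");

        SDL_Init(SDL_INIT_VIDEO);

        /* Point SDL at the standalone Voodoo ICD instead of the system opengl32.dll.
           Must happen before SDL_SetVideoMode; GL functions must then be resolved
           through SDL_GL_GetProcAddress rather than linked directly. */
        if (SDL_GL_LoadLibrary("3dfxVGL.dll") < 0) {
            fprintf(stderr, "GL load failed: %s\n", SDL_GetError());
            return 1;
        }

        if (SDL_SetVideoMode(640, 480, 16, SDL_OPENGL | SDL_FULLSCREEN) == NULL) {
            fprintf(stderr, "video mode failed: %s\n", SDL_GetError());
            return 1;
        }

        /* ...fetch entry points with SDL_GL_GetProcAddress("glClear") etc... */

        SDL_Quit();
        return 0;
    }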

auron wrote on 2024-11-29, 21:39:

i don't know of any other game before Q3 that used OGL but had no miniGL support

Return Fire II, Starsiege Tribes, WWII Fighters, Descent 3

long live PCem

Reply 15 of 35, by noshutdown

User metadata
Rank Oldbie
DEAT wrote on 2024-11-29, 04:33:

Direct3D is easy, choose any of the following: ATI Rage, Number Nine Imagine 128-II, Matrox Millennium. They all claim Direct3D support, but are missing critical features (no Z-buffering for the ATI Rage, no texturing for the Imagine 128-II and Millennium). The ATI Rage at least has about 30 games, out of the ~200 or so I tested from 1996-1998, that function to some extent and are borderline playable, while the other two can run 25 and 19 games respectively and have a unique exception: Independence Day and Moto Racer 2 fall back to a software HAL emulation that I cannot replicate on 2D accelerators or any other 3D accelerators. I've been meaning to post a thread about these cards, but other projects are getting in the way. S3 Virge 325/Alliance AT3D/Trident 3dImage975/Cirrus GD5464 are not even in the same league.

does "S3 Virge 325/Alliance AT3D/Trident 3dImage975/Cirrus GD5464 are not even in the same league" mean that they are better or even worse?

The SiS6326 ICDs (the "Java" beta and AOpen driver) are unique for two reasons - The Java beta reports itself as OpenGL 1.0 according to Minetest, while the AOpen driver reports itself as being OpenGL 1270-compliant, I never knew there were that many revisions of the OpenGL spec! Compatibility is marginally worse compared to Voodoo Banshee/3 with the AOpen driver, I haven't fully tested the Java ICD in detail.

I have tried the "Java" beta driver and I think it's decent. It couldn't run Quake 3, but Quake 2 and GLExcess are OK. It could even run 3DMark01 in 640x480x16-bit mode; the image is less than ideal but not messed up.

Reply 16 of 35, by subhuman@xgtx

User metadata
Rank Oldbie
leileilol wrote on 2024-11-15, 15:03:

Probably the original S3 Virge 325

- perspective texturing is super slow
- no blending functions
- no alpha modulation
- very strange depth precision issues
- no stencil/depth ops (that i know of)
- no alphatests
- tries to dither alpha'd out texels 🤣

Ahh... but you can run Destruction Daarby for the Stealth 2000 (@15fps) and Terminal Velocity S3D without compatibility issues, effectively decelerating your system down to a halt!



Reply 17 of 35, by DEAT

User metadata
Rank Member
leileilol wrote on 2024-11-30, 02:44:
auron wrote on 2024-11-29, 21:39:

i don't know of any other game before Q3 that used OGL but had no miniGL support

Return Fire II, Starsiege Tribes, WWII Fighters, Descent 3

In addition to those above, B.I.O Freaks, Bugs Bunny: Lost in Time, Global Defender, Homeworld and Spec Ops: Rangers Lead the Way are all examples of games that either predate or were close enough to Q3 release. I'm about 90% certain that B.I.O Freaks' OpenGL renderer is completely broken. There's a pre-GPL build of BZFlag from 1998 that I've confirmed the existence of:
http://discmaster.textfiles.com/browse/19242/ … bzflag_demo.zip

There's the possibility that GLTron may be another example, but the earliest Windows build I've found is 0.53 from January 2000 while an earlier BeOS version exists from mid-1999.

auron wrote on 2024-11-29, 21:39:

the miniGL reduced overhead so that's what you wanted anyway for games, which is easy to forget now when everyone uses these cards with 1ghz+ setups.

finally, the FOSS games you refer to are probably from the mid-2000s.

leileilol wrote on 2024-11-30, 02:44:

(though not the issues of the lack of optimization and tech art mistakes in contributor free-for-all matters of videogame

These two specific comments made me go back to revisit my notes that I last worked on back in June - I did all of my testing with a 1.4Ghz Tualatin more because I was interested in compatibility issues rather than performance. The whole reason why I went down this rabbit hole in the first place was because of a comment that leilei posted elsewhere about how Quake 3 forced vendors into OpenGL 1.1-compliance, and that got my noggin firing because I figured that FOSS games would be a great example of really testing OpenGL 1.1 drivers as almost all of them have the ability to disable GL extensions.
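
(The extension check those games do is nothing fancy - just a token search of GL_EXTENSIONS, which is also why a "disable extensions" toggle usually amounts to skipping this test. A sketch, with the helper name being my own:)

    #include <string.h>
    #include <GL/gl.h>

    /* Return non-zero if 'name' appears as a whole token in GL_EXTENSIONS.
       Requires a current GL context. */
    static int has_gl_extension(const char *name)
    {
        const char *all = (const char *)glGetString(GL_EXTENSIONS);
        const char *p = all;
        size_t len = strlen(name);

        while (p && (p = strstr(p, name)) != NULL) {
            int starts_ok = (p == all) || (p[-1] == ' ');
            int ends_ok   = (p[len] == ' ') || (p[len] == '\0');
            if (starts_ok && ends_ok)
                return 1;
            p += len;
        }
        return 0;
    }

    /* e.g.: if (!has_gl_extension("GL_EXT_texture_env_combine")) fall back to multipass */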

Only TA: Spring 0.75 specifically enforces gl_ext_texture_env_combine support, and curiously the only pre-OpenGL 1.3 ICD that supports it is the Kyro II. Trigger Rally 0.6.1 implicitly requires OpenGL 1.2 support, Nexuiz 2.5.1 needs a Radeon 8500/Geforce FX or better to view models (2.4 is fairly wonky, but 2.3.2 seems better overall - haven't tested earlier versions), and certain other games (UFO: Alien Invasion 2.2.1, TeeWorlds/TeeWars 0.2.3, Vega Strike 0.5) only work properly on one or two OpenGL 1.1 cards. There are a few games that require a Core 2 Duo/Athlon 64 and an R300/FX card to get decent performance; Xonotic 0.1.0 (!!!) is the most demanding while amusingly being able to run on most OpenGL 1.1 cards with various compatibility caveats, performance being ignored. To give credit, there are some good ways of being able to actually stress the performance of rocket Win98 builds.

There's only one known actively-developed FOSS game that still supports Win98, Armagetron Advanced.

Going back to the optimisation comments in general - there are several OpenGL games that work perfectly fine on 500MHz K6-2 or 533MHz Coppermine CPUs, even going down as far as PMMX233s and even P133s! Rather than write some more, I'll just throw a few raw videos here captured via OSSC, with timestamps in the Youtube descriptions:

533Mhz Coppermine (133FSB * 4.0) + AGP 128-bit Geforce 2 MX
https://www.youtube.com/watch?v=6NAMUDeFF0Y

same as above, but with a PCI Matrox G450
https://www.youtube.com/watch?v=DwE8OM3L22Y

233Mhz Pentium MMX + PCI Geforce FX 5200
https://www.youtube.com/watch?v=pSRMR39vDkY

133Mhz Pentium + PCI Geforce FX 5200 - a few games need music/sound disabled to get good performance as they use OGGs
https://www.youtube.com/watch?v=DO13O7fg1dM

I wanted to do a Super Socket 7 video, but man... I've heard about ALi Aladdin V AGP issues which I could never reproduce with Direct3D games, but OpenGL games are a complete can of worms. I wish my MVP3 mobo wasn't in a vegetative state of a POST code loop.

auron wrote on 2024-11-29, 21:39:

according to brian hook, the situation with V1/V2 is because these are add-on boards and shipping with an ICD would "hose the system's primary display device's ICD", as he put it. when your main card might also come with its own ICD, copying over the file to use the voodoo seems like a sane solution. and Q3 lets you switch in the settings menu anyway.

This makes sense - it does feel like to me that the V1/Rush/V2 ICDs are little more than renamed miniGL drivers.

with something ~5 years later like that you might have better luck with using the mesafx ICD. with games of its time, i've never noticed any issues with the V3-V5 ICD, besides obviously higher CPU overhead.

The biggest issue I've seen with the Banshee (my V3 is broken atm; I last did my tests about six months ago, when it was already suffering from heat issues, but my understanding is that the two are effectively the same architecture-wise) is that several games require the desktop resolution to match the in-game resolution; otherwise everything either renders from the bottom-left and fails to fill the screen (or overdraws), or renders from the top-left with a vertical offset that also cuts off the bottom third of the screen. The 256x256 texture size limitation particularly hurts 2D games (though StepMania is aware of this and compensates), plus there's other oddball issues. Here's another video:
https://www.youtube.com/watch?v=0zIiXlIOsvw
and this one I recorded separately of xmoto 0.2.7, because I didn't realise that the data had somehow corrupted while recording the first video and I did a reinstall:
https://www.youtube.com/watch?v=1ZDiNQfn7ig

I haven't tried the mesafx ICD yet, so I'll have to look into that further.

auron wrote on 2024-11-29, 21:39:

IMO, what's much more worthy of calling out are the completely broken 9x drivers that nvidia put out across their product range - for instance, i once went through every major gf2 gts driver and none of them is able to render ut2004 to reference, despite that being essentially a d3d7-class game.

DX8-class NVidia drivers have completely useless ICDs except for 30.42. The only real difference I've seen between 7.76 and 30.42 for the GF2mx is that UFO: Alien Invasion displays textures properly in 30.42.

leileilol wrote on 2024-11-30, 02:44:

the ABA games are probably the safest bets to try as far as the hardware this thread is concerned.)

For the most part, that's true - the i740 immediately crashes with every single one except for Titanion while windowed, and the G400 (but not the G200!) is weirdly unique in that most of them will immediately crash on non-SSE CPUs. As for specific games:

  • A7Xpg - Permedia 2, Riva 128 and V2100 have overall poor rendering quality
  • Gunroar - Riva 128 has overall poor render quality while the SiS Xabre has poor text rendering, glitchy logo rendering and the gamma is too dark at the start
  • Mu-cade - Permedia 2 renders the logo incorrectly, V2100 renders the world with incorrect colours, Savage IX renders visual effects with more intensity than usual
  • Noiz2sa - Gamma is too low on Rage Pro/XL, Savage IX and Voodoo Rush - this is one of the few games in my entire list of games that actually works on the Rush
  • PARSEC47 - V2100 renders objects with incorrect colours - Voodoo Rush can run the attract mode, but crashes when trying to get in-game
  • rRootage - Permedia 2 and Riva 128 have transparency issues - Voodoo Rush can run the attract mode, but crashes when trying to get in-game
  • Titanion - i740 and Rage 128 crashes with fullscreen, but is fine windowed
  • Torus Trooper - V2100 renders the world with incorrect colours (slightly? need to double check this one specifically)
  • Tumiki Fighters - SiS 315 and Xabre both render the front-facing side of objects as solid black - Voodoo Rush can run the attract mode, but crashes when trying to get in-game

With all that said, the ABA games are the only case where I'm certain the SiS 6326 is actually providing a performance benefit compared to the MS software ICD, even if the lack of bilinear filtering is obvious in Torus Trooper.

Speaking of, the SiS 6326 is definitely the worst in compatibility and performance now that I've revisited it. The Permedia 2 is the slowest overall out of all usable ICDs, while the Rage Pro, Trident Blade and SiS 305 are fairly sluggish in general, and the Rage Pro/Trident Blade both have extremely poor performance in StepMania and SuperTux. I couldn't be bothered checking the i740 again for comparative performance. As far as one-pixel-pipeline/one-TMU cards are concerned, the Banshee and G200 are the best in overall performance, though the G200 has far fewer compatibility issues - it does get hurt pretty badly by World of Padman's heavy texture requirements, though.

noshutdown wrote on 2024-12-05, 10:51:

does "S3 Virge 325/Alliance AT3D/Trident 3dImage975/Cirrus GD5464 are not even in the same league" mean that they are better or even worse?

I figured the "missing critical functionalities" part should have made my post regarding Direct3D pretty obvious when it comes to "worst primitive 3d card for d3d", but it looks like I was wrong.

The ATI 3D Rage is particularly unfortunate in that with the press release announcement they had explicitly mentioned Direct3D support, but Microsoft shafted them hard there.

Reply 18 of 35, by leileilol

User metadata
Rank l33t++

as i've made OpenArena (one of the games involved) and know the mistakes in it, I know where all the pitfalls are and learned lessons the hard way:

- Many models use way too many textures which contribute to texture switching penalties (can get really bad for GLES). The default model used a single texture so that'd explain the wide preference to forcemodel.
- Detail-texture crazy on a renderer that has no proper way of dealing with them. Multitexturing wasn't used for these, as the blending mode differs from the lightmap/texture. It doesn't do what Elite Force did (which was a small 32x32 fuzz multiplied on the lightmap, with the texture then multiplied onto that.)
- Way too many flares that read the depth every frame make some maps slow, even on modern hardware, as video memory reads are an evergreen problem. Also, some cards don't like depth reading (like the Kyro), so the flares don't draw at all. Current-generation hardware also compromises the depth, so the flares flicker now!
- Because of some overreliance on shader stages for a look, vertex lighting mode can look very crappy
- The game was made mostly on a Radeon X850, which had a godly fillrate for the time, so there wasn't a lot of foresight about earlier hardware (or the vocal bsd/linux players stuck on llvmpipe). Nowadays I have a desire to have it working better on ~98 3D hardware and to hack in some additional creative fallbacks (which itself is an ideological conflict with the floss scene that wants sdl3 pbr modern rendering in vulkan)
- Many textures are big and don't come compressed, so they're unoptimized for the Geforce2 to begin with, and the nVidia ICD has had a history of a bad on-demand compressor, which has been the subject of a few "it's a Shit! VooDoo5 superior" arguments here. As the S3TC patent has expired, it should be safer to have DDSes to load now without bad faith from pissing in GNU/cereal with a microsoft image container (I hope) - see the sketch after this list.
- Some nVidia ICDs are slower with the faster path (r_primitives 2), which was fixed around detonator 12.41
- Not related to the pitfalls but more about Win9x support, the current tree can be compiled for Windows 95 with GCC 4.7.2 and using a particular wspiapi.h from ReactOS. Still requires Winsock2 DLLs to be present however
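
For the DDS point above, the difference in GL terms is roughly the sketch below - either hand the driver raw RGBA with a compressed internal format so its (historically bad) on-demand compressor runs, or upload already-compressed DXT5 blocks straight out of a DDS. Just a sketch under those assumptions; the DDS parsing is left out and the have_dds flag and helper name are made up, but the enums and glCompressedTexImage2DARB are the standard ARB/EXT tokens:

    #include <windows.h>   /* needed before GL/gl.h on Win32; also declares wglGetProcAddress */
    #include <GL/gl.h>

    #ifndef GL_COMPRESSED_RGBA_ARB
    #define GL_COMPRESSED_RGBA_ARB 0x84EE            /* GL_ARB_texture_compression */
    #endif
    #ifndef GL_COMPRESSED_RGBA_S3TC_DXT5_EXT
    #define GL_COMPRESSED_RGBA_S3TC_DXT5_EXT 0x83F3  /* GL_EXT_texture_compression_s3tc */
    #endif

    /* Entry point is not in the GL 1.1 headers; resolve it at runtime. */
    typedef void (APIENTRY *PFNCOMPRESSEDTEXIMAGE2D)(GLenum target, GLint level,
        GLenum internalformat, GLsizei width, GLsizei height, GLint border,
        GLsizei imageSize, const void *data);

    static void upload_texture(int w, int h, int have_dds,
                               const void *raw_rgba,   /* w*h*4 bytes, uncompressed */
                               const void *dxt5_data,  /* pre-compressed blocks from a DDS */
                               int dxt5_size)
    {
        if (!have_dds) {
            /* Path A: hand the driver raw RGBA with a compressed internal format
               and let its on-demand compressor run - the historically bad case. */
            glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_ARB,
                         w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, raw_rgba);
        } else {
            /* Path B: upload pre-compressed DXT5 blocks as-is. */
            PFNCOMPRESSEDTEXIMAGE2D compressedTexImage2D =
                (PFNCOMPRESSEDTEXIMAGE2D)wglGetProcAddress("glCompressedTexImage2DARB");
            if (compressedTexImage2D)
                compressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                                     w, h, 0, dxt5_size, dxt5_data);
        }
    }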

Last edited by leileilol on 2024-12-16, 05:18. Edited 4 times in total.

long live PCem

Reply 19 of 35, by Linoleum

User metadata
Rank Member
noshutdown wrote on 2024-11-15, 14:59:
Postman5 wrote on 2024-11-15, 14:20:

Hi,
the Savage4 LT is also a contender.
The Unreal menu is not displayed correctly (DirectX mode).
The game itself looks good; other games, for example Quake 2, work without problems.
Number Nine SR9 SD
Win98, driver ChroMetal45_9xme

It's entirely the drivers' fault that the Savage4 didn't do well; S3 was never able to develop proper 3D drivers, from the start to the end. Other than that it's a decent card, at least much better than the previous Savage3D, which in turn is far better than the ViRGE, and even the ViRGE is probably still not at the bottom.

Yeah, I did a Savage4 build where 99% of the challenge was getting the right driver with the right miniGL and the right Metal patch. Once you get it, it's a pretty good card (if you pretend the Voodoo3 and TNT2 don't exist).

P3 866, V3, SB Audigy 2
P2 300, TNT, V2, Audigy 2 ZS
P233 MMX, Mystique 220, V1, AWE64
P100, S3 Virge GX, AWE64, WavetablePi & PicoGus
Prolinea 4/50, ET4000, SB 16, WavetablePi
486DX2 66, CL-GD5424, SB 32, SC55
SC386SX 25, TVGA8900, Audician32+