Reply 160 of 185, by kool kitty89
wrote:
One has to wonder whether the ViRGE would have been more successful in the eyes of gamers if S3 had put effort into porting games to S3D. The Verite V1000 is almost as useless for D3D and OpenGL as the ViRGE-architecture chips, after all, but Rendition put a lot of effort into getting popular games supported.
On top of that, there's also the issue of buggy Direct3D and nonexistent OpenGL drivers, especially early on. (it seems there were some semi-decent, or at least more compatible, D3D drivers late in the ViRGE/DX/GX's life)
From what I can gather, ATi's Rage drivers of the time weren't much better than S3's, but ATi certainly got many more Rage-specific ports than S3 did. Plus, they got a relatively fast card out (the Rage Pro) while the ViRGE line stagnated around the performance range of the Rage II.
On the note of S3D-specific ports, it also would have been good to promote the ViRGE's strengths, like its 32-bit color depth (which was only moderately slower and looked much better: smooth, undithered shading/blending), as well as to push for decent detail-setting options for user flexibility. (the ViRGE runs fairly decently with full features and truecolor at low screen resolutions, like 320x240 or 400x300, and also runs pretty well at higher resolutions with texture filtering disabled)
With full features enabled at 32-bit color, the ViRGE potentially has better visual quality than the Voodoo (or Verite, let alone the Rage), though to keep up speed-wise you'd have to drop the resolution, so it's an obvious trade-off. Alternatively, for games catering fairly well to (if not specifically built around) unfiltered textures, opting for lower per-pixel quality at a higher resolution might be the more competitive option against the Voodoo, and still with considerably better visual quality than the Mystique thanks to proper translucency/alpha blending support.
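For context on why the 32-bit blending looks smoother: blending at 8 bits per channel keeps the full result, while a 16-bit (RGB565) pipeline has to requantize after every blend, which is where the banding and dither patterns come from. A minimal sketch of the arithmetic (generic, not S3-specific; the values are just illustrative):

#include <stdint.h>
#include <stdio.h>

/* Classic alpha blend of one channel: result = src*a + dst*(255 - a),
 * with alpha in 0..255. At 8 bits per channel the result keeps its
 * precision; a 16-bit target must snap it to a coarser step. */
static uint8_t blend8(uint8_t src, uint8_t dst, uint8_t a)
{
    return (uint8_t)((src * a + dst * (255 - a)) / 255);
}

/* Requantize an 8-bit channel to 5 bits and back, as an RGB565
 * framebuffer would: this is the step that introduces banding. */
static uint8_t quant5(uint8_t c)
{
    uint8_t c5 = c >> 3;                      /* 8 -> 5 bits */
    return (uint8_t)((c5 << 3) | (c5 >> 2));  /* expand back to 8 */
}

int main(void)
{
    /* 50% blend of two nearby shades: the 32-bit path keeps the true
     * intermediate; the 16-bit path lands on a coarser value. */
    uint8_t b32 = blend8(200, 180, 128);
    uint8_t b16 = quant5(blend8(quant5(200), quant5(180), 128));
    printf("32-bit path: %u, 16-bit path: %u\n", b32, b16);
    return 0;
}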
Oh, plus there's the (non-rendering-related) issue of the Voodoo 1's tendency toward mediocre analog video output, with sharpness problems and/or blotchy bar/line artifacts. (evident in your recordings too)
Optimization for S3's special 4-bit (non-paletted) texture formats might also have been significant, though support for more typical 4-bit paletted textures would have been more useful. (be it using an offset to select 15/16 colors from the on-chip CLUT, or pointing to external 15/16-color tables that could be loaded quickly per texture; the paletted case amounts to something like the sketch below)
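Roughly what a conventional 4-bit paletted lookup comes down to; a hedged sketch, since the CLUT layout, nibble order, and offset mechanism here are illustrative assumptions, not S3's actual hardware interface:

#include <stdint.h>

/* Hypothetical 4-bit paletted texel fetch: each byte packs two texels,
 * and each 4-bit index selects an entry from a 16-color table placed
 * at some offset within a larger on-chip CLUT (as suggested above).
 * Palette entries here are assumed to be RGB565. */
static uint16_t fetch_pal4(const uint8_t *texels, const uint16_t *clut,
                           unsigned clut_offset, unsigned i)
{
    uint8_t pair = texels[i >> 1];
    uint8_t idx  = (i & 1) ? (pair >> 4) : (pair & 0x0F);
    return clut[clut_offset + idx];
}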
It's rather odd that they went with that proprietary interpolated 4-bit color-scale format (two indexed color values with the 15 shades/hues between them interpolated) rather than just implementing a paletted 4-bit texture system, or perhaps adding the special interpolated format in addition to 16-color palette support. On its own, the interpolated-shade format is only useful in specialized situations, like certain types of terrain, water, or sky textures. (my reading of how that format would decode is sketched below)
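As best I can read the description, the color-scale decode works out to something like the following. This is a hedged sketch: the endpoint storage format and rounding of the real hardware are assumptions here, with the 4-bit index simply selecting an evenly spaced blend step between the two endpoint colors. (conceptually it's a forerunner of the endpoint-interpolation idea S3 later used in S3TC)

#include <stdint.h>

/* Hypothetical interpolated 4-bit "color scale" decode: two endpoint
 * colors per texture, with index 0 mapping to endpoint A, index 15 to
 * endpoint B, and the shades in between linearly interpolated.
 * Endpoint precision and rounding behavior are assumed. */
typedef struct { uint8_t r, g, b; } rgb_t;

static rgb_t decode_scale4(rgb_t a, rgb_t b, uint8_t idx /* 0..15 */)
{
    rgb_t out;
    out.r = (uint8_t)((a.r * (15 - idx) + b.r * idx) / 15);
    out.g = (uint8_t)((a.g * (15 - idx) + b.g * idx) / 15);
    out.b = (uint8_t)((a.b * (15 - idx) + b.b * idx) / 15);
    return out;
}

Which also makes plain why it only really suits textures that live along a single color ramp: every texel is constrained to the line between the two endpoints, fine for a sky or water gradient, useless for anything needing distinct hues.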