VOGONS


3D Accelerator Video Captures

Reply 40 of 185, by kool kitty89

Rank: Member
swaaye wrote:
F2bnp wrote:

Excellent videos! We need more old graphics hardware captured on video!
How about some of the early (or late) ATi Rage cards?

I have an ATI Rage 128 Pro Ultra card. That's the second-generation Rage 128 chip.

I have a Rage Pro PCI card I could test at some point, and Rage II PCI cards are very common/cheap around here (probably on eBay too, but Weird Stuff has a ton of them -probably the most common PCI accelerator after ViRGE-based stuff), so I might pick one of those up eventually to try out. (if I get around to it -don't expect any results particularly soon though 😉)

From personal experience playing games on a K6-2 300 and 500 with the Rage Pro, it tended to work quite well for late-90s games (Rogue Squadron, X-Wing Alliance, and Episode 1 Racer all ran decent to good, iirc).

kool kitty89 wrote:

What model ViRGE chipset was used for that? The S3 games look quite good, at least compared to the poor reputation that card has. (a bit framey, but with full effects/filtering/perspective correction on and what looks like 640x480 res . . . so a playable framerate with visual quality miles better than the software renderer or Playstation -or Saturn- versions)

swaaye wrote:

Yeah the image quality is very good and the framerate is adequate for the time. The card is a Diamond Stealth 3D 2000 with 4MB EDO DRAM. I also have an STB Nitro 3D (Virge GX with 4MB EDO) that I plan to try out with these S3D games.

How many games allowed custom detail adjustment for ViRGE? With detail turned down to near-Matrox levels, it should have been much faster. (this also applied to RAGE chips, since fillrate dramatically increased with filtering disabled) And aside from speed, the option to disable dithered shading/blending would be nice too. (some people prefer the posterized look over dithered graininess -a different case than the Matrox dithering, mind you)

Even without filtering, you'd still have nicer shading/color than software rendering, and obviously the difference would be far more substantial with slower CPUs. (the likes of the ViRGE should be much more fillrate limited and less CPU-speed limited, so slower CPUs that struggle with software rendering could see a dramatic improvement in speed even with full effects enabled, whereas a fast CPU could be faster in software)

Is the Direct3D support for ViRGE as poor as many reviews/articles imply, or is that exaggerated too? (I see similar comments about the RAGE chips, but I never experienced those problems myself -then again, I had my dad's help with tweaking the system)

Reply 41 of 185, by swaaye

Rank: l33t++
kool kitty89 wrote:

How many games allowed custom detail adjustment for ViRGE?

Is the Direct3D support for ViRGE as poor as many reviews/articles imply, or is that exaggerated too?

Terminal Velocity has limited detail settings and only one resolution. Tomb Raider has 512x384 available and a few settings that might get it quite smooth. I haven't tried Descent II yet.

D3D is not very usable with Virge 325. Some D3Dv3 games might be ok, but usually it is very slow and buggy. Maybe if you take the resolution way down to 320x240 or so, if possible...

Virge GX/DX on the other hand are decent with a few D3D5 games. Jedi Knight runs rather well, for example. I think it's similar to Verite V1000 in speed. But it is also again very buggy with most games. S3 never could write drivers.

Reply 42 of 185, by kool kitty89

Rank: Member
swaaye wrote:

D3D is not very usable with Virge 325. Some D3Dv3 games might be ok, but usually it is very slow and buggy. Maybe if you take the resolution way down to 320x240 or so, if possible...

At 320x240, you'd still be as good (resolution wise) as most console games of the time at least and with visual quality similar to the N64. (or at least better than the Playstation if you disabled filtering -since you'd still have perspective correction)

So if you just wanted a decent low-cost general-use PC with competitive graphics compared to console standards of the time, the Virge wouldn't be too bad . . . aside from actual bugs.

swaaye wrote:

Virge GX/DX on the other hand are decent with a few D3D5 games. Jedi Knight runs rather well, for example. I think it's similar to Verite V1000 in speed. But it is also again very buggy with most games. S3 never could write drivers.

And apparently they weren't willing to outsource driver development to more capable programmers. (or provide documentation to allow quality 3rd-party drivers -which could have included OpenGL and/or proprietary APIs like BRender)

I've seen some scathing reviews of Rage cards with reference to poor performing and/or buggy drivers, but I don't remember my family having any such problems with the Rage Pro. (then again, at the time, my dad tweaked and patched that quite a bit, including getting the unofficial beta DVD drivers working -only the AGP version officially supported DVD playback)

If nothing else, the Rage series definitely supported a much broader range of APIs than ViRGE. (OpenGL support was a bit late but ended up pretty decent from what I remember -Unreal and Quake 2 ran fine, Direct3D worked decently well, BRender was supported, and apparently there was RenderWare support too -though I don't think we ever used games with that)

For general testing, Tomb Raider II seems pretty flexible. The DirectX renderer options allow a wide range of detail settings, with resolutions from 320x200 to 1440x900, 16- and 32-bit color, the ability to disable 16-bit (non-indexed) textures, and toggles for z-buffering, dithering, perspective correction, bilinear filtering, and a bit more. It also supports fullscreen and windowed modes. (a shame more games didn't offer that sort of flexibility)

The free demo version also supports all of those features too, so if you don't own the game, that should be easy to find. (and faster and easier than finding a download of the full version -and legal 😉)

And another note on the previous dithering-related comment:
Dithering in the Rage/ViRGE/many other and later cards is used for smoother shading/blending when interpolating the 24-bit RGB used internally by the GPU down to 16-bit RGB in the framebuffer (as opposed to simply truncating/rounding off to 16 bits). That takes added hardware and can even be slower than plain 16-bit rendering/truncation (depending on the GPU design), and it's intentionally added to facilitate smoother blending/shading than is possible with 16-bit RGB and avoid ugly banding/posterization. (the N64 and Playstation also have this feature)
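
In rough C terms, the difference is something like this -a generic ordered-dither sketch of the 24-to-16-bit pack, not any particular chip's actual logic:

    #include <stdint.h>

    /* 4x4 Bayer threshold matrix */
    static const int bayer4[4][4] = {
        {  0,  8,  2, 10 },
        { 12,  4, 14,  6 },
        {  3, 11,  1,  9 },
        { 15,  7, 13,  5 },
    };

    /* plain truncation: drops the low bits outright, causing banding */
    uint16_t pack565_truncate(uint8_t r, uint8_t g, uint8_t b)
    {
        return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    /* ordered dither: a position-dependent threshold nudges each pixel up
       or down before truncation, so neighboring pixels average out to the
       missing low bits (banding traded for grain) */
    uint16_t pack565_dither(uint8_t r, uint8_t g, uint8_t b, int x, int y)
    {
        int t  = bayer4[y & 3][x & 3];
        int r5 = (r + (t >> 1)) >> 3; if (r5 > 31) r5 = 31;
        int g6 = (g + (t >> 2)) >> 2; if (g6 > 63) g6 = 63;
        int b5 = (b + (t >> 1)) >> 3; if (b5 > 31) b5 = 31;
        return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
    }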

Some people find the grainy look (especially at low resolutions) uglier than posterization, so for them the added hardware is not only wasted but detrimental to the visual quality . . . which makes that a very important feature to be able to disable.
IIRC, the Voodoo cards lack this feature altogether, so it's a non-issue (you're stuck with posterization), though there seems to be a separate issue with desaturated or "milky" color with some Voodoo games. (at a blind guess, maybe it uses additive shading/lighting rather than multiplicative -which would definitely make for fast shading and save on chip space)
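
To illustrate that guess per 8-bit channel (purely hypothetical -I haven't verified what the Voodoo hardware actually does):

    #include <stdint.h>

    /* multiplicative: full light leaves the texel unchanged, darker light
       scales it toward black -saturation is preserved */
    static inline uint8_t light_modulate(uint8_t texel, uint8_t light)
    {
        return (uint8_t)((texel * light) / 255);
    }

    /* additive: light pushes every channel toward white, which would read
       as the washed-out/"milky" color described above */
    static inline uint8_t light_additive(uint8_t texel, uint8_t light)
    {
        int v = texel + light;
        return (uint8_t)(v > 255 ? 255 : v);
    }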

This is a totally separate issue from the dithering seen in early Matrox cards, since those actually lack support for alpha blending and rely on simple checkerboard dithering for primitive translucency effects instead. (similar to what's used in some older DOS and console games -and some Sega Saturn games)
I've even seen some reviewers who lump the Rage/etc in with the Matrox for dithering and "not being able to alpha blend" when that's totally false.

Oh, and I forgot to mention earlier that Rage 1 based cards seem to be quite uncommon. I haven't seen any in the wild yet and they don't seem to come up on eBay either. (a fair number of Mach 64GT cards, but all Rage II based) But that would certainly be an interesting one to compare. (performance specs are apparently similar to the ViRGE, but software support may be much better)

Last edited by kool kitty89 on 2012-03-09, 00:21. Edited 1 time in total.

Reply 43 of 185, by leileilol

Rank: l33t++

well the Rage CAN'T alpha blend - the driver makes the CPU do that as fast as it can 😀
it can't modulate alpha either (not even by color), so stuff that's supposed to fade simply won't. Turok looks especially dreadful on it. Quake 3 had to have an alternate pre-faded white smoke for the Rage Pro
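
For reference, "modulating alpha" means scaling a texel's alpha by a vertex or constant factor -roughly, per channel (illustrative sketch only):

    #include <stdint.h>

    /* texel alpha scaled by a per-vertex fade factor; without this, the
       texture draws at constant opacity and never fades out */
    static inline uint8_t modulate_alpha(uint8_t tex_a, uint8_t vert_a)
    {
        return (uint8_t)((tex_a * vert_a) / 255);
    }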



Reply 45 of 185, by kool kitty89

Rank: Member
leileilol wrote:

well the Rage CAN'T alpha blend - the driver makes the CPU do that as fast as it can 😀
it can't modulate alpha either (not even by color), so stuff that's supposed to fade simply won't. Turok looks especially dreadful on it. Quake 3 had to have an alternate pre-faded white smoke for the Rage Pro

Really? . . . that's weird, but since we usually used it with a fairly fast CPU, I suppose that problem could have been hidden. (but that wouldn't explain the dithering -the artifacts are clearly dithered interpolation for shading/blending and not checkerboard meshes, and that sort of interpolation would add significant overhead if handled by the CPU . . . unless all the examples I've seen are just shading and not blending, i.e. lighting/Gouraud shading rather than translucency effects)

The hardware for basic additive blending (simple 50/50 RGB averaging) is really simple to include (the blitters/GPUs in the 3DO, Saturn, Atari Jaguar, and PS1 all support it), so it would be really strange not to include it when they went to the trouble of supporting large texture caches, polygon rasterization, bilinear/trilinear interpolated texture mapping, and even Gouraud shading. I could certainly see lack of support for proper alpha blending with a discrete alpha channel (on texels and/or in the framebuffer -the latter requiring 32-bit RGB anyway), but that's a different issue than simply supporting averaging.

In the case of the Matrox cards, the lack of even 50/50 blending is more believable since they lack so many other features too (especially assuming perspective correction is handled by the CPU), but even then it would be pretty questionable, as the amount of chip space needed should be tiny -much less than multiplicative shading. And if simpler additive shading were used to save space on that end, that very same logic could be directly re-used for translucent blending -which is exactly what the Jaguar and Saturn do . . . though the Jaguar actually "cheats" and uses a custom colorspace to allow multiplicative-quality shading using fast and simple additive shading.

For similar reasons, additive blending wouldn't take much overhead to handle in software either, except that 15/16-bit RGB packs its channels on 5- and 6-bit boundaries and CPUs don't natively cater to that (a lot of bit manipulation overhead, though use of MMX or software SIMD would help). 24/32-bit RGB doesn't have that problem (8-bit boundaries suit CPUs well), but then you have the issue of more RAM/bandwidth needed. (the same reason 8-bit color was favored over 16-bit for many software renderers -and with 8-bit color you're using a palette, so additive blending doesn't work at all . . . you'd be limited to look-up tables, simple dithering, or omitting translucency altogether -and/or using full-screen palette-swapped color effects)
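
For example, the standard software trick for a 50/50 average of two RGB565 pixels shows the kind of masking you can't avoid with packed 5/6-bit fields (a generic sketch, not from any particular renderer):

    #include <stdint.h>

    /* low bit of each of the R, G and B fields in RGB565 */
    #define RGB565_LOWBITS 0x0821u

    /* 50/50 average of two RGB565 pixels: (a & b) + ((a ^ b) >> 1), with
       the low bit of each field masked off before the shift so channels
       don't bleed into their neighbors -exactly the fiddling that
       byte-aligned 24/32-bit formats avoid */
    static inline uint16_t avg565(uint16_t a, uint16_t b)
    {
        return (uint16_t)((a & b) + (((a ^ b) & ~RGB565_LOWBITS) >> 1));
    }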

For many games of the time, simple 50/50 blending was sufficient, so resorting to software blending would have been excessive for any GPU that supported simple additive blending. (and given how simple the hardware is for that feature, any decent 2D accelerator -let alone 3D- of the mid 90s should have supported it . . . basic VGA/SVGA cards and low-end Windows accelerators wouldn't have)

Without an alpha channel (on texture or framebuffer) you'd be limited to enabling translucency on a per-sprite, per-texture, or per-polygon basis (rather than per-texel or per-pixel), so that would be limiting in the same way it was on the Playstation, but would still cover a wide range of effects used in late-90s games.

Reply 46 of 185, by leileilol

Rank: l33t++
kool kitty89 wrote:

The hardware for basic additive blending (simple 50/50 RGB averaging) is really simple to include (the blitters/GPUs in the 3DO, Saturn, Atari Jaguar, and PS1 all support it), so it would be really strange not to include it when they went to the trouble of supporting large texture caches, polygon rasterization, bilinear/trilinear interpolated texture mapping, and even Gouraud shading.

You might like the PowerVR PCX2. It supports alpha blending and modulated alpha blending (though not very precisely), and it DOESN'T SUPPORT ANY OTHER BLENDING other than alpha blending. That's right, a 3D card without additive. On top of that weird limitation, it also supports 24-bit color


Reply 48 of 185, by F2bnp

Rank: l33t

Guys, keep in mind that all of these cards in the videos are running on insanely fast CPUs. The Virge and Mystique were always found in P133 and P166 machines, for example. So yes, they were really slow.
Driver support was also a huge problem. Before I had a Voodoo 1 (that card was a revelation... 😜) I had to use cards like these, and most D3D 5 games -and especially D3D 6 ones (like Shogo and Blood)- would not display any textures. Everything was... white. I've tried them on many different machines, drivers, settings... Said games never worked properly and I never really understood what caused it (bad driver support, probably 😜).

Reply 49 of 185, by kool kitty89

Rank: Member
swaaye wrote:

Speaking of Tomb Raider 2:
S3 Virge 325 (Tomb Raider 2)

The speed is decent (looks about as fast as Tomb Raider I), but the odd black artifacts are definitely annoying.

And an interesting note on the early Tomb Raider games in general: they all use a black shaded fade-out to hide pop-in, so cards lacking support for fogging (or poor at it) avoided that problem. (it also worked decently well in the software renderer, in spite of the 256-color limitations -I think TR1 did it better than II though, maybe due to a more limited texture color selection and thus more available shades in the palette)
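
The usual 256-color approach (a general sketch of the era's technique -not Tomb Raider's actual code) is a precomputed shade table, so the per-pixel cost is a single lookup:

    #include <stdint.h>

    #define SHADE_LEVELS 32

    /* built once at load time: for each light level, scale every palette
       color toward black and store the index of the nearest palette entry */
    uint8_t shade_table[SHADE_LEVELS][256];

    /* per pixel: fade toward black by depth with one table lookup
       (level 0 = black, SHADE_LEVELS-1 = full brightness) */
    static inline uint8_t shade_pixel(uint8_t color_index, int level)
    {
        return shade_table[level][color_index];
    }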

F2bnp wrote:

Guys, keep in mind that all of these cards in the videos are running on insanely fast CPUs. The Virge and Mystique were always found in P133 and P166 machines, for example. So yes, they were really slow.
Driver support was also a huge problem. Before I had a Voodoo 1 (that card was a revelation... 😜) I had to use cards like these, and most D3D 5 games -and especially D3D 6 ones (like Shogo and Blood)- would not display any textures. Everything was... white. I've tried them on many different machines, drivers, settings... Said games never worked properly and I never really understood what caused it (bad driver support, probably 😜).

I'd have thought the opposite would be true:
those weaker cards should more often have been the bottleneck than the CPU (i.e. a faster or slower CPU would make little or no difference), while faster (fillrate/rasterization) GPUs would be more limited by CPU performance (for handling the 3D vertex and lighting calculations).

That is, unless some of those early cards actually required CPU assistance for things like rasterization ("triangle set-up") or perspective correction (in which case the above would only be true with those features disabled), or unless some drivers specifically supported mixed/dynamic CPU/GPU loading depending on CPU performance. (though I rather doubt that)
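
A crude model of what I mean, with made-up numbers purely for illustration (not measurements of any real card):

    #include <stdio.h>

    int main(void)
    {
        /* hypothetical figures: 640x480 with 2.5x overdraw, 40 Mpix/s card */
        double pixels   = 640.0 * 480.0 * 2.5;
        double fill_ms  = pixels / 40e6 * 1000.0;   /* ~19.2 ms of fill  */
        double cpu_ms[] = { 18.0, 12.0 };           /* "P120", "P180"    */

        /* frame time is roughly the larger of CPU time and fill time, so
           once the card is the bottleneck a faster CPU changes nothing */
        for (int i = 0; i < 2; i++) {
            double t = cpu_ms[i] > fill_ms ? cpu_ms[i] : fill_ms;
            printf("CPU %.0f ms -> %.1f ms/frame\n", cpu_ms[i], t);
        }
        return 0;
    }

With those assumed numbers both CPUs land on the same 19.2 ms frame time, because the fill time dominates -which is exactly the pattern the review below shows.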

Edit:
http://www.myrkul.org/virge.html
This old review supports my assumption of the ViRGE being the bottleneck, with Descent II running almost identically on a 180 MHz Pentium and a 120 MHz one (the 180 MHz is actually very slightly slower). Slower CPUs would be needed to show exactly where the bottleneck is (i.e. the minimum CPU speed that achieves maximum ViRGE performance), but it definitely seems to be less than 120 MHz.

Therefore, as I noted previously, the ViRGE (and other slow/early accelerators) would benefit a slower system far more than a fast one. (albeit, if a game supported lower detail modes with the ViRGE, that bottleneck could be widened and faster CPUs would be needed to max out performance -full features enabled would be the slowest case)

leileilol wrote:
kool kitty89 wrote:

The hardware for basic additive blending (simple 50/50 RGB averaging) is really simple to include (the blitters/GPUs in the 3DO, Saturn, Atari Jaguar, and PS1 all support it), so it would be really strange not to include it when they went to the trouble of supporting large texture caches, polygon rasterization, bilinear/trilinear interpolated texture mapping, and even Gouraud shading.

You might like the PowerVR PCX2. It supports alpha blending and modulated alpha blending (though not very precisely), and it DOESN'T SUPPORT ANY OTHER BLENDING other than alpha blending. That's right, a 3D card without additive. On top of that weird limitation, it also supports 24-bit color

Hmm. OK, it seems I misused the term "alpha blending" in the post you originally responded to (hence the confusion of context). I should have simply said translucency, or specifically mentioned additive blending or averaging. Though the term "additive blending" itself may be too vague as well: it could refer to averaging the colors of two pixels or texels, or to adding toward saturation for things like explosions, fire, or lens flare effects. The only difference with averaging is that the color values are halved (by bit shift/truncation) either before or after the addition -before if precision is limited and overflow must be avoided, or after if higher internal precision is supported.
There's also the related function of additive lighting effects. (for colored lighting, or for general-purpose lighting/shading if multiplicative lighting is not supported -or possibly if it's much slower)
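
Per 8-bit channel, the distinction is just this (a sketch):

    #include <stdint.h>

    /* saturating add: brightens toward white (explosions, flares) */
    static inline uint8_t blend_add_sat(uint8_t s, uint8_t d)
    {
        int v = s + d;
        return (uint8_t)(v > 255 ? 255 : v);
    }

    /* 50/50 average, halving BEFORE the add: never overflows 8 bits, but
       loses the low bit of each input -the cheap option for narrow hardware */
    static inline uint8_t blend_avg_pre(uint8_t s, uint8_t d)
    {
        return (uint8_t)((s >> 1) + (d >> 1));
    }

    /* 50/50 average, halving AFTER the add: needs one extra bit of
       internal precision but keeps full accuracy */
    static inline uint8_t blend_avg_post(uint8_t s, uint8_t d)
    {
        return (uint8_t)((s + d) >> 1);
    }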

And there are odd cases like the Sega Saturn's VDP1, which supports additive lighting (for flat and smooth shading) as well as simple 1/2 transparency, but not additive (saturation) blending.
(there's also the separate issue of artifacting on blended warped quads -since the polygons are warped rather than rasterized, overlapping pixels get drawn over each other, making blending uneven -I think the 3DO has the same problem; that's one of the reasons you see dithered mesh effects in Saturn games -the other being limits on how VDP1 and VDP2 graphics can be combined)

It's odd that the PVR supported full alpha blending but not simple additive blending, since full alpha effects would effectively need most/all of the same logic as the simpler additive blending and averaging. The limit to 24-bit color seems more reasonable, since it avoids the need for logic to quickly handle packed 15/16-bit RGB and focuses instead on 24/32-bit rendering. (all on 8-bit boundaries)
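
To put it concretely: the general alpha blend already contains the 50/50 average as a special case, so hardware with the former nearly gets the latter for free (sketch, per 8-bit channel):

    #include <stdint.h>

    /* classic alpha blend; with a = 128 this is (almost exactly) the plain
       50/50 average, so averaging needs no extra datapath beyond this */
    static inline uint8_t alpha_blend(uint8_t src, uint8_t dst, uint8_t a)
    {
        return (uint8_t)((src * a + dst * (255 - a)) / 255);
    }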

So, anyway, do the early (I, II, Pro) Rage cards (or ViRGE for that matter) support simple additive blending (and/or averaging)?

Reply 50 of 185, by swaaye

Rank: l33t++
kool kitty89 wrote:

The speed is decent (looks about as fast as Tomb Raider I), but the odd black artifacts are definitely annoying.

The artifacts occur with both the 98SE in-box drivers and the final S3 pack (1997 dates).

On speed, note that TR2 is running 512x384 whereas TR1 S3D was at 640x480.

Reply 52 of 185, by swaaye

Rank: l33t++
sliderider wrote:

Do you have a Savage 4 Pro you could make some recordings of?

Savage 2000 and Savage 3D will be getting attention but I don't have Savage 4.

I'd like to find a cheap Number Nine Revolution IV....

Reply 53 of 185, by sliderider

Rank: l33t++
swaaye wrote:
sliderider wrote:

Do you have a Savage 4 Pro you could make some recordings of?

Savage 2000 and Savage 3D will be getting attention but I don't have Savage 4.

I'd like to find a cheap Number Nine Revolution IV....

I found one. 😜

And a 32MB Number Nine Savage 4 Pro. They are in the mail and should be here today or tomorrow.

Oh, and be sure to film the Savage 2000 showing all the bugs with the DX 7 driver.

Reply 54 of 185, by swaaye

Rank: l33t++

New addition - S3 Virge 325 (Mechwarrior 2 S3D)
http://www.youtube.com/playlist?list=PL995DE461767BFB71

sliderider wrote:

Oh, and be sure to film the Savage 2000 showing all the bugs with the DX 7 driver.

Oh it's not hard to find problems with Savage 2000. Even UT with S3 Metal has some problems.

Reply 55 of 185, by kool kitty89

Rank: Member

I was looking around at some other old reviews and noticed some more positive/balanced takes on the early Rage cards. One consistent theme is that the value was best for those who actually needed the multimedia features (namely MPEG playback) or the added RAM (on certain models), but there were also positive comments on software/driver support for the Rage II and Pro based cards.

One common misconception seems to be that several cards (including the Rage) were considerably "slower" than the Mystique, but the comparisons were made at default detail settings (or in games with limited/no detail options), which puts more feature-rich cards in a different context. (almost all of them were much faster -especially in fillrate- with added effects/features disabled, especially filtering, mipmapping, and fogging)

I'm not positive on the exact performance levels, but it seems that at similar detail levels the ViRGE 325 wasn't nearly as slow compared to the Matrox as a face-value comparison would suggest, the Rage II was close to as fast (or somewhat faster), and the Rage Pro was definitively faster. (though still slower with full features enabled)

3dfx's chipsets seem to benefit far less from disabling features.

Reply 57 of 185, by leileilol

Rank: l33t++

I added a bunch of PCX2 MiniGL torture videos.

http://www.youtube.com/watch?v=jHWK1lKtkXs
http://www.youtube.com/watch?v=M-Lci38TERY
http://www.youtube.com/watch?v=5j19i4q3WKo

The Techland MiniGL is more 'direct', as in no workarounds, and gets a lot of games to work, including MDK2! Unfortunately, no workarounds also means don't expect blending substitution

I should write a wrapper for a wrapper to fix that 😜


Reply 59 of 185, by swaaye

Rank: l33t++

I've uploaded several videos of Savage 2000. 3DMark99-01, Quake 3 and UT. The Quake 3 video examines S3TL being enabled and disabled. The UT video examines S3TC texturing and the lack of functional detail textures.

See playlist link above.

S3TL can only be enabled unofficially, via the registry, for OpenGL, from what I can determine. It slows Quake 3 down compared to letting T&L be computed by the P3 @ 1050 MHz. Perhaps with a slow CPU there would be a benefit, but unfortunately it has bugs that affect image quality.

The driver used was v9.51.17, with file dates from 2002. I also used S3 MeTaL v2.0.3.0. I will put a Savage 2000 file collection on VOGONSdrivers later.