S3D GLQuake - any real value in it?

Reply 20 of 31, by CBM

Rank: Newbie
BinaryDemon wrote:
So this guy here is running GLQuake on a P3 1GHz with Win98, and his S3 Virge GX is at 400x300:

https://youtu.be/llYTUPUEPBE?t=805

He doesn't do a benchmark or anything but it seems playable.

But yes, not sure why you would choose this over a software renderer - like WinQuake.

Nice! When I get some retro systems running again with S3 Virge cards, I will definitely try to run Quake 1 on them 😁

vetz wrote:

The Virge was designed back in '95 and the landscape was completely different then. It was also never intended for high-end users. If you look at all the early cards, no one really knew who would win the 3D war or how it would turn out. Just think of the failure of Nvidia's first card! Also, in late '96 many thought the Rendition Verite would win, since it had Quake and was a better value proposition than the Voodoo. In the end it turned out that raw performance was what counted.

The various players should have realised that back then. On the other hand, nVidia is now repeating that lesson with their "raytracing" cards 😁

Main PC SPECS:
CPU: AMD Ryzen 5 2600
GPU: Powercolor Red Devil Radeon RX 5700 XT
RAM: 8GB*4 Corsair Vengeance LPX DDR4 3200MHz
Motherboard: ASUS Prime B450M-A
PSU: Corsair RM850

Reply 21 of 31, by kixs

Rank: l33t

The Virge should run at playable FPS at low resolution (320x240), but with added filtering and 16-bit color, so you get some benefits over the standard versions. But back then I always tested at 640x480 minimum and got poor results - decelerator 🤣

Requests are also possible... /msg kixs

Reply 22 of 31, by 386SX

Rank: l33t
leileilol wrote:

S3D's GLQuake MiniGL used the S3D Toolkit library.

Interesting, but did it run in a "direct" low-level way like a Glide version? Or more like a generic OpenGL-to-S3D wrapper?

Anyway it's still impressive to see that game with those gfx effects running on those Virge cards.

Reply 23 of 31, by shamino

Rank: l33t

I never knew there was a way to run Quake with S3D. I had an original Virge (whatever it's called) back then with a Cyrix 6x86 133MHz. Quake is the game that made that computer feel obsolete, and then I had occasion to run other, less important 3D games that performed just as badly.
The only game I ever used S3D acceleration with was Destruction Derby, which came with the card. It was faster in that mode, but still not all that smooth. The acceleration also broke when I upgraded Direct3D - I think I had to run Direct3D v2 for acceleration to work, and so I did. When acceleration was broken the game got slower, so yes, the card was an accelerator, but certainly not a good one.

I remember reading an article comparing graphics cards, and it was obvious the Voodoo was in a class of its own, but I thought it was way too expensive to take seriously. A 2D card plus a Voodoo was going to cost something like $400-500. I thought there was no way people would spend that much just for faster 3D games.
I figured the market would be dominated by single cards with some 3D ability, and Voodoo cards would remain an expensive, unsupported oddity.

S3's Virge grabbed a lot of market share, but it ended up being the unsupported chip because it was just that hopeless.

Reply 24 of 31, by diagon_swarm

Rank: Newbie
feipoa wrote on 2019-10-07, 08:23:

(...) the takeaway message for me was the comment that Quake 2, with the S3 Virge GX and a 1 GHz Pentium, plays about the same as a Pentium 166 in software mode. This card seems like a fairly impressive decelerator. Anyone here actually use it for 3D "acceleration" back in the day?

I'm not sure you got the point. The card is pixel fill-rate limited, so the 1GHz CPU doesn't give you any additional frames per second compared with something like a 266MHz Pentium II (a CPU that OEMs often combined with the Virge GX/GX2). I used a 1GHz P3 just to be sure I was getting the best fill-rate performance the card can deliver (yes, I'm the guy from the video).

feipoa wrote:

Am I doing something wrong? Why would there be an S3 Quake minigl if it is slower than software mode?

Nope, you did it right. There is no benefit in using the DX wrapper for GLQuake - your CPU is too slow.

Basically, the CPU handles all the game logic (positions of all items/enemies, collisions...) and 3D scene culling (BSP trees...) - this is the same for both the software-rendered and the accelerated version. The difference is in the further rendering stages:

- Geometry processing (2D projection of the 3D scene)
- Triangle setup
- Drawing the calculated polygons

The software version of Quake does all of these stages in a heavily optimized way, tailored to exactly the level of precision and type of rendering the game uses. Nothing more, nothing less. On the other hand, the 3D-accelerated version uses libraries that can serve a game as well as CAD software. These libraries are more versatile, can do more, and need more performance on the hardware side. Just try running GLQuake with Microsoft's OpenGL software renderer to see what I'm talking about (you will see performance far lower than the internal software mode, even with texture filtering disabled).

So...
- You use universal graphics libraries to render the game (higher CPU overhead)
- In addition to that, there is a wrapper translating all calls to another graphics library (additional CPU overhead)

The Virge is a very simple chip that accelerates just the rasterization part of the rendering. It does not have a geometry unit (typical for any card of that era outside the high-end CAD segment), and it does not even have a triangle setup engine (the same is true of the ATI Rage I/II and some other early cards). That means all triangles in the projection must be split into spans based on slope data for each axis (X, Y, Z, A, U, V), and these slopes must be calculated by the CPU... and again, this is done in a generic way, not tailored to the precision this particular game requires (additional CPU overhead).
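
To make that setup overhead concrete, here is a rough C sketch (hypothetical code for illustration only, not the actual S3 driver) of the per-triangle gradient math the host CPU must do for a chip with no setup engine - and this is just one interpolant out of the six (X, Y, Z, A, U, V):

```c
/* Per-triangle setup done on the host CPU for a setup-less chip:
   compute how the interpolant U changes per pixel step in X and in Y.
   The same pair of divides repeats for Z, A and V, for every triangle. */
typedef struct { float x, y, z, u, v, a; } Vertex;

static void setup_gradients(const Vertex *v0, const Vertex *v1, const Vertex *v2,
                            float *dudx, float *dudy)
{
    float dx1 = v1->x - v0->x, dy1 = v1->y - v0->y;
    float dx2 = v2->x - v0->x, dy2 = v2->y - v0->y;
    float du1 = v1->u - v0->u, du2 = v2->u - v0->u;
    float area = dx1 * dy2 - dx2 * dy1;   /* 2x signed area; real code must
                                             reject degenerate triangles */
    *dudx = (du1 * dy2 - du2 * dy1) / area;  /* slope of U along X */
    *dudy = (du2 * dx1 - du1 * dx2) / area;  /* slope of U along Y */
}
```

A card with a hardware triangle setup engine (Rage Pro, Voodoo2...) takes the three vertices directly and does this math itself, which is exactly why it scales better with a slow CPU.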

Although the Virge DX/GX/GX2 has low textured pixel fill-rate performance (btw, all three chips have the same performance per MHz; the only difference is between EDO and synchronous memory chips... surprisingly, the EDO version is always faster at the same clock), you cannot fully utilize even this card with a 133MHz Pentium (a non-crippled one). The frame rate is already limited by geometry processing and triangle setup, so lowering the resolution doesn't help much.

I've tested different combinations of 3D accelerators and CPUs to be sure that a 133MHz Pentium is not powerful enough. Early Direct3D APIs have high CPU overhead, which gives the 3Dfx Voodoo Graphics a big performance advantage when Glide is an option. With a slow CPU, you should at least use a 3D accelerator that has a hardware triangle setup engine.

Viper Racing
512x384 / P133
5.4 fps Rage II+
8.7 fps Virge GX
31.3 fps Voodoo2

512x384 / P166MMX
7.9 fps Rage II+ (+46%)
15.9 fps Virge GX (+83%)
41.6 fps Voodoo2 (+33%)

I wrote more about it on my blog: https://swarmik.tumblr.com/post/184625185139/ … -cpus-1997-1998

Vintage computers / SGI / PC and UNIX workstation OpenGL performance comparison

Reply 25 of 31, by diagon_swarm

Rank: Newbie
feipoa wrote on 2019-10-07, 08:23:

This guy was really dedicated to seeing what the Virge GX could do in 3D.

I just wanted to show that these cards were usable for 3D acceleration back then. The video shows different games from 1996 to 1999, so it is not surprising that the newer games are not always rendered properly or require lower settings on this old budget card.

feipoa wrote:

Anyone here actually use it for 3D "acceleration" back in the day?

Yes, sure. Many Direct3D games from 1997 run well on the GX/GX2. The issue is that most people today test things like 3DMark 99 and AAA games from 1999, and they are disappointed with the results. I had a Virge GX2 in a P2 266MHz system and played 3D-accelerated games - mostly at 512x384, some at 640x480. I remember Croc, Interstate '76, Nightmare Creatures, Midtown Madness, Moto Racer, Fighting Force and many others. The card was still usable in 1998 (though not for some games) but definitely not in 1999.

Although GLQuake is also from 1997, its use of lightmaps doubles the pixel fill-rate requirements (many pixels in the frame are rendered 4-6x) on anything other than software rendering or the Rendition Verite (vQuake uses a different approach). It is therefore not a typical example of a game from that era. Check Carmack's plan file where he talks about 3D accelerators and fill-rate performance - https://fabiensanglard.net/fd_proxy/doom3/pdf … c-plan_1996.pdf (page 26). You can see that even many professional cards didn't have the fill-rate performance of the 3Dfx Voodoo Graphics chipset. 3Dfx decided to use a large on-chip texture cache and a wide memory bus... and their decision was the right one for games. However, such a solution was not possible on cheap cards due to its high cost. Quake required the fill rate of a 3Dfx Voodoo Graphics to run at 640x480@30fps... given this, you should not be surprised that any cheap card released at the same time as the first Voodoo is slower. Even many $10,000-20,000 3D/UNIX workstations could not render GLQuake at 640x480@30fps.
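
To put rough numbers on that 640x480@30fps claim, a back-of-envelope calculation (a sketch only: the 4-6x overdraw figure is the one from this post, and the ~50 Mpixel/s Voodoo Graphics figure assumes its theoretical one textured pixel per 50MHz clock):

```c
#include <stdio.h>

int main(void)
{
    /* Pixels that must actually reach the screen at 640x480, 30fps */
    double visible = 640.0 * 480.0 * 30.0;

    printf("visible pixels/s:      %.1f M\n", visible / 1e6);        /* ~9.2  */
    printf("needed at 4x overdraw: %.1f M\n", visible * 4.0 / 1e6);  /* ~36.9 */
    printf("needed at 6x overdraw: %.1f M\n", visible * 6.0 / 1e6);  /* ~55.3 */

    /* A Voodoo Graphics at ~50 Mpixels/s textured fill is just barely
       enough; a Virge GX sits several times below that. */
    return 0;
}
```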

kjliew wrote on 2019-10-14, 03:13:

The irony is that this was probably Matrox's business plan with the Mystique, and Matrox was so badly criticized for leaving out bilinear filtering and some blending modes for speed, taking for granted that everyone would be happy with 3D as souped-up SVGA. 🤣 S3's business plan for the ViRGE was "3D is just a checklist item, as long as we have it too". The traditional PC graphics players were mostly betting against the onset of the 3D revolution in the PC game industry. They did not believe the sparks would turn into fires, and they got burned; ATI survived.

Games accelerated by the Matrox Mystique looked worse than with software rendering. The lack of bilinear filtering was not the biggest issue. The lack of any alpha blending ruined the gaming experience in many 1997+ games, because even software renderers used alpha blending for many effects, and these were not possible. It was always better to buy a 2D card instead of a Mystique and put the money toward a better CPU.

Nightmare Creatures (enjoy the smoke effects on Matrox...)

[Attached screenshots: mga2-512x384.png and virge-gx2-512x384.png (CC-BY-4.0)]

As Putas already said, 3D was an important topic for S3. The Virge is not an entirely bad chip. I benchmarked many early 3D accelerators to understand how fast they are in different situations (blending, textures, filtering, smooth-shading...) - not only PC cards, but also cards used in professional workstations. See http://swarm.cz/gpubench/_GPUbench-results.htm and http://swarm.cz/gpubench/

The fill-rate performance (textured and untextured) of the Virge GX is on the same level as what Sun offered about a year earlier in its Creator3D workstation, which cost about $30,000. It is especially good at drawing untextured smooth-shaded objects, which could have been important for a possible VRML revolution and other visualization tasks. Several pre-1996 games rendered many objects in the scene without textures, and the sky was often blitted outside the 3D rendering pipeline to save performance. Based on that, the performance might have looked good enough. It stopped being good enough after a short time, but that was not an easy thing to predict.

Vintage computers / SGI / PC and UNIX workstation OpenGL performance comparison

Reply 26 of 31, by Stiletto

Rank: l33t++
diagon_swarm wrote on 2020-01-02, 19:16:

I just wanted to show that these cards were usable for 3D acceleration back then.

Wow, nice benchmarks! Great job.

"I see a little silhouette-o of a man, Scaramouche, Scaramouche, will you
do the Fandango!" - Queen

Stiletto

Reply 27 of 31, by leileilol

Rank: l33t++

There's a certain aesthetic appeal to the early Matrox stippled alpha though, since they were the only ones doing it. SEGA Model 1/2 fans would love the thing all the way up to the G100A, and Matrox also had an obvious performance/image-quality advantage...

ViRGE's alpha textures cannot modulate, unfortunately, so fading smoke in later games has to be a texture sequence (or a simpler effect, like some kind of polygon disc with only vertex colors and alpha). I don't know off the top of my head whether the Mystique can modulate alpha, but the G100 does. The otherwise blending-deficient PowerVR1 cards can modulate alpha too.
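
For anyone wondering what "modulate alpha" means in practice: the texture's own alpha gets multiplied by a per-vertex alpha, so a single smoke texture can fade out over its lifetime. A minimal fixed-function OpenGL sketch (assuming a plain GL 1.1 context, not any vendor API; the function name is illustrative):

```c
#include <GL/gl.h>

/* Draw a smoke sprite whose opacity is (texture alpha * fade).
   Assumes a GL context with an RGBA smoke texture already bound. */
void draw_fading_smoke(float fade)
{
    glEnable(GL_TEXTURE_2D);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); /* A = At * Av */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glColor4f(1.0f, 1.0f, 1.0f, fade);  /* fade goes 1.0 -> 0.0 as smoke ages */
    /* ...emit the textured quad; hardware that can't modulate alpha
       ignores `fade` and draws the sprite at constant opacity. */
}
```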

Lightmapped games are probably out of the question for either card, since they all require a subtractive or dest/source color blending function to look right. Quake 2's PowerVR and Permedia support required reprocessing the lightmap data as an alpha texture...
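
The dest/source blend in question is a framebuffer multiply. A sketch of the second pass in standard OpenGL (GLQuake itself, if memory serves, uploads inverted lightmaps and uses glBlendFunc(GL_ZERO, GL_ONE_MINUS_SRC_COLOR) to the same effect):

```c
#include <GL/gl.h>

/* Second pass of two-pass lightmapping: the base textures are already
   in the framebuffer, and drawing the lightmaps with this blend
   multiplies them in (dest = lightmap * dest). */
void begin_lightmap_pass(void)
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_ZERO, GL_SRC_COLOR);  /* framebuffer *= source color */
    glDepthMask(GL_FALSE);               /* no need to rewrite Z here */
    /* ...redraw the surfaces with their lightmap textures... */
}
```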

long live PCem

Reply 28 of 31, by diagon_swarm

Rank: Newbie

You are right - the Virge cannot modulate alpha textures, but that was common to many cards of that era (even the Rage Pro cannot do it). I didn't care much when smoke just didn't disappear smoothly. The worst issue with newer games was when the S3 driver misinterpreted the transparency and ignored any alpha on the polygon (more transparent = more black). S3 could have handled this much better (ATI and some others did), but they already had the Savage so they didn't care... and game developers didn't care about the Virge either.

Lightmapping works well only for monochrome lightmaps (GLQuake, and Quake 2-based games when the S3D wrapper pretends to be a Permedia 1), but due to the low fill-rate performance you cannot play much beyond GLQuake on the GX2. Anyway, lightmapped games were not that common - many 3D-accelerated 90s games used vertex lighting instead (even 1999 games like Aliens vs. Predator).
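
For contrast, vertex lighting just means light values baked into per-vertex colors and Gouraud-interpolated across each face - a single textured pass with no extra fill-rate cost. A minimal OpenGL sketch (illustrative values, assuming a GL 1.1 context with a texture bound):

```c
#include <GL/gl.h>

/* Vertex-lit triangle: the lighting lives in per-vertex colors, which
   the default GL_MODULATE texture environment multiplies into the
   texture. One pass, so no lightmap fill-rate penalty. */
void draw_vertex_lit_triangle(void)
{
    glShadeModel(GL_SMOOTH);  /* Gouraud-interpolate the colors */
    glBegin(GL_TRIANGLES);
    glColor3f(1.0f, 1.0f, 1.0f); glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, 0.0f, -2.0f);
    glColor3f(0.4f, 0.4f, 0.4f); glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, 0.0f, -2.0f);
    glColor3f(0.1f, 0.1f, 0.1f); glTexCoord2f(0.5f, 1.0f); glVertex3f( 0.0f, 1.0f, -2.0f);
    glEnd();
}
```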

Btw I remember when I switched from the Virge GX2 to a Rage Pro and then to a Riva 128. The Virge had better visual quality than the other two in many ways.

---

Not sure about modulate/alpha on the Mystique. On Vintage3D, I see that some games used different stipple patterns based on the transparency level, but many games didn't. Nightmare Creatures (1997), the game I played yesterday not only to upload the screenshots 😀, used modulate/alpha on the smoke polygons, and neither card was able to render it properly - they just decreased the brightness of the alpha-blended polygon. On the Mystique, the transparency was always done with a 50% on-off pattern on every polygon that used alpha.

Btw the game runs fine at ~30fps on both cards. The Virge GX2 can run it at 512x384 with all the transparency and texture filtering. The Mystique can run it at 640x480 without all these effects. I would always prefer to play it on the Virge GX2 - for me, the slightly higher resolution cannot justify the missing effects.

Vintage computers / SGI / PC and UNIX workstation OpenGL performance comparison

Reply 29 of 31, by misterjones

Rank: Member
Phido wrote on 2019-10-12, 02:15:

Think of it as a slower Matrox Mystique - for DirectX 5 and earlier games, and 1994/95 games.

I remember early tests of the Virge, Mystique, Voodoo, and other early attempts at 3D, and the one thing the Virge always had going for it was that its 3D actually looked good. In fact, I remember one such test in some magazine stating that its 3D quality compared well with the original Voodoo's and, depending on the game, looked much better. That really didn't matter, because it was as slow as a turtle shitting molasses.

The Mystique, however, was typically at or near the top of the speed charts, but it was visually terrible (like really, really bad-looking 3D) unless the game used the native API and was done by fairly competent programmers (as MotoRacer seemed to be).

I wouldn't call the Virge a slower Mystique because 3D on the Virge actually looked good.

Reply 30 of 31, by Phido

Rank: Newbie
misterjones wrote on 2020-01-08, 08:15:
Phido wrote on 2019-10-12, 02:15:

Think of it as a slower Matrox Mystique - for DirectX 5 and earlier games, and 1994/95 games.

I wouldn't call the Virge a slower Mystique because 3D on the Virge actually looked good.

That is true. I do recall that in a few games that programmed around the Virge's limitations, it was pretty. True colour, bright colours (ha, 3dfx).
I was hoping to be favorable to the Virge by comparing it to the Mystique. My Virge cost about $50 new, when a Mystique still sold for $200+. 4MB SGRAM, fast 2D, video acceleration, texture filtering, a sort of alpha, etc.

The Virge was never designed to play Quake, and it was not designed around OpenGL. But there were plenty of car-racing, third-person, shoot-em-up games that played just fine.
S3's Savage3D and Savage4 were also chips that offered really high image quality - in many ways superior to Nvidia or ATI, and much better than 3dfx. But on nearly all of these cards you couldn't trade quality for speed: on S3's later cards, for example, trilinear filtering was "free", and quite often the S3 card took such a minor performance hit in 32-bit colour that you might as well play in 32-bit.

The Savage3D turned up in 1998, so while the Virge was still sold beyond that date, it was a bargain-basement special of an already cheap card. The Virge had been on the market since 1995; its design predated the release of Win95 and even the founding of 3dfx.

Reply 31 of 31, by swaaye

Rank: l33t++

It's probably not that much different from the Verite V1000 in the end. The V1000 only ran Quake because Rendition and id put a ton of effort into writing a renderer specifically to work around the limits of the chip. It isn't really capable of running GLQuake.

S3 didn't have that kind of attention for Quake, but it did with some other games.