VOGONS


Voodoo 2 SLI on AMD K6-2+ Dos / Win9x Gaming


Reply 40 of 64, by wysiwyg

User metadata
Rank Newbie

I hope I'm asking my question in the correct place, even though I'm not on an AMD system.

So I have finally found a second Voodoo 2 to put in my P1 233 MMX system. I tried to run GTA in DOS, but the game says "What 3Dfx card?!" and goes back to the main menu. It works fine with either one of the Voodoos on its own, but not with both. Does anybody know if I need a different Glide driver for DOS, or does the game just not support Voodoos in SLI? With one Voodoo 2 the maximum game resolution is 800x600; I was hoping SLI would get it to 1024x768.

My PCs / My series collection

Reply 41 of 64, by vetz

User metadata
Rank l33t

Are you running two identical cards in SLI? SLI with two cards that are not of the same brand/type only works with the Fastvoodoo 4.6 Windows drivers. I'm pretty sure that in that case you will have to disable SLI for DOS games with a batch file.

In any case, GTA does not support 1024x768, so there's no point running this game in SLI mode.

3D Accelerated Games List (Proprietary APIs - No 3DFX/Direct3D)
3D Acceleration Comparison Episodes

Reply 42 of 64, by wysiwyg

User metadata
Rank Newbie

Yeah - they're the same brand/type.

Do you mean that GTA doesn't support 1024x768 with Glide in DOS? Because I'm pretty sure I've run it at that resolution at some point years ago. I can't remember on what card, probably under Windows, maybe on a Voodoo 5 5500.

My PCs / My series collection

Reply 43 of 64, by vetz

User metadata
Rank l33t

When I tested GTA on my Voodoo 2 SLI setup, 800x600 was the highest resolution possible. I didn't get any higher when testing on the Voodoo 3 either.

I did not get any errors when starting the game, so it seems a bit strange tbh. Maybe try a different glide2x.ovl file?

3D Accelerated Games List (Proprietary APIs - No 3DFX/Direct3D)
3D Acceleration Comparison Episodes

Reply 44 of 64, by m1so

User metadata
Rank Member
n3xu5 wrote:

Question 1: I've read that Glide isn't smooth by today's frame rate standards. Does it appear jerky, like what you might see at 10-20 fps in some games?

You are going to be very satisfied. You'll probably get 30-100 fps with a single Voodoo 2 card and more with two of them in SLI. Hell, the Voodoo Banshee was a cut-down version, and here are its benchmarks: http://www.tomshardware.com/reviews/3d-chips,83-2.html . Stick to 640x480 or 800x600 though. You aren't going to get 10-20 fps with a Voodoo 2 SLI, not even with a Voodoo Rush, which was the slowest Voodoo ever (slower than the Voodoo 1).

A Pentium 200 MMX gets 200 fps in Descent 2 with 3dfx patch and that is with Voodoo 1.

Don't confuse 3dfx cards with a SuperFX chip or an N64 running Ocarina of Time; you are going to get framerates that many modern gamers would envy, as long as you don't try things like running Need for Speed Underground 2 on them.

One of the selling points of Voodoo 2 in fact was the stable 60 fps it often produced.

A modern console game often runs at 30 fps, falling to 20-24 fps at times, so the speed of the old Voodoos is all the more admirable.

Last edited by m1so on 2013-07-02, 22:56. Edited 1 time in total.

Reply 45 of 64, by Jorpho

User metadata
Rank l33t++
m1so wrote:

A Pentium 200 MMX gets 200 fps in Descent 2 with 3dfx patch and that is with Voodoo 1.

Did you throw an extra zero in there..? 200 fps would exceed the refresh rate of most monitors.

Reply 46 of 64, by m1so

User metadata
Rank Member
luckybob wrote:

That being said, the biggest if not the only reason 3dfx was popular was the fact they were FIRST. The only debatable difference was how they did anti-aliasing. Everyone but 3dfx took an inferior shortcut way of doing it, and 3dfx took the hard way. The hard way was "better" but slower.

The moral of the story is: build what you want. As long as you are happy with how it comes out, that's all that matters.

The reason why 3dfx was popular was that it was THREE TIMES or more as fast as the competition. The first "3D accelerators" were the Matrox Impression Plus, ATI Rage, Diamond Edge 3D (Nvidia NV1) and S3 Virge 325, and all four of them sucked so absolutely unbelievably that it is a wonder anyone would even buy them (except for viewing movies on shitty PCs in the case of the ATI Rage, as it had an MPEG decoder, its only redeeming feature).

It was not the first 3D card to not suck donkey butt either, as Rendition Verite takes this honor. 3dfx Voodoo however was twice as fast as the Verite, 3x as fast as NV1 and perhaps up to 10x as fast as Rage/S3 Virge 325.

Besides, neither the first Voodoo nor the V2 did antialiasing. You are thinking of texture filtering, which most early 3D cards either totally lacked (the Matrox Mystique, which was otherwise an okayish card, especially compared to the likes of the Virge 325 and Rage) or implemented so slowly that it dragged the system down to 5 fps (Virge 325/Rage).

Last edited by m1so on 2013-07-02, 22:47. Edited 1 time in total.

Reply 47 of 64, by m1so

User metadata
Rank Member
Jorpho wrote:
m1so wrote:

A Pentium 200 MMX gets 200 fps in Descent 2 with 3dfx patch and that is with Voodoo 1.

Did you throw an extra zero in there..? 200 fps would exceed the refresh rate of most monitors.

Nope, I didn't. Hell, it exceeds 30 fps on Pentium 120s in SOFTWARE mode (in VGA).

Elianda tested 3dfx Descent 2 on a 112.5 MHz K5 (PR166) with a Voodoo 3, and here I quote the results:

elianda wrote:

Ok, I tried Descent 2 3dfx started from Win98SE, and in normal gameplay it caps at 60 fps.
If I run the demo, however, in very intense combat it drops to a minimum of 32 fps; with not much action it is well over 100 fps.

He gets 30-100 fps on a CPU that is quite a bit slower than a 200 MHz Pentium MMX, and the Voodoo 3 is CPU-limited here, so the result would be the same on a Voodoo 1; this CPU does not max out even a Voodoo 1, let alone a 2 or 3.

In gameplay it is capped at 60 fps, but that is for a reason: Descent 2 goes haywire when it gets much over 60 fps.

And refresh rates never stopped gamers from getting 300 fps in Quake benchmarks. Now, one might argue about the usability of such excessive framerates, but the fact is, Voodoos really were powerhouses.

Reply 48 of 64, by m1so

User metadata
Rank Member

To quote some fps results from the test I linked: a Voodoo 2 SLI gets 246 fps in Forsaken at 640x480, 192 fps at 800x600 and 116 fps at 1024x768, and I think that's not even Glide, but D3D. In Turok you get 173 fps at 640x480 in Glide and 112 fps at 800x600 in D3D. In Incoming you get 100 fps at 640x480, 98 fps at 800x600 and 70 fps at 1024x768. Quake 2 gets you over 90 fps, and the most "modern" game on the list, Sin, gets over 40 fps at all resolutions. If these are choppy then I am the Queen of England. But then, the V2 SLI was the GTX Titan of its day, so it would be kind of silly to expect "10-20 fps".

These results are from a 400 MHz Pentium II, but as the benchmark shows, the results are not much worse on a 266 MHz Celeron without L2 cache, which is probably slower than your K6-2.

Reply 49 of 64, by elfuego

User metadata
Rank Oldbie
m1so wrote:

These results are from a 400 MHz Pentium II, but as the benchmark shows, the results are not much worse on a 266 MHz Celeron without L2 cache, which is probably slower than your K6-2.

Nope. The Celeron 266 is quite a bit faster. Especially if overclocked. 😒

Reply 50 of 64, by m1so

User metadata
Rank Member

The test was obviously done at stock clocks. Besides, I don't know anyone who actually overclocked their CPUs. And I think you seriously underestimate the shittiness of the cacheless Covington Celerons. Those things were weaker than a 233 MHz Pentium 1.

Reply 51 of 64, by Hatta

User metadata
Rank Member

I've never seen 3D graphics as smooth as on my Voodoo 2 SLI rig. More modern cards can push as many frames and do things like anti-aliasing, but they don't look as smooth. Maybe the deltas between frames are more even on my Voodoo cards. Or maybe I'm just seeing things.

Reply 52 of 64, by Soupdragon

User metadata
Rank Member

One thing that can help with the smoothness of FPS games in Win9x using a PS/2 mouse is to up the sampling rate of the mouse. Use a program like this: http://www.softpedia.com/get/System/OS-Enhanc … s/PS2Rate.shtml

I was just playing some Quake 2 with the 3dfx MiniGL on my SLI Voodoo 2s and it's the definition of smooth.

Steam | World of Warcraft

Reply 54 of 64, by m1so

User metadata
Rank Member

It might be because of good frame times on 3dfx cards. Many modern cards, especially in SLI/CrossFire, have fantastically bad frame times. So say you have 60 fps, but the intervals between frames vary wildly; this can cause a thing called microstutter. Also, some games, for example Fallout 3 and Oblivion, make movement speed dependent on framerate, which means anything under or over 60 fps is either slowed down or sped up. This is also why v-sync is a good idea in many games.
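
To make the framerate-dependence point concrete, here is a minimal sketch in C (my own illustration, not code from Fallout 3, Oblivion or any other game mentioned here) showing frame-locked versus frame-time-based movement:

#include <stdio.h>

/* Frame-locked movement: a fixed step per frame, so the perceived speed
   scales with the frame rate. */
static float move_per_frame(float pos, float step)
{
    return pos + step;
}

/* Frame-time-based movement: speed * elapsed time, so the perceived speed
   stays the same whether a frame took 16 ms or 33 ms. */
static float move_per_second(float pos, float units_per_sec, float dt)
{
    return pos + units_per_sec * dt;
}

int main(void)
{
    float locked = 0.0f, timed = 0.0f;
    /* Simulate one second of a game tuned for 60 fps that actually runs at 30 fps. */
    for (int frame = 0; frame < 30; frame++) {
        locked = move_per_frame(locked, 1.0f);          /* 1 unit per frame, assumes 60 fps */
        timed  = move_per_second(timed, 60.0f, 1.0f / 30.0f);
    }
    printf("frame-locked: %.0f units (half speed), time-based: %.0f units\n",
           locked, timed);
    return 0;
}

At 30 fps the frame-locked version covers only half the intended distance per second, which is exactly the slow-down described above; capping the game at its intended rate with v-sync keeps the frame time close to a constant 1/60 s, which is why it helps those games.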

Reply 55 of 64, by subhuman@xgtx

User metadata
Rank Oldbie
m1so wrote:

It might be because of good frame times on 3dfx cards. Many modern cards, especially in SLI/CrossFire, have fantastically bad frame times. So say you have 60 fps, but the intervals between frames vary wildly; this can cause a thing called microstutter. Also, some games, for example Fallout 3 and Oblivion, make movement speed dependent on framerate, which means anything under or over 60 fps is either slowed down or sped up. This is also why v-sync is a good idea in many games.

Scanline interleaving is butter smooth but prone to anomalies too. Disable vsync in Unreal and try to make quick turns with the mouse, and see how the image divides into two.


Reply 56 of 64, by m1so

User metadata
Rank Member

True, but you always get tearing with vsync off if you get more fps than your refresh rate. The refresh rate of good CRTs was usually 75-85 Hz though, so tearing was not as frequent as it is today on 60 Hz LCDs.

Reply 57 of 64, by swaaye

User metadata
Rank l33t++

V2SLI is gonna be really limited by any K6 though. You ideally want a P3 or Athlon.

Quake 2 has perhaps the best 3DNow! optimization available if you use the AMD-written patch, which even includes a tweaked Voodoo2 MiniGL, but it still doesn't exactly scream on a K6. The minimum frame rate (when there's a lot of action) is the biggest issue.

Reply 58 of 64, by vetz

User metadata
Rank l33t
swaaye wrote:

V2SLI is gonna be really limited by any K6 though. You ideally want a P3 or Athlon.

Quake 2 has perhaps the best 3DNow! optimization available if you use the AMD-written patch, which even includes a tweaked Voodoo2 MiniGL, but it still doesn't exactly scream on a K6. The minimum frame rate (when there's a lot of action) is the biggest issue.

While it is true that V2SLI is limited on a K6 platform, I do not agree that Quake 2 isn't screaming along. These benchmarks are from my 66 MHz FSB K6 system. A proper SS7 board will get even better results:

GLQuake (demo1):
AMD K6-2 400 MHz - 1024x768 SLI - 40.8 fps
AMD K6-3 400 MHz - 1024x768 SLI - 57.6 fps

Quake 2 (demo1):
AMD K6-2 400 MHz - 1024x768 SLI - 57.4 fps w/ 3DNow! driver
AMD K6-3 400 MHz - 1024x768 SLI - 67.8 fps w/ 3DNow! driver

I know these numbers are averages and don't show minimum frames per second, but the frame rate never drops low enough to be noticeable.

3D Accelerated Games List (Proprietary APIs - No 3DFX/Direct3D)
3D Acceleration Comparison Episodes

Reply 59 of 64, by subhuman@xgtx

User metadata
Rank Oldbie
m1so wrote:

True, but you always get tearing with vsync off if you get more fps than your refresh rate. The refresh rate of good CRTs was usually 75-85 Hz through, so tearing was not as frequent as it is today on 60 Hz LCDs.

Yes, but turning v-sync on introduces at least some input lag unless you use triple buffering.
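
As a rough back-of-the-envelope sketch of that lag (my own numbers, not from the thread): with double-buffered v-sync a finished frame can wait up to one full refresh period before it is scanned out, which triple buffering avoids by letting the card keep rendering into a third buffer instead of stalling.

#include <stdio.h>

/* Worst-case extra delay a completed frame can pick up while waiting for
   the next refresh under double-buffered v-sync: roughly one refresh period. */
int main(void)
{
    const int rates_hz[] = { 60, 75, 85, 100 };
    const int n = sizeof(rates_hz) / sizeof(rates_hz[0]);

    for (int i = 0; i < n; i++) {
        double period_ms = 1000.0 / rates_hz[i];
        printf("%3d Hz: refresh period %.1f ms -> up to ~%.1f ms added lag\n",
               rates_hz[i], period_ms, period_ms);
    }
    return 0;
}

So on a 60 Hz LCD the worst case is about 16.7 ms per frame, while on an 85 Hz CRT of the Voodoo era it is closer to 11.8 ms.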
