VOGONS


Reply 120 of 200, by bloodem

User metadata
Rank Oldbie
Joseph_Joestar wrote on 2022-03-27, 09:50:

That's why I'm asking actually. I don't think the 3DNow! 3DFXGL option can work on Voodoo3 and Banshee cards. For those, you need to do the following:

  1. Extract the Quake 2 MiniGL file provided by 3dfx to your root Quake 2 directory (X:\Quake2\). The MiniGL is available for download at www.3dfx.com if you don't already have it.
  2. Rename the MiniGL file (3dfxgl.dll) to the following: opengl32.dll. Be sure to delete any previous opengl32.dll files that were present in your root Quake 2 directory before doing so.
  3. Extract the AMD Quake 2 3DNow! patch to your Quake 2 directory as documented in AMDs installation FAQ.
  4. Start Quake 2. Under the video options menu, choose 3DNow! OpenGL as your rendering device, not 3DNow! 3dfxGL.

That always worked for me, meaning I got a non-trivial performance boost on my Voodoo3 when using it with my AthlonXP. Those instructions are from the Anandtech article that I linked to earlier.

Oh... man. You are absolutely right! 😁
Now that you've mentioned it, I remembered... This is how I used to test it on VIA motherboards in the past. 😁
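For anyone following along, steps 1 and 2 boil down to something like this at a DOS prompt (assuming Quake 2 lives in C:\QUAKE2; adjust the path to match your install):

  cd C:\QUAKE2
  if exist opengl32.dll del opengl32.dll
  copy 3dfxgl.dll opengl32.dll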

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 121 of 200, by Joseph_Joestar

User metadata
Rank l33t
bloodem wrote on 2022-03-27, 09:32:

Good question, definitely worth testing! Though, based on past experience, I'm pretty sure that drivers which support the GeForce 4 MX (the first ones being something like Detonator 28.xx, I believe) will be much slower on SS7 than 7.76.

That's what I'm most interested in.

A showdown between an MX400 using 7.76 drivers and a 128-bit MX440 using 40.72 drivers (for AGP 8x models) on a less powerful CPU. Will the increased power of the MX440 outweigh the driver overhead? Find out on the next episode of Dragonball Z Retro Shenanigans!

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 122 of 200, by bloodem

User metadata
Rank Oldbie
Joseph_Joestar wrote on 2022-03-27, 10:04:

That's what I'm most interested in.

A showdown between an MX400 using 7.76 drivers and a 128-bit MX440 using 40.72 drivers (for AGP 8x models) on a less powerful CPU. Will the increased power of the MX440 outweigh the driver overhead? Find out on the next episode of Dragonball Z Retro Shenanigans!

🤣
Your wish is my command! Well... I guess it's time to take the Asus P5A out of the box... again. 😁

PS: I should really change the name of this thread, because... let's face it, the original purpose has gone down the drain 😀

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 123 of 200, by Joseph_Joestar

User metadata
Rank l33t
bloodem wrote on 2022-03-27, 10:14:

PS: I should really change the name of this thread, because... let's face it, the original purpose has gone down the drain 😀

Heh, go for it. It might also be worthwhile to put up a spreadsheet with a graph which summarizes your test results. Maybe on Google Docs, or just as an attachment.

BTW, when benchmarking Unreal Gold and Unreal Tournament, be sure to set "Min Desired Framerate" to zero. Otherwise, the engine might auto-adjust some settings if the framerate drops too low.
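If you'd rather set it in the config file than through the menus, it should be the MinDesiredFrameRate entry in UnrealTournament.ini (Unreal.ini for Unreal Gold), under whichever render device section you're using. For example, something like:

  [D3DDrv.D3DRenderDevice]
  MinDesiredFrameRate=0.000000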

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 124 of 200, by bloodem

User metadata
Rank Oldbie
Joseph_Joestar wrote on 2022-03-27, 12:52:

Heh, go for it. It might also be worthwhile to put up a spreadsheet with a graph which summarizes your test results. Maybe on Google Docs, or just as an attachment.

I was thinking of creating graphs at some point; however, I must admit that while I do love to benchmark (it's something I enjoy even more than actually playing the games), creating graphs/spreadsheets/PowerPoint slides makes me want to kill myself. 🤣

Joseph_Joestar wrote on 2022-03-27, 12:52:

BTW, when benchmarking Unreal Gold and Unreal Tournament, be sure to set "Min Desired Framerate" to zero. Otherwise, the engine might auto-adjust some settings if the framerate drops too low.

I would hope that the "min desired framerate" is only taken into account for actual gameplay and is ignored for timedemo benchmarks, but... you are right, it's probably better to ensure that it's 0.
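(For clarity, by timedemo benchmarks I mean the usual console routine: timedemo 1 followed by demoplay <demofile> in UT and, if memory serves, timedemo 1 plus demomap demo1.dm2 in Quake 2.)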

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 125 of 200, by bloodem

User metadata
Rank Oldbie

New video on the same K6-3+ platform using an Asus V7100Pro GeForce 2 MX400: https://www.youtube.com/watch?v=w5x79oP_vks

TEST SYSTEM:
MB: Asus P5A rev 1.04
CPU: AMD K6-2+ 570 MHz modded to AMD K6-3+ 570 MHz OC @ 633 MHz / FSB105 & 6.0 multi (2.1 Volts)
RAM: 128 MB SAMSUNG SDRAM PC133
VIDEO: Asus V7100Pro GeForce 2 MX400 32 MB
SOUND: Creative Sound Blaster Live 5.1 SB0220 (disabled for these tests)
HDD: Seagate 40 GB IDE/PATA

DRIVERS:
Chipset drivers: ALi Integrated Driver 2.13
nVIDIA GeForce 2 MX400 drivers: 7.76

As expected, the GeForce 2 MX400 performs similarly to the GeForce 2 Ultra at lower resolutions (when using the same driver). Once we start increasing the resolution (and color depth) on fast systems, though, the MX's crippled memory bandwidth (and, to a lesser extent depending on the game, its much lower fill rate) quickly starts to hurt performance. Still, there are certain games where even on this fast SS7 platform we remain CPU bottlenecked at higher resolutions.

This Asus GeForce 2 MX400 in particular has a bit more memory bandwidth than most (its memory clock runs at 200 MHz), but you would see identical performance at lower resolutions with a stock GeForce 2 MX (175 MHz core / 166 MHz memory clock), and up to 10% less performance at higher resolutions (depending on the game).
Whatever GeForce 2 MX you're using, just make sure that the video card has the full 128-bit memory bus!
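Some quick napkin math: a 128-bit bus moves 16 bytes per transfer, so a stock GeForce 2 MX at 166 MHz SDR has roughly 166 x 16 = ~2.7 GB/s, this MX400 at 200 MHz gets ~3.2 GB/s, and a GeForce 2 Ultra with 230 MHz DDR memory reaches about 2 x 230 x 16 = ~7.4 GB/s. A 64-bit MX variant halves the first two numbers, hence the warning above.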

So, IMO, the GeForce 2 MX / MX400 is probably the 'best bang for the buck' option for SS7 (cheap, fast, compatible, with very low power requirements).

As always with SS7 and AGP cards, you might need to do a bit of fine-tuning to get the platform stable when using a GeForce. This is typically not the case with Voodoo 3 AGP cards; they tend to be very stable from the get-go because they don't use any AGP-specific features and instead treat the AGP slot as a fast PCI slot.

Unfortunately, the Voodoo 3 has more than 3 times the power consumption of a GeForce 2 MX, which can be a problem for some cheaper SS7 motherboards; in that specific case, the GeForce 2 MX remains the better choice.

bloodem wrote on 2022-03-27, 14:14:

I would hope that the "min desired framerate" is only taken into account for actual gameplay and is ignored for timedemo benchmarks, but... you are right, it's probably better to ensure that it's 0.

Well, my hopes were shattered. As can be seen in the video, I tested both, and there is a difference. From now on I'm using "0" in UT as well. Thanks for the heads-up, Joseph_Joestar 😀

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 126 of 200, by Joseph_Joestar

User metadata
Rank l33t
bloodem wrote on 2022-03-28, 16:37:

New video on the same K6-3+ platform using an Asus V7100Pro GeForce 2 MX400: https://www.youtube.com/watch?v=w5x79oP_vks

Out of curiosity, was it your intention to benchmark OpenGL performance in 16-bit color?

In the Nvidia driver settings, you have "Default color depth for textures" set to "Use desktop color depth" and your desktop resolution was 1024x768 at 16-bit color. You can see that Quake 2 uses 16-bit rendering if you look at the console window after changing the resolution in-game.
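If I remember correctly, the line to look for is the pixel format summary printed during video initialization, something along the lines of: GL PFD: color(16-bits) Z(16-bit).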

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 127 of 200, by Jasin Natael

User metadata
Rank Oldbie
bloodem wrote on 2022-03-28, 16:37:
New video on the same K6-3+ platform using an Asus V7100Pro GeForce 2 MX400: https://www.youtube.com/watch?v=w5x79oP_vks […]

Subbed so that I can watch later. This stuff fascinates me.

Reply 128 of 200, by bloodem

User metadata
Rank Oldbie
Joseph_Joestar wrote on 2022-03-28, 17:28:

Out of curiosity, was it your intention to benchmark OpenGL performance in 16-bit color?

In the Nvidia driver settings, you have "Default color depth for textures" set to "Use desktop color depth" and your desktop resolution was 1024x768 at 16-bit color. You can see that Quake 2 uses 16-bit rendering if you look at the console window after changing the resolution in-game.

Yes, it's intentional (or at least it was initially). Since this whole saga was meant to be a series of CPU-centered benchmarks, I wanted to eliminate any potential GPU bottleneck as much as possible, which is why I went with a 16-bit desktop color depth as the default (meaning Quake 2 would also run at 16-bit). However, as you saw, in some instances, whenever it made sense, I also chose 32-bit color. And because I started this way, I had to continue using it for all platforms.
Once I finish with the SS7/VIA C3 stuff and if I decide to continue with other (faster) GPUs, I will definitely make 32-bit the default. 😀

Jasin Natael wrote on 2022-03-28, 17:43:

Subbed so that I can watch later. This stuff fascinates me.

Welcome aboard! I imagine it fascinates all of us; we're a pretty weird bunch of individuals. 🤣

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 129 of 200, by bloodem

User metadata
Rank Oldbie

New video on the same K6-3+ platform using a Palit GeForce 4 MX440 AGP8X: https://www.youtube.com/watch?v=hClKvArkAgc

TEST SYSTEM:
MB: Asus P5A rev 1.04
CPU: AMD K6-2+ 570 MHz modded to AMD K6-3+ 570 MHz OC @ 633 MHz / FSB105 & 6.0 multi (2.1 Volts)
RAM: 128 MB SAMSUNG SDRAM PC133
VIDEO: Palit GeForce 4 MX440 AGP8X (running @ AGP 1X)
SOUND: Creative Sound Blaster Live 5.1 SB0220 (disabled for these tests)
HDD: Seagate 40 GB IDE/PATA

DRIVERS:
Chipset drivers: ALi Integrated Driver 2.13
nVIDIA GeForce 4 MX440 AGP8X drivers: 40.72

It's difficult to draw a conclusion regarding the GeForce 4 MX440 and its usefulness on SS7. As the video shows, the card is slower than the Voodoo 3 / GeForce 2 series at lower resolutions; even so, there are OpenGL games like GLQuake, Quake 2, and MDK2 where frame rates are still high and, most importantly, remain high even at 1600x1200 with 32-bit color depth.

That being said, there are also cases like Unreal, where something funky happens when enabling 32-bit color and performance immediately tanks.
I'm not entirely sure what the cause is (it requires further investigation), but it's probably a bug that occurs on this specific SS7 platform when using newer nVIDIA drivers.
After recording the video, I also tested driver versions 45.23 and 30.82 (on the latter I had to force the installation for the GeForce 4 MX440 AGP8X), and both exhibited the same problem. I also re-enabled AGP 2X and, as expected, it made no difference.

What I do know for sure is that the same video card (with the same drivers) works perfectly fine on other platforms based on Intel 440BX/815/845/865, VIA KT266A/KT400/KT600/KT880, etc. So whatever it is, it's somehow linked to this SS7 platform (maybe others as well?).
Also, there might be other games affected by this type of bug, since, starting in 2002, nVIDIA clearly stopped caring about SS7.

So, my conclusion is that, unless the GeForce 4 MX440 is all you have, the GeForce 2 MX and the Voodoo 3 remain the best choices for SS7 (even more so if you have a slower SS7 system than the one I benchmarked).

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 130 of 200, by Joseph_Joestar

User metadata
Rank l33t
bloodem wrote on 2022-03-29, 05:15:

That being said, there are also cases like Unreal, where something funky happens when enabling 32-bit color and performance immediately tanks.

Sometimes, problems can occur if the desktop resolution and color depth are lower than what's being used for rendering by the game's engine. I doubt that's the issue here, but it might be worth checking. When I do my benchmarks, I always match my desktop resolution and color depth to whatever the game is using, just in case.

bloodem wrote on 2022-03-29, 05:15:

So, my conclusion is that, unless the GeForce 4 MX440 is all you have, the GeForce 2 MX and the Voodoo 3 remain the best choices for SS7 (even more so if you have a slower SS7 system than the one I benchmarked).

Excellent analysis! It's also interesting how little difference there is between an MX400 and a full GeForce2 Ti on such a system. As you say, unless people intend to game at resolutions higher than 1024x768 on an SS7 build, it makes little sense to go with a more powerful card, especially given the performance loss that comes from using the newer drivers which Nvidia's AGP 8x cards require.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 131 of 200, by bloodem

User metadata
Rank Oldbie
Joseph_Joestar wrote on 2022-03-29, 07:00:

Sometimes, problems can occur if the desktop resolution and color depth are lower than what's being used for rendering by the game's engine. I doubt that's the issue here, but it might be worth checking. When I do my benchmarks, I always match my desktop resolution and color depth to whatever the game is using, just in case.

Good thinking. That might explain why I haven't seen this behavior on other (faster) platforms, since on those platforms I always keep the desktop color depth at 32-bit.
In the near future I will also test a TNT2 Pro, and after I'm done with that, I will definitely retest the GeForce 4 MX440 with a 32-bit desktop color depth and see if it makes a difference in Unreal.

Joseph_Joestar wrote on 2022-03-29, 07:00:

Excellent analysis! It's also interesting how little difference there is between an MX400 and a full GeForce2 Ti on such a system. As you say, unless people intend to game at resolutions higher than 1024x768 on an SS7 build, it makes little sense to go with a more powerful card, especially given the performance loss that comes from using the newer drivers which Nvidia's AGP 8x cards require.

Yeah, with its 4 watts of power consumption and very low price, it's very hard not to recommend the GeForce 2 MX for this type of system.
As mentioned, the only suggestion I have for anyone buying one for SS7 is to set the AGP transfer speed to 1X from the get-go. This will help avoid most stability issues (it has certainly worked for me on all the SS7 boards I've ever tested).

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 132 of 200, by Tetrium

User metadata
Rank l33t++
bloodem wrote on 2022-03-29, 08:14:
Good thinking. That might explain why I haven't seen this behavior on other (faster) platforms, since on those platforms I alway […]

It's really something when one can specifically recommend a GF2 MX400 over a GF4 MX440 in any given situation xD 😋

Btw, maybe I missed it, but how did you measure the power usage of the graphics cards? Did you measure from the wall outlet?

What's missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 133 of 200, by bloodem

User metadata
Rank Oldbie
Tetrium wrote on 2022-03-29, 08:48:

Btw, maybe I missed it, but how did you measure the power usage of the graphics cards? Did you measure from the wall outlet?

I did many power measurements a few years ago. And yes, even though it's not 100% accurate, I measured at the wall outlet.
My results were mostly in line with this GPU power consumption chart.
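The method itself is simple: measure the same system at the wall with different video cards and take the delta. As an illustration (made-up numbers, not my actual readings): if the system pulls 75 W with a Voodoo 3 and 60 W with a GeForce 2 MX, the 15 W wall delta multiplied by a typical ~70-75% efficiency for these old PSUs works out to roughly 11 W of real difference at the card level.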

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 134 of 200, by bloodem

User metadata
Rank Oldbie

New video (Revisiting the VIA C3 Ezra-T): https://www.youtube.com/watch?v=cEOjvsNI8pE

TEST SYSTEM:
MB: Gigabyte GA-6BXC rev 1.9 (VRM mod)
CPU: VIA Ezra-T 1 GHz (OC @ 1.26 GHz / FSB133 & 9.5 multi)
RAM: 3 x 128 MB SAMSUNG SDRAM PC133 (only 192 MB are detected by the motherboard, since these are newer high-density modules with 4 chips per module; each 128 MB stick registers as just 64 MB)
VIDEO: Asus V7700Ti GeForce 2 Ti (OC @ GeForce 2 Ultra clocks)
SOUND: Creative Sound Blaster Live 5.1 SB0220 (disabled)
HDD: Seagate 40 GB IDE/PATA

This video is a follow-up to the original video I did some time ago, where I benchmarked the VIA C3 Ezra-T.

This time I've only tested without sound (so that this platform's performance can be directly compared to that of the SS7 platform I benchmarked in recent videos). If you also want to see what the performance is like with sound, make sure to check the initial video: https://www.youtube.com/watch?v=uzXLPlmCMJ8

Starting at 27:38, I've also performed some benchmarks at default speed, without an overclock.

So, to finally settle the VIA C3 performance debate, this is my conclusion on the matter: when running on a 440BX motherboard and overclocked @ 1.26 GHz / FSB133, the VIA C3 Ezra-T is 10 to 25% faster in games than a very fast and highly tuned SS7 platform (with no tuning required on the 440BX).
Without overclocking (@ 1 GHz / FSB100), it generally has about the same performance as that of a very fast + tuned SS7 platform.

As I'll show in some future videos (which will focus on MS-DOS gaming), the speed flexibility that this CPU offers (especially in conjunction with a Gigabyte GA-6BXC board) is simply amazing!
And, obviously, you will also benefit from all the other advantages that come with the Intel 440BX chipset (excellent compatibility, perfect stability; everything just works without a hitch in both Windows and DOS).

And, for closure, I'm also adding two graphs (please note that, off camera, I had to redo the UT benchmarks for the K6-3+ & GeForce 2 Ultra, this time using a min desired framerate of "0").

Attachments

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 135 of 200, by Joseph_Joestar

User metadata
Rank l33t
bloodem wrote on 2022-03-30, 05:02:

RAM: 3 x 128 MB SAMSUNG SDRAM PC133 (only 192 MB are detected by the motherboard, since these are newer high-density modules with 4 chips per module; each 128 MB stick registers as just 64 MB)

I hate dealing with memory density issues. Had a similar problem on my Pentium MMX 430TX system where a 128 MB SDRAM stick would register as 64 MB.

After getting this odd-looking 64 MB SDRAM stick, which apparently has the proper density, the amount was finally detected correctly. Furthermore, it provided a very slight boost in performance (1-2%), possibly something to do with memory interleaving not working quite right with high-density sticks on such an old motherboard.

bloodem wrote on 2022-03-30, 05:02:

Without overclocking (@ 1 GHz / FSB100), it generally has about the same performance as that of a very fast + tuned SS7 platform.

Very nice! BTW, do you think that the GPU running at AGP 2x on the 440BX chipset compared to AGP 1x on SS7 (for better stability) makes any difference in terms of performance?

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 136 of 200, by bloodem

User metadata
Rank Oldbie
Joseph_Joestar wrote on 2022-03-30, 06:35:

I hate dealing with memory density issues. Had a similar problem on my Pentium MMX 430TX system where a 128 MB SDRAM stick would register as 64 MB.

After getting this odd-looking 64 MB SDRAM stick, which apparently has the proper density, the amount was finally detected correctly. Furthermore, it provided a very slight boost in performance (1-2%), possibly something to do with memory interleaving not working quite right with high-density sticks on such an old motherboard.

Yes, this is pretty standard behavior with these older boards when using more "modern" high-density PC133 RAM. There are even worse situations, with some cheaper boards simply refusing to POST with this type of RAM.
Anyway, as I mentioned in a previous post, I also tested with Kingston PC133 (8 chips/module) and it made no difference in terms of performance.

Joseph_Joestar wrote on 2022-03-30, 06:35:

Very nice! BTW, do you think that the GPU running at AGP 2x on the 440BX chipset compared to AGP 1x on SS7 (for better stability) makes any difference in terms of performance?

Absolutely no difference between the two (especially with this type of video card).
Now that you mentioned it, I remember that, at one point, I even tested a GeForce 4 Ti 4200 on a KT400 motherboard with an Athlon XP CPU; that specific board had an older BIOS with a bug that limited most video cards to AGP 1X. The funny part is that I didn't even notice until opening Aida64, because performance was exactly where it should have been, even at AGP 1X. 😀

Either way, on 440BX, just like on SS7, we're heavily CPU bottlenecked (plus, none of these older GPUs would be able to saturate even the AGP 1X bandwidth, not even at higher resolutions).
Furthermore, AGP bandwidth shouldn't really be used extensively if all textures fit in the VRAM (which, let's face it, is what you always want). Even with these older/slower video cards, their on-board memory bandwidth is much higher than the AGP 1X/2X bandwidth.
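To put numbers on it: AGP 1X is 66 MHz x 4 bytes = ~266 MB/s (533 MB/s at 2X), while even a lowly GeForce 2 MX has around 2.7 GB/s of local memory bandwidth, an order of magnitude more. Any texture that has to stream over AGP is the slow path no matter the transfer speed setting.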

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 137 of 200, by Tetrium

User metadata
Rank l33t++
Joseph_Joestar wrote on 2022-03-30, 06:35:
I hate dealing with memory density issues. Had a similar problem on my Pentium MMX 430TX system where a 128 MB SDRAM stick would […]

Interesting DIMM. So which board will this module work correctly with?
I remember having one such module myself and always wondered why it even existed 😋
Can't remember the size, but I thought mine was 128 MB? It was a relatively small amount of RAM, though.

Seems like mostly a curiosity, but if it has any practical purpose, I'd like to learn about it 🙂

What's missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 138 of 200, by Joseph_Joestar

User metadata
Rank l33t
Tetrium wrote on 2022-03-30, 08:53:

Interesting DIMM. So which board will this module work correctly with?

I would assume any motherboard which needs low-density SDRAM.

My earlier comment was aimed at the standard 128 MB high-density PC133 SDRAM stick that I had used prior to getting this one.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 139 of 200, by matze79

User metadata
Rank l33t

bloodem wrote:

I'm guessing you never tested a VIA C3. That's actually the idea: you DON'T need to disable the L1 cache to hit a gazillion speed points (especially since there are games which enable the L1 cache at startup). You can selectively disable branch prediction or the instruction cache, decrease the multiplier down to 3x, and, more importantly, on a motherboard like the GA-6BXC you can also drop the FSB down to 50 MHz using software. I have more than 20 SS7 boards and none of them can do this (their FSB is adjusted either through DIP switches or jumpers). Maybe there are some that might at least offer a SoftFSB BIOS menu, but in 20 years I never came across one of these (so if there are, they are ultra-rare).

The VIA C3 doesn't work reliably for me at a 3.0 multiplier below 100 MHz FSB.
I also have to set it step by step; jumping straight to 3.0 ends in a freeze.
66 x 3.0 does not work for me either, also a freeze.
Only 100/133 MHz FSB works pain-free for me on the Samuel 2.

So the slowest I can achieve is 300 MHz.

https://www.retrokits.de - blog, retro projects, hdd clicker, diy soundcards etc
https://www.retroianer.de - german retro computer board