VOGONS


First post, by kahuna

User metadata
Rank Newbie

Hello everyone!

I need some help understanding what the title of this post says: whether these numbers are expected, or if I am missing something here.
Let me describe my current system:
SOYO SY-5EMA PRO (ETEQ chipset which is just a rebranded VIA MVP3)
AMD K6-III+ @616 MHz (112 x 5.5)
384MB of RAM CL2
Windows 98 SE
Sound Blaster Live!
Graphics cards tested (all AGP): Voodoo 3 3000, Geforce 2 Ti, Geforce 3 Ti200, Geforce FX5900

I'll get more specific numbers and run more comprehensive tests when I can, but in a quick test all the Nvidia cards scored around 2000 marks in 3DMark 2000 with the default test settings.
In Unreal Tournament I got 20-22 fps average depending on the graphics card used, all at 1024x768x16 with high details, running the utbench demo under Direct3D.
The Voodoo 3 performed slightly better with Glide, from what I recall (I will come back to this and post the actual numbers).

What worries me is that I seem to be pretty much CPU bound, as there is no difference in the results I'm getting across the different graphics cards. Is that a correct assessment, or am I missing something here?
Is there something I can do to improve the results?

I have tried different Nvidia driver versions with the same utbench demo; results get a bit better with older drivers, but the difference is small (1-2 fps up/down).
I have also tested different AGP drivers, both VIA 4in1 and ETEQ releases.

Any insights and suggestions are more than welcome.
Thanks!

Last edited by kahuna on 2023-01-05, 22:24. Edited 1 time in total.

Be free!

Reply 1 of 6, by Garrett W

User metadata
Rank Oldbie

UTBench is a CPU-intensive deathmatch demo for UT, one of the more CPU-intensive period-correct games for a K6-2/K6-III.

Try some other benchmarks to see whether or not your results line up with other people's systems on here. For all intents and purposes, the Voodoo 3 is what you want on that system, and it will usually perform better than the other cards provided you don't go above 1024x768 (or 800x600 for later 1999/2000 games). GeForce cards have drivers with higher CPU overhead, even the earlier ones, but they can actually outperform 3dfx cards even on such a slow CPU when their HW T&L is used, which removes a lot of work from said CPU. Realistically, though, if you're running a game that requires or greatly benefits from T&L, your K6-III is probably not up to the task anyway; expect low framerates either way.

Reply 2 of 6, by Repo Man11

User metadata
Rank Oldbie

With everything optimized you should be scoring between 4,000 and 4,500 points in 3DMark 2000 with that system.

"I'd rather be rich than stupid" - Jack Handey

Reply 3 of 6, by Disruptor

User metadata
Rank Oldbie
kahuna wrote on 2023-01-05, 21:04:

Have you reviewed your BIOS settings?
DRAM speed, and other timing relevant settings?

Reply 4 of 6, by kahuna

User metadata
Rank Newbie

Thanks for the quick replies, much appreciated!

Repo Man, could you please let me know which mobo drivers, DirectX version, Nvidia drivers, etc. you are using? Happy to try and try again 😉

In regards to the BIOS settings, I'll post later what I have configured.

Be free!

Reply 5 of 6, by Repo Man11

User metadata
Rank Oldbie

Because of the CPU limitation, the GeForce 3 will score little (if any) better than the GF2. You want to use the earliest Nvidia driver for the card, 7.76. Drop the memory to 256 MB maximum. You're best off with DirectX 7. The MVP3 does allow for memory bank interleaving, which helps improve memory performance, but there's no option for it in the CMOS settings - the George Breese patch works fine, though. Re: K6-III+ Build having trouble determining baseline performance

"I'd rather be rich than stupid" - Jack Handey

Reply 6 of 6, by Disruptor

User metadata
Rank Oldbie

Uh, yes, some of the memory is not covered by the onboard cache (L3 in the case of a K6-2+ or K6-3).
I'm not sure how much this influences your performance, but you may give it a try.
In my K6-2+ system I use just 128 MB of RAM. It is an MVP3 too, with 512 KB of cache.
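To illustrate the point above, here is a rough back-of-the-envelope sketch. It assumes the commonly cited figure of a 128 MB cacheable area for an MVP3 board with 512 KB of onboard cache; the real limit depends on your board's tag RAM, so treat the number as a placeholder and check your manual:

```python
def uncached_ram_mb(installed_mb, cacheable_area_mb=128):
    """Split installed RAM into the portion covered by the board cache
    and the portion above the cacheable area (served uncached).
    The 128 MB default is an assumed typical MVP3/512KB figure."""
    cached = min(installed_mb, cacheable_area_mb)
    return cached, installed_mb - cached

# With the OP's 384 MB, only the first 128 MB would be cached
# under this assumption, leaving 256 MB uncached.
print(uncached_ram_mb(384))   # (128, 256)
# Dropping to 128 MB keeps everything inside the cacheable area.
print(uncached_ram_mb(128))   # (128, 0)
```

This is why dropping from 384 MB to 128-256 MB can help: Windows 98 tends to allocate from the top of RAM first, which on such a board means the uncached region gets used the most.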