VOGONS


First post, by paradigital

Rank: Oldbie

So, after the saga that was getting my Slot A build stable (when caps go bad), I started trying a few video cards in the machine to get a feel for which felt best/provided the best performance while not being too far out of period. To that end I wanted to try the following:

TNT2 Ultra (Diamond Viper V770U)
Geforce 256 DDR (Creative Labs Annihilator Pro)
Geforce 2 MX (Dell OEM, not M64, 200 or 400)
Geforce 2 GTS Pro (Dell OEM)
3Dfx Voodoo 3 3000

I started with the TNT2 Ultra, which turned in a 3DMark99 Max score of 6,843.

Before diving too deep into testing a suite of games, I wanted to compare 3DM99 scores first to whittle out the slowest, so I removed all video drivers, popped the MX in, installed its drivers, rebooted, and ran 3DM99.

The MX turned in 5,551. Seems a bit… low.

The CPU scores between the two cards are similar enough that I think I can rule out the influence of the graphics card on the mainboard/RAM/CPU speed, with the MX scoring 14,710 and the TNT2 14,795.

The MX then proceeds to spank the TNT2 in fill rate, texturing, rasterizing, and bump mapping; indeed, almost every synthetic benchmark was a clear win for the MX, anywhere from a 20% increase (e.g. single-pass bump mapping) to a 2,000% increase in single point lighting.
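For what it's worth, the synthetic wins do line up with the paper specs. A rough back-of-envelope comparison (using the commonly quoted reference clocks, so treat the figures as approximate; OEM boards may differ):

```python
# Back-of-envelope fillrate/bandwidth comparison.
# Clock and pipeline figures are the commonly quoted reference specs;
# actual boards (especially OEM ones) may differ.

def fillrate(core_mhz: int, pipelines: int, tmus_per_pipe: int):
    """Return (Mpixels/s, Mtexels/s) peak theoretical fillrate."""
    mpixels = core_mhz * pipelines
    mtexels = mpixels * tmus_per_pipe
    return mpixels, mtexels

# TNT2 Ultra: 150 MHz core, 2 pipes x 1 TMU
tnt2_px, tnt2_tx = fillrate(150, 2, 1)   # 300 Mpix/s, 300 Mtex/s
# GeForce 2 MX: 175 MHz core, 2 pipes x 2 TMUs
mx_px, mx_tx = fillrate(175, 2, 2)       # 350 Mpix/s, 700 Mtex/s

# Memory bandwidth on a 128-bit SDR bus: MHz * 16 bytes/clock
tnt2_bw = 183 * 16   # ~2.9 GB/s
mx_bw   = 166 * 16   # ~2.7 GB/s
```

Interestingly, despite the fillrate advantage, the MX actually has slightly less memory bandwidth than the Ultra.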

However, in the trilinear texturing test and the two “game” tests, the TNT2 pulls ahead. A 4% win for the TNT2 in trilinear texturing, a 15% lead in “game 1” and a 23% lead in “game 2” for the TNT2.

Something feels amiss here, the TNT2 shouldn’t be pulling off a defeat of the MX, right?

Reply 1 of 25, by Disruptor

Rank: Oldbie

No wonder.
As long as no T&L is used, the TNT2 Ultra should clearly beat the GeForce 2 MX.

I mean, you're comparing the king of the hill from a not-so-distant former generation with a low-end graphics card!

Reply 2 of 25, by paradigital

Rank: Oldbie

Oh don’t get me wrong, I wasn’t expecting the MX to be leagues ahead, but I wasn’t expecting it to be killed either.

I was never an MX owner back in the day, I went straight from a TNT2 Ultra to a GTS, but all the modern reviews I can find suggest that an MX or MX 400 is a good stand-in for a GeForce 256, which usually doesn't get beaten by a TNT2 of any form.

I still feel something is amiss; the synthetic figures suggest the MX should walk the game tests, but it doesn't.

Reply 3 of 25, by mwdmeyer

Rank: Oldbie

Other than having slightly less memory bandwidth, the GeForce 2 MX should be a lot quicker. At higher resolutions, e.g. 1024x768, I can see the TNT2 keeping up or doing better (in non-T&L games), but otherwise I'm not sure why.

It would be good to test some other games.

Last edited by mwdmeyer on 2023-05-29, 11:01. Edited 1 time in total.

Vogons Wiki - http://vogonswiki.com

Reply 4 of 25, by Dorunkāku

Rank: Newbie

Test 'Expendable' if you want the TNT2 Ultra to look good. Test Quake III Arena if you want it to look bad.
3DMark99 runs at 640x480 in 16-bit color by default, and that is not a resolution or color depth people gamed at back then.

Reply 5 of 25, by Joseph_Joestar

Rank: l33t

Could be something simple like the GeForce 2 MX drivers having V-Sync on by default, while the TNT2 Ultra drivers had it off.

But yeah, as others have said, try some games instead of just 3DMark. Specifically, Quake 3 at higher resolutions like 1024x768 and above, using 32-bit color depth, should show better performance on the GeForce 2 MX.
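Something like this in the Quake III console (or autoexec.cfg) should keep the comparison apples-to-apples between the cards; these are the standard id Tech 3 cvars:

```
r_mode 6            // 1024x768 (r_mode 8 for 1280x1024)
r_colorbits 32      // 32-bit color
r_texturebits 32
r_swapInterval 0    // force V-Sync off so driver defaults can't skew results
vid_restart         // apply the changes
```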

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 6 of 25, by paradigital

Rank: Oldbie
Dorunkāku wrote on 2023-05-29, 10:51:

Test 'Expendable' if you want the TNT2 Ultra to look good. Test Quake III Arena if you want it to look bad.
3DMark99 runs at 640x480 in 16-bit color by default, and that is not a resolution or color depth people gamed at back then.

My 3DMark99 defaults to 800x600x16; pretty sure that's not just my install.

16-bit was definitely standard fare at the time, but I tended to play at 1024x768 where I could.

Reply 9 of 25, by paradigital

Rank: Oldbie
Garrett W wrote on 2023-05-29, 13:28:

You haven't mentioned which drivers you used for each card. Also, are you sure the Dell OEM GF2 MX isn't using a narrower memory bus, say 64bit?

40.71 for both.

I don’t believe the Dell card is cut down, just plain jane in appearance.

Reply 10 of 25, by NostalgicAslinger

Rank: Member

Expendable has a "Hardware Texture Compression" setting in the graphics options, so a GeForce2 MX should run and look better than a TNT2 Ultra in that game, as should all Quake 3 engine based games. Detonator 6.31 is a good driver for the GeForce 2 series, with less CPU overhead too.

The 1999 GeForce 256 SDR (NV10) was the first NVIDIA chip with hardware texture compression support.

Reply 11 of 25, by The Serpent Rider

Rank: l33t++

However, in the trilinear texturing test and the two “game” tests, the TNT2 pulls ahead. A 4% win for the TNT2 in trilinear texturing, a 15% lead in “game 1” and a 23% lead in “game 2” for the TNT2.

The TNT2 can't do trilinear filtering. Well, technically it can, but it would halve either fillrate or multitexturing, so NVIDIA disabled the feature, just like 3dfx did. The TNT family does a trilinear approximation instead, also called mipmap dithering. It's faster, but uglier. The GeForce 2 MX can do proper trilinear filtering, and takes a considerable penalty for it.

Here's how it looks on Voodoo 5 (courtesy of ixbt.com):

[Attachment: v5_q3_tlf.jpg, 97.81 KiB, public domain]

TNT has similar dithering pattern.
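The difference between the two approaches can be sketched like this (a toy 1-D illustration of the idea, not how the hardware actually implements it; the dither matrix here is hypothetical):

```python
# Toy 1-D illustration of proper trilinear filtering vs. TNT-style
# "mipmap dithering". Not how the hardware does it, just the idea.

def trilinear(sample_a: float, sample_b: float, frac: float) -> float:
    """Blend the (already bilinearly filtered) samples from two adjacent
    mip levels: twice the texel fetches, but a smooth transition."""
    return (1.0 - frac) * sample_a + frac * sample_b

# Hypothetical 2x2 screen-space threshold matrix
DITHER_2X2 = [[0.25, 0.75],
              [0.75, 0.25]]

def mip_dithered(sample_a: float, sample_b: float, frac: float,
                 x: int, y: int) -> float:
    """Pick a single mip level per pixel: half the fetches, but the
    transition band shows a visible dither pattern instead of a blend."""
    return sample_b if frac >= DITHER_2X2[y % 2][x % 2] else sample_a
```

Averaged over neighbouring pixels the dithered result approximates the true blend, which is why it passes at a glance but looks noisy up close, exactly as in the screenshot.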

Last edited by The Serpent Rider on 2023-05-29, 17:29. Edited 1 time in total.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 12 of 25, by Gmlb256

Rank: l33t

Note that on Quake III, there will be color banding issues (particularly the sky) when texture compression is enabled on early GeForce cards. Forcing DXT3 format with RivaTuner can mitigate it with some performance hit.
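For context on the trade-off (a rough sketch; the low-precision internal DXT1 decode on early GeForce chips is the commonly cited cause of the banding):

```python
# Sketch of why forcing DXT3 helps: both formats encode colors per 4x4
# block, but early GeForce chips reportedly interpolated DXT1 colors at
# 16-bit internal precision (hence the banded skies), while DXT3 took a
# higher-precision path. The trade-off is block size.

BYTES_PER_BLOCK = {"DXT1": 8, "DXT3": 16, "DXT5": 16}

def compressed_size(width: int, height: int, fmt: str) -> int:
    """Size in bytes of an S3TC-compressed texture (dimensions assumed
    to be multiples of 4, one block per 4x4 texel tile)."""
    blocks = (width // 4) * (height // 4)
    return blocks * BYTES_PER_BLOCK[fmt]

uncompressed = 256 * 256 * 4                 # 262144 bytes (32-bit RGBA)
print(compressed_size(256, 256, "DXT1"))     # 32768 -> 8:1 ratio
print(compressed_size(256, 256, "DXT3"))     # 65536 -> 4:1 ratio
```

So forcing DXT3 doubles the compressed texture footprint relative to DXT1, which is where the performance hit comes from.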

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce3 Ti 200 64 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 13 of 25, by SPBHM

Rank: Oldbie
paradigital wrote on 2023-05-29, 16:09:
Garrett W wrote on 2023-05-29, 13:28:

You haven't mentioned which drivers you used for each card. Also, are you sure the Dell OEM GF2 MX isn't using a narrower memory bus, say 64bit?

40.71 for both.

I don’t believe the Dell card is cut down, just plain jane in appearance.

does it have 8 memory chips or just 4?

Reply 15 of 25, by The Serpent Rider

Rank: l33t++
SPBHM wrote on 2023-05-29, 19:29:

does it have 8 memory chips or just 4?

Four SDRAM chips can also make a 128-bit bus.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 16 of 25, by imi

Rank: l33t

I upgraded from a TNT2 Ultra to a GF2 MX 200 MX 400 back in the day and it definitely was a considerable upgrade... I remember how happy I was that a cheap card like that gave me a boost.

Last edited by imi on 2023-05-30, 10:41. Edited 1 time in total.

Reply 17 of 25, by mwdmeyer

Rank: Oldbie

Oh no, the MX 200 would not be an upgrade at all; it has a lot less memory bandwidth. The only case where it would be is with a very slow CPU at low res.

Vogons Wiki - http://vogonswiki.com

Reply 18 of 25, by leileilol

Rank: l33t++
Gmlb256 wrote on 2023-05-29, 17:28:

Note that on Quake III, there will be color banding issues (particularly the sky) when texture compression is enabled on early GeForce cards.

IIRC that was fixed around Detonator 12.41. Quake 3 tells OpenGL to load RGB textures in an S3TC format, and NVIDIA's on-demand compression wasn't great. Note that Q3 lacks compressed textures to begin with, so enabling texture compression is a quality loss anyway (it's on by default with Savage cards in mind, in the hopes that no other vendor does it).

This, of course, has fed a few "3dFx Rulez!!!! VooDoo III > GForce 2!!! FUCK 32BIT SCAM" rants on the internet, where not supporting a feature somehow equals better hardware.

long live PCem

Reply 19 of 25, by imi

Rank: l33t
mwdmeyer wrote on 2023-05-30, 03:23:

Oh no the MX 200 would not be an upgraded at all, a lot less memory bandwidth, only case it would be if you had a very slow CPU and low res.

Maybe I'm misremembering things, but I thought it was an MX200. It was an ASUS V7100, and apparently that doesn't say much as there have been different versions.

...ok I looked into it more, apparently it was an "Asus V7100PRO/64M", so MX 400, sorry about that ^^