VOGONS


Geforce 2 MX vs MX200 vs MX400

First post, by Totempole

Hi everyone,

Until recently, I had incorrectly assumed that Nvidia's Detonator drivers only supported TNT cards. Since real TNT2s are quite scarce, I eventually settled on the TNT2 M64 for my Slot 1 P3 450 MHz machine. Recently, upon further inspection, I realised the drivers I was using also supported all GeForce cards up to the GeForce 2 GTS. I remembered buying a GeForce 2 MX card some time back, so I gave it a go. I did have to reinstall the drivers, but the same driver pack worked fine. The best part was that 3D performance was 2.5x faster than with the TNT2 M64.

Something that boggles my mind about this card (GeForce 2 MX) is the fact that it has no heatsink, but it's faster than a GeForce 2 MX-200. It doesn't seem to get hot even without a heatsink, whereas with the TNT2 M64 you could probably fry an egg on the heatsink.

It's quite confusing, because one would have thought the MX-200 was a step up from the MX, and most MX-200s have heatsinks as well. I noticed in benchmarks that the MX is less than 10% slower than the MX-400, which is quite surprising considering that the MX doesn't even have passive cooling.

So my question is, am I missing something?
Is there a reason why the MX doesn't need a heatsink but the MX-200 does?

My Retro Gaming PC:
Pentium III 450MHz Katmai Slot 1
Transcend 256MB PC133
Gigabyte GA-6BXC
MSI Geforce 2 MX400 AGP
Ensoniq ES1371 PCI
Sound Blaster AWE64 ISA

Reply 1 of 41, by obobskivich

According to Wikipedia, the MX is the same as the MX 200 but with double the memory bandwidth (128-bit bus). Back in the day I know my Asus MX 200 did not have a heatsink (and it ran like that for YEARS), and I have an MX 400 that came with no heatsink (and ostensibly that's how it always was), but I added one just because I had one lying around (it made no difference either way, though). Based on that, I assume the GF2 MX doesn't require a heatsink. Doubt you'd hurt anything by adding one though (I just took one off of the southbridge of a dead motherboard; it had the right clips/spacing to fit the mounting holes on the card I have).

I'd also agree with the M64s being warm - I have both a PCI and AGP version, and both of them run fairly hot to the touch compared to both older and newer cards IME.

Reply 2 of 41, by Totempole

When I bought the card, I assumed the MX was the absolute entry level and that the MX-200 was a step up. I'm glad I was wrong.

Do you think I would derive any significant performance benefit with a Geforce 2 GTS over a Geforce 2 MX in a Pentium 3 450?

I have a heatsink I could add to the GeForce 2 MX, but I only have non-curing thermal paste at the moment. At any rate, I don't plan on overclocking, and the card must be like 13 years old already, so if heat were an issue, I'm pretty sure the damage would have been done by now.

My Retro Gaming PC:
Pentium III 450MHz Katmai Slot 1
Transcend 256MB PC133
Gigabyte GA-6BXC
MSI Geforce 2 MX400 AGP
Ensoniq ES1371 PCI
Sound Blaster AWE64 ISA

Reply 3 of 41, by obobskivich

Totempole wrote:

When I bought the card, I assumed the MX was the absolute entry level and that the MX-200 was a step up. I'm glad I was wrong.

I don't ever remember seeing the original MX; only the MX 200 and 400. According to Wikipedia, the 200 and 400 are meant to sit on "either side" of it in terms of performance. From the numbers, I doubt there's a very dramatic (in the grand scheme of things) difference between the three of them; they should all support the same hardware/software features, so beyond that it's just a matter of clocks and memory configuration.

Do you think I would derive any significant performance benefit with a Geforce 2 GTS over a Geforce 2 MX in a Pentium 3 450?

Do you need such a performance increase? The P3 450 is pretty meagre for any of these cards - my MX 200 was original equipment with a P4 in 2001, just to give some frame of reference. A GTS or Ultra would certainly be faster, but would it matter for what you're doing? And for games that "need" that level of performance, is that P3 up to the task? The other thing to consider is that the GTS and Ultra cards will have fans.

I'd personally probably skip over the GF2 GTS/Ultra and go for a GF4 MX (they have LMA among other things, which is an advantage) or a GF3/4 Ti (if you also need pixel shaders) if getting a new card. None of them (excepting the 4 Ti) should cost more than $10-$20 though, so it isn't like there's any big price "gotcha" in picking one or the other.

I have a heatsink I could add to the GeForce 2 MX, but I only have non-curing thermal paste at the moment. At any rate, I don't plan on overclocking, and the card must be like 13 years old already, so if heat were an issue, I'm pretty sure the damage would have been done by now.

Nothing wrong with leaving well enough alone. I'd probably keep an eye on it from time to time, make sure there's some airflow, and leave well enough be. 😀

Reply 4 of 41, by idspispopd

I was also confused about a GF2 MX without a heatsink (not sure which variant that was exactly). Since I only got that one to have DVI output in a computer, and the CPU wasn't really able to saturate the card when gaming, I wasn't too worried.

Regarding the comparison with a TNT2 M64: the TNT2 M64 is basically the same chip as the TNT2, only the memory interface is narrower. The GF2 MX is a halved GF2, and it uses a newer process (180 nm) than the TNT2 M64 (250 nm), so I can easily believe that it uses less power.

Regarding upgrading to a faster GPU: I suppose that would not really be worth it, though that depends on the resolution. You could test how the games you want to play behave as you increase the resolution, up to whatever resolution you might want to play at. The theoretical maximum speed of the GF2 is more than twice that of the GF2 MX, so the main advantage would be being able to run much higher resolutions at full speed.
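
As a rough back-of-the-envelope check (a sketch only, assuming the commonly quoted stock clocks and pipeline counts for these chips), the "more than twice" figure falls out of simple fill-rate arithmetic:

# Hypothetical helper: theoretical pixel fill rate = core clock (MHz) x pixel pipelines.
def fill_rate_mpixels(core_mhz, pixel_pipes):
    return core_mhz * pixel_pipes

cards = {
    "GeForce 2 MX (175 MHz, 2 pipes)":  fill_rate_mpixels(175, 2),  # ~350 Mpixels/s
    "GeForce 2 GTS (200 MHz, 4 pipes)": fill_rate_mpixels(200, 4),  # ~800 Mpixels/s
}
for name, mpixels in cards.items():
    print(f"{name}: ~{mpixels} Mpixels/s")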

Reply 5 of 41, by shamino

The GeForce 2 MX cards were mainly limited by their RAM performance. People found it was easy to overclock the GPU, but there was some debate over whether it was useful. High-performance RAM configurations are expensive, so that's what gets crippled the most on cheaper tiers of video cards, and it's generally what holds them back.
Hercules sold a 2 MX card with slightly faster SDRAM than standard (183 MHz instead of 166 MHz). Creative sold theirs with 64-bit DDR, and reviews showed it was a bit slower than standard (64-bit DDR is slower than 128-bit SDR at the same clock speed, due to latency).
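
To put rough numbers on the memory side (a sketch only; these are theoretical peak figures at the clocks mentioned above, and real-world throughput, especially for the DDR card, was lower):

# Theoretical memory bandwidth in GB/s: (bus width in bits / 8) * clock (MHz) * transfers per clock.
def bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock=1):
    return bus_bits / 8 * clock_mhz * transfers_per_clock / 1000

print(bandwidth_gb_s(128, 166))     # standard GF2 MX, 128-bit SDR -> ~2.7 GB/s
print(bandwidth_gb_s(128, 183))     # Hercules card, 183 MHz SDR   -> ~2.9 GB/s
print(bandwidth_gb_s(64, 166, 2))   # Creative 64-bit DDR          -> ~2.7 GB/s on paper, slower in practice
print(bandwidth_gb_s(64, 166))      # GF2 MX 200, 64-bit SDR       -> ~1.3 GB/s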

The retail cards typically had a GPU heatsink, some even had a fan. The fan was overkill. Whether the heatsink was needed or not, people would rather buy a card with a heatsink so that was reason enough for the retail cards to include one.
Eventually the GF2MX cards became common in Dells/HPs/etc, and those non-retail cards that I've seen don't have heatsinks. Maybe this shows the heatsink wasn't needed, but to be fair, I suppose these would also tend to be later cards which might have had a die shrink.

On a P3-450, the GTS (or anything with more memory bandwidth) should be faster at high resolutions (1600x1200 for sure), but as you drop to lower resolutions it will be increasingly CPU bound, so there might not be as much difference. When I was researching a purchase for my K6-3 450 a long time ago, it appeared that the 2 MX was about as much card as that CPU could utilize at 1024x768. The P3 is faster in 3D, but the overall picture is probably not much different. I've never actually done a scaling experiment on it though.

Reply 6 of 41, by swaaye

Yeah, there's no reason I can think of to go with NV10-15, other than nostalgia.

GeForce 4 / 4MX have advantages like much better anisotropic filtering, anti-aliasing, a texture compression fix (DXT1 dithering), typically better VGA signal quality, and vastly improved efficiency from second generation hidden surface removal tech and better memory controllers (GF3 was the first gen of this). You lose nothing from a compatibility standpoint AFAIK. Mau1wurf1977 likes GF4Ti best for his Splinter Cell addiction too!

Reply 7 of 41, by obobskivich

swaaye wrote:

You lose nothing from a compatibility standpoint AFAIK.

Actually now that you mention it, I remember my 2 MX 200 refusing to work with ORB, but the 4 MX 440 that I eventually replaced it with had no problems. Go figure. 😕

Reply 8 of 41, by sliderider

Everything you ever wanted to know about nVidia graphics cards

http://en.wikipedia.org/wiki/List_of_Nvidia_g … rocessing_units

According to the chart, the GeForce 2 MX is older than the MX 200 and has more memory bandwidth because it utilizes a 128-bit memory bus while the MX 200 uses a 64-bit bus. It looks like the MX 200 was introduced as a reduced-cost alternative to the original MX. Other than memory bandwidth, all other specs are the same.

Reply 9 of 41, by AlphaWing

You can't use the older Detonator drivers on a GeForce 4 MX series card.
You can with the GeForce 2 MXs, and that can mean the difference between a game working or not in 9x.
Silver comes to mind, an FF7 clone. It doesn't like later Forceware drivers at all; it loves to display a grid pattern everywhere with them.

Reply 10 of 41, by swaaye

AlphaWing wrote:

You can't use the older Detonator drivers on a GeForce 4 MX series card.
You can with the GeForce 2 MXs, and that can mean the difference between a game working or not in 9x.
Silver comes to mind, an FF7 clone. It doesn't like later Forceware drivers at all; it loves to display a grid pattern everywhere with them.

This is more of a reason why one should have a Voodoo 3/4/5 card around. The problem is that even D3D games of that era were often mainly written for Voodoo cards.

Reply 11 of 41, by meljor

I agree. In my K6-3+ system I use a GeForce 2 Ti along with Voodoo 2 SLI.

In a lot of games the GeForce doesn't do as well in image quality as the Voodoos (blocky smoke etc.). Voodoo cards are much better supported in older 3D games IMHO.

The GeForce is nice for some early T&L games and is awesome for using AA in older Direct3D games. This is the reason why I chose it over an MX version. AA is very taxing on these cards, and the GeForce 2 (Pro/GTS/Ti/Ultra) models are much faster for it.

So yes, it might be overkill for a P3 450 MHz system, but if you want to use AA and higher resolutions it is very usable in such a system.

Same thing with the Voodoo 5 5500: it scales to the moon at lower resolutions with fast CPUs, but if you use it for 1024x768 and 4x AA it becomes the bottleneck very quickly, and you don't need anything faster than a P3 600 MHz or so.

asus tx97-e, 233mmx, voodoo1, s3 virge ,sb16
asus p5a, k6-3+ @ 550mhz, voodoo2 12mb sli, gf2 gts, awe32
asus p3b-f, p3-700, voodoo3 3500TV agp, awe64
asus tusl2-c, p3-S 1,4ghz, voodoo5 5500, live!
asus a7n8x DL, barton cpu, 6800ultra, Voodoo3 pci, audigy1

Reply 12 of 41, by redigger

Hi all,
I had the GF2 MX200 32 MB back then, from Dec 2001 till Aug 2003. I remember constantly freaking out at the lackluster RtCW performance in 32-bit colour at 1024x768 (my rig was a PIII-1000 with 192 MB RAM). I played it from March 2002, and hey, that was a Q3-engine game from the fall of 2001! So you could only play it in 16-bit colour if you wanted a decent framerate. Later on, it was nothing but a grievous experience: you looked at almost every new gaming title with regret, knowing just HOW HORRIBLY it would run on your rig. Obviously, that card won't run any major games from the second half of 2001 onwards at max settings. Exceptions are scarce, and they weren't regarded as much eye candy at the time.
And there's more to that, because Hitman: Codename 47 from late 2000 has an AA option. And there are also Sacrifice and Giants: Citizen Kabuto from the same year, notorious for pushing contemporary hardware to its limits. Considering that the GeForce 2 Ultra was state-of-the-art at the time, you now perfectly get my point about the MX series...
And its image quality was no better than what I had with the TNT2 M64 back in 2000, though you did get a whole 2x AF option in the 2002 Detonator drivers.
My suggestion is to get a GeForce 4 Ti 4600 if you want a great gameplay experience at full speed up to the end of 2003, or a Radeon 9700 Pro for superior image quality up to the same date (I mean with all the settings maxed out at resolutions higher than 1024x768).
Also, Max Payne 2 only supports PS 1.4 for its Pixel Shader Skins option, so you won't ever see that feature on any GF4. Just FYI.

P.S. I cannot get past this topic because I can't help but share my horrible memories of nVidia's MX series of cards... And the GF2 MX400 64 MB wasn't any better (personal experience, though not on my rig).

Last edited by redigger on 2021-02-13, 21:26. Edited 2 times in total.

Core 2 Duo E6550 2.33GHz\PowerColor Radeon X1950 Pro 256 Mb\ 3 Gb RAM\SBLive! 5.1
Core i5 8400 2.8 GHz\RX550 2Gb\8 Gb RAM
Core i5 2540M 2.6 GHz laptop\6 Gb RAM

Reply 13 of 41, by frudi

The MX was always a value card, even when it first launched, and the slightly higher-clocked MX 400 model didn't change that in any way. Lots of people used to praise it back then, and it wasn't bad value compared to the pricey GTS/Pro/Ti/Ultra models, especially if you were willing to stick to 16-bit colors. But it was never able to handle contemporary games at high resolutions (high resolution back then meaning 1024x768 and up), and especially not in 32-bit colors. Even at 800x600x32 it struggled in some more demanding titles. Back then new cards were launching like every half a year, so for all the people upgrading from TNT/TNT2, Voodoo 2/3 or G200/G400 type cards from just a year or two earlier, the MX seemed like a massive upgrade, hence all the praise. But it was still nowhere close to the performance of a GTS, let alone the faster GeForce 2 models.

These days, for retro builds, I would consider the MX line more of a slightly faster TNT2 alternative with T&L support, not anything even remotely comparable to the fully fledged GeForce 2 line (GTS and up). It's suitable for earlier 3D games from the Windows 95 to Windows 98 era; don't expect to play 1999 or newer games above 800x600 on it with good framerates.

Reply 14 of 41, by redigger

frudi wrote on 2021-02-13, 21:22:

Back then new cards were launching like every half a year, so for all the people upgrading from TNT/TNT2, Voodoo 2/3 or G200/G400 type cards from just a year or two earlier, the MX seemed like a massive upgrade, hence all the praise.

You're right. It was just like that: you couldn't even run MS Train Simulator in 32-bit colour without getting all kinds of slideshow on the TNT2 M64, and then you got it running smoothly on your new GF2 MX...

Core 2 Duo E6550 2.33GHz\PowerColor Radeon X1950 Pro 256 Mb\ 3 Gb RAM\SBLive! 5.1
Core i5 8400 2.8 GHz\RX550 2Gb\8 Gb RAM
Core i5 2540M 2.6 GHz laptop\6 Gb RAM

Reply 15 of 41, by The Serpent Rider

It's suitable for earlier 3D games from the Windows 95 to Windows 98 era; don't expect to play 1999 or newer games above 800x600 on it with good framerates.

An overclocked 128-bit GeForce 2 MX is very close to a GeForce 2 GTS though. And arguably, you can push 1600x1200 in 16-bit color.

[Attached benchmark charts: q3-16.gif and q3-32.gif - Quake 3 at 16-bit and 32-bit color]

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 16 of 41, by redigger

The Serpent Rider wrote on 2021-02-14, 03:02:

An overclocked 128-bit GeForce 2 MX is very close to a GeForce 2 GTS though. And arguably, you can push 1600x1200 in 16-bit color.

Well, even in 32-bit colour actually) I used to play Police Tactical Training (2000) at 1600x1200 in 32-bit on the GF2 MX200 back then without noticeable lag. But it has pretty simple visuals...
Actually, 1999 is the comfort zone for this card. I used to play Q3 at 1024x768x32 all maxed out.
And at the same time it lags at 800x600x32 in Hitman 2: SA and is almost unplayable in NOLF2 at 1024x768x32 as well.

Core 2 Duo E6550 2.33GHz\PowerColor Radeon X1950 Pro 256 Mb\ 3 Gb RAM\SBLive! 5.1
Core i5 8400 2.8 GHz\RX550 2Gb\8 Gb RAM
Core i5 2540M 2.6 GHz laptop\6 Gb RAM

Reply 17 of 41, by frudi

The Serpent Rider wrote on 2021-02-14, 03:02:

Overclocked GeForce 2 MX 128-bit is very close to GeForce 2 GTS though. And arguably, you can push 1600x1200 in 16-bit color.

And an overclocked GeForce 2 GTS will again eat it for breakfast. It doesn't make much sense to compare one card overclocked to the extreme against another running at stock.

Besides, the MX is mainly memory-constrained (as is the GTS, for that matter), which the MX400 barely addresses. Most models come with 6 or 5.5 ns memory, some higher-end models with 5 ns, which will all mostly max out around 200-220 MHz, nowhere near the 250+ MHz shown in those charts. Those scores are purely down to the Gainward Golden Sample using 4 ns memory. So there's some extreme cherry-picking going on in those graphs; the Gainward Golden Sample scores are not representative of most MX models at all. And besides, Quake 3 looks ugly in 16-bit colors, the dithering is obnoxiously obvious. You really want to run it in 32-bit mode. Game engines were transitioning from 16- to 32-bit rendering around that time, so the same goes for many games from then onward, especially games that are dark or have a more limited color palette (Deus Ex and other Unreal engine games say hi).
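
For reference, those access-time ratings translate to rated clocks via roughly 1000 divided by the access time in ns, which is why the 4 ns parts are the ones that can legitimately reach 250 MHz. A quick sketch of that rule of thumb:

# Rough rule of thumb: rated SDRAM clock (MHz) ~= 1000 / access time (ns).
for ns in (6.0, 5.5, 5.0, 4.0):
    print(f"{ns} ns memory -> rated for ~{1000 / ns:.0f} MHz")
# 6 ns ~166 MHz, 5.5 ns ~182 MHz, 5 ns = 200 MHz, 4 ns = 250 MHz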

redigger wrote on 2021-02-14, 09:48:

Actually, 1999 is the comfort zone for this card. I used to play Q3 at 1024x768x32 all maxed out.
And at the same time it lags at 800x600x32 in Hitman 2: SA and is almost unplayable in NOLF2 at 1024x768x32 as well.

Depends what one considers unplayable. I used to run an overclocked MX back then (210 MHz on core and memory, about what most MX and MX400 cards would max out at), paired with a Celeron 633 overclocked to 980 MHz. I didn't consider Quake 3 comfortable at 1024x768x32 maxed out, since the fps would regularly drop into the 30s with a lot of on-screen action; something that running a timedemo and getting an average score of 60 fps won't really show you. By the time NOLF2 came out I had upgraded to a Radeon 8500 and a Duron 1400, and even that configuration struggled with NOLF2 at 1024x768x32. So I can't imagine how the MX would be in any way playable at the same settings, unless it was perhaps a GeForce 4 MX.

Reply 18 of 41, by Totempole

The GeForce 2 MX is still a great value card for someone who is looking for a proper Riva TNT2 (not the M64 version) but isn't willing to spend crazy amounts of money to get one.

It's quite a bit faster than a TNT2 and offers comparable compatibility, with a few exceptions (mostly related to DOS games and early versions of Windows).

My Retro Gaming PC:
Pentium III 450MHz Katmai Slot 1
Transcend 256MB PC133
Gigabyte GA-6BXC
MSI Geforce 2 MX400 AGP
Ensoniq ES1371 PCI
Sound Blaster AWE64 ISA

Reply 19 of 41, by redigger

frudi wrote on 2021-02-14, 11:38:

So I can't imagine how the MX would be in any way playable at the same settings, unless it was perhaps a GeForce 4 MX.

Well, it was playable in some way) Actually, I had to cope with constant slowdowns and hiccups to see the best 3D visuals of the day (Nov 2002). That was the NOLF2 demo, and yes, the card was a GF2 MX200. Interestingly enough, when I got my hands on the full NOLF2 it was Dec 2003, and this time the card was a 9000 Pro 128 MB with 8x AF constantly turned on, at 1600x1200. And it was barely playable again, but nice enough when there was little to no action. So I get your point about the 8500 card) I used to be an image quality maniac)

Last edited by redigger on 2021-02-14, 14:46. Edited 2 times in total.

Core 2 Duo E6550 2.33GHz\PowerColor Radeon X1950 Pro 256 Mb\ 3 Gb RAM\SBLive! 5.1
Core i5 8400 2.8 GHz\RX550 2Gb\8 Gb RAM
Core i5 2540M 2.6 GHz laptop\6 Gb RAM