VOGONS


First post, by Mondodimotori

Rank: Member

As said, I ordered an MX440 from eBay; it was listed as a base model with 64MB of DDR VRAM and a 128-bit bus, from "Manli".
Today it arrived and, to my surprise, a sticker on the back of it said "MX440 SE". The back of the card wasn't shown in the listing and I naively trusted it, since it had screenshots of GPU-Z with the card identified as an MX440, and Manli did produce a base MX440 with a similar PCB and cooler design.
It still has 64MB of DDR with a 128-bit bus as advertised (which misled me, since the Manli base model also has that same configuration).

I got it for the Pentium III 1000 system to finally put the Ti4200 to rest, which is hanging on to life by a thread (and wouldn't be worth reballing again), and I wanted a low-powered GPU from the same period that would be a good match for a Pentium III 1000 and a VIA mobo (meaning only 2X AGP on these cards). Thus I decided to get an MX440 with a 128-bit bus, since they are much, much cheaper than any GF2 or GF3 option out there.

I still decided to test it and, while the fan was a bit too noisy at first, it then settled down, ready for some benchmarks.
I ran two rounds of 3DMark 2000, one with the Ti4200 and one with the MX440 SE. 6400 points with the first, 6000 with the second.

That's not a huge difference, so...

Considering I paid less than €20 for it with shipping, I feel it would be more of a hassle to return it for "wrong description" than to keep it, since it does perform decently in that system. I don't even know if it would be possible to overclock it to MX440 levels of performance, considering it has active cooling.

What does your collective wisdom say? Is it still worth keeping, maybe overclocked a little to make full use of the Pentium III? Or is it just scrap metal, and should I get a full refund of €17 even if it's a hassle?

Reply 1 of 15, by havli

Rank: Oldbie

I would keep it.
The 128-bit memory bus means it is not cut down too much. Slightly lower frequency than the regular non-SE MX440... but still a very good GPU for a Pentium III. You can try to overclock it to see if performance increases.

MX440 SE reference clock is 250 MHz GPU / 333 MHz DDR.
MX 440 should be 275/400
MX 460 should be 300/550

So not much difference. You can consider yourself lucky for getting the 128-bit version; that's important for good performance. Most of the 440 (SE) cards are 64-bit junk 😁
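
To put rough numbers on why the bus width matters more than the SE clocks: peak memory bandwidth is simply the bus width in bytes times the effective DDR clock. A minimal sketch in Python, assuming the reference clocks listed above (actual board clocks can differ):

# Peak memory bandwidth estimate: bus width (bytes) * effective DDR clock (MHz) = MB/s.
def bandwidth_mb_s(bus_bits, effective_mhz):
    return bus_bits // 8 * effective_mhz

cards = {
    "MX440 SE, 128-bit": (128, 333),
    "MX440 SE, 64-bit":  (64, 333),
    "MX440, 128-bit":    (128, 400),
    "MX460, 128-bit":    (128, 550),
}

for name, (bits, mhz) in cards.items():
    print(f"{name}: ~{bandwidth_mb_s(bits, mhz) / 1000:.1f} GB/s")

# Roughly 5.3, 2.7, 6.4 and 8.8 GB/s: a 64-bit SE loses half its bandwidth,
# while the SE vs non-SE memory clock gap is only about 20%.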

HW museum.cz - my collection of PC hardware

Reply 2 of 15, by Mondodimotori

Rank: Member
havli wrote on 2025-09-29, 19:17:

I would keep it.
The 128-bit memory bus means it is not cut down too much. Slightly lower frequency than the regular non-SE MX440... but still a very good GPU for a Pentium III. You can try to overclock it to see if performance increases.

MX440 SE reference clock is 250 MHz GPU / 333 MHz DDR.
MX 440 should be 275/400
MX 460 should be 300/550

So not much difference. You can consider yourself lucky for getting the 128-bit version; that's important for good performance. Most of the 440 (SE) cards are 64-bit junk 😁

Exactly, I took it mainly for the confirmed 128-bit bus; that wasn't in doubt. I got burned before on a 64-bit card, but that was PCI only, and there wasn't much else to be found at decent prices.
It didn't lose much performance compared to the Ti4200 in 3DMark, even though the Ti4200 should be much faster. I think the PIII is the limiting factor. Either that or the RAM.

I'll maybe try to overclock it a little, since it is indeed 250 MHz core / 333 MHz effective memory (166 MHz DDR).

An MX460 would've been sweeter, but those go for even more than a Ti4200, let alone an MX440.

Reply 3 of 15, by devius

Rank: Oldbie

Even if the 10% clock speed difference translated directly into a performance difference, it would still be minimal and you wouldn't be able to tell anyway, and these things aren't usually that linear.

I'm using a PCI 440MX-8X (yes 8X, and I don't know why they did that) in a similar Pentium III 1GHz system and it's plenty fast for the games that run well on that CPU, so the AGP version should pose no problems.

Reply 4 of 15, by Mondodimotori

Rank: Member
devius wrote on 2025-09-29, 21:08:

Even if the 10% clock speed difference translated directly into a performance difference, it would still be minimal and you wouldn't be able to tell anyway, and these things aren't usually that linear.

I'm using a PCI 440MX-8X (yes 8X, and I don't know why they did that) in a similar Pentium III 1GHz system and it's plenty fast for the games that run well on that CPU, so the AGP version should pose no problems.

Oh yeah, I've read about those 8X 440MX cards (my Ti4200 is an 8X model) and how they're not always great on the memory or bus side. But yeah, on a PIII that will be used to play, at best, Max Payne 1, the difference between the two is negligible. Even more so when comparing against an SE model.
I also have a VIA mobo, so every nVidia card I throw at it will run at 2X AGP no matter what.

This evening I'll try to OC it with Coolbits and see if I can squeeze some more MHz out of it.

Reply 5 of 15, by Mondodimotori

Rank: Member

Well, I tried to OC it, but with no success.

At MX440 stock settings (275/400) it crashes 3DMark almost immediately.
Even midway settings still crash 3DMark.

I had to reinstall the driver to stop the crashes, since it would keep crashing even after restoring factory settings in Coolbits.

After reinstalling the drivers from scratch, it worked with no issues.

I think I'll keep it anyway, since a 128-bit bus that's both confirmed and cheap may be harder to find.

(This reminded me that I should've tried underclocking the Ti4200 to see if it helped with its artifacts.)

Reply 6 of 15, by The Serpent Rider

Rank: l33t++

An MX460 would've been sweeter, but those go for even more than a Ti4200, let alone an MX440.

MX440-8x were quite often equipped with better memory than MX 460.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 7 of 15, by Mondodimotori

Rank: Member
The Serpent Rider wrote on 2025-09-30, 20:08:

An MX460 would've been sweeter, but those go for even more than a Ti4200, let alone an MX440.

MX440-8x were quite often equipped with better memory than MX 460.

Oh, did they? I recall reading some old posts about the MX440, and it came up that the 8x versions were a gamble, meaning you could get some with decent memory and some with crappy memory.
Now, since these cards are already a minefield of different versions and configurations, I wouldn't want to risk another gamble on an 8x... I'll keep this one and use it in the PIII system. Maybe I'll move the Ti4200 to another K7 system I would like to build, one with a 12V CPU power connector, or even put up the money and get a higher-end FX card.
But nothing stops me from setting up an alert for MX440 8x and MX 460 cards...

Reply 8 of 15, by The Serpent Rider

Rank: l33t++

Any MX440-8x with BGA memory is usually a good start.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 9 of 15, by Mondodimotori

Rank: Member
The Serpent Rider wrote on 2025-10-01, 02:01:

Any MX440-8x with BGA memory is usually a good start.

You mean how the memory is soldered on the board?
Are there any other ways to tell if a card is 64-bit or 128-bit from external inspection? Sellers don't always provide pictures from Everest or other hardware ID programs, and even pictures of the card itself may not be enough to identify the maker of the memory or the total number of memory chips on the board.

Reply 11 of 15, by Mondodimotori

Rank: Member
The Serpent Rider wrote on 2025-10-01, 13:57:

Yup, that's useful if you manage to get a picture from Everest or GPU-Z (which I had for the MX440 SE and still missed the lower clocks; I thought it must've been a mistake by GPU-Z).

The problem is you don't always get those pictures. Once I missed out on a GPU because, while the seller was setting up a system to check whether the card was 64-bit or 128-bit, someone else bought it and beat me to it.
The sea of configurations back in the day was astonishing; it's much easier with recent GPUs to get what you want: if I look for a 1070 Ti on eBay, I'm sure of what I'll get.
If I search for an older card? I'm pretty sure I saw 64-bit Ti4200s out there while searching for one.

Reply 12 of 15, by The Serpent Rider

Rank: l33t++

You don't need any GPU-Z screenshot. Just a picture of the card. If a seller can't even provide a picture, why even bother?

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 13 of 15, by Mondodimotori

Rank: Member
The Serpent Rider wrote on 2025-10-01, 19:57:

You don't need any GPU-Z screenshot. Just a picture of the card. If a seller can't even provide a picture, why even bother?

Can you tell for sure if a card is 64-bit or 128-bit just from pictures? How do you do that?

Reply 14 of 15, by tehsiggi

Rank: Member

TSOP memory with 66 pins is 16-bit. Multiply that by the number of chips and you get the memory bus width.
BGA memory has 144 pins and is 32-bit. Same calculation.

However, there are some exceptions: you can find Radeon 9600s with 8 BGA memory chips that are 128-bit, not 256-bit. This is because they use two ranks, effectively sharing each 32-bit channel between two alternating memory chips.

For most cards, however, the basic calculation remains 16/32 bits per chip * chip count.
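
If the listing photos are clear enough to count the memory chips, that rule of thumb reduces to a one-line calculation. A minimal sketch in Python, assuming the per-package widths given above; dual-rank boards like the Radeon 9600 example are the kind of exception handled here with an optional ranks parameter:

# Estimate memory bus width from a board photo: package type and chip count.
BITS_PER_CHIP = {"TSOP": 16, "BGA": 32}  # 66-pin TSOP = 16-bit, 144-pin BGA = 32-bit

def bus_width_bits(package, chip_count, ranks=1):
    # Dual-rank boards (e.g. some Radeon 9600s) pair two chips per 32-bit
    # channel, so divide by the number of ranks.
    return BITS_PER_CHIP[package] * chip_count // ranks

print(bus_width_bits("TSOP", 8))          # 128 -- e.g. a full-width MX440
print(bus_width_bits("TSOP", 4))          # 64  -- the cut-down boards to avoid
print(bus_width_bits("BGA", 4))           # 128
print(bus_width_bits("BGA", 8, ranks=2))  # 128 -- the Radeon 9600 exception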

AGP Card Real Power Consumption
AGP Power monitor - diagnostic hardware tool
Graphics card repair collection

Reply 15 of 15, by Mondodimotori

Rank: Member
tehsiggi wrote on 2025-10-02, 04:32:

TSOP memory with 66 pins is 16-bit. Multiply that by the number of chips and you get the memory bus width.
BGA memory has 144 pins and is 32-bit. Same calculation.

However, there are some exceptions: you can find Radeon 9600s with 8 BGA memory chips that are 128-bit, not 256-bit. This is because they use two ranks, effectively sharing each 32-bit channel between two alternating memory chips.

For most cards, however, the basic calculation remains 16/32 bits per chip * chip count.

Oh yeah, I recall looking up the specs of the chips when the pictures in the ad are good enough; then it's just a matter of doing the math.