VOGONS


First post, by quicknick

Rank: Oldbie

A heatsink was needed for another project, so I pulled one from an old GF2MX400 card I had in the "non-important" card box. I noticed some shadowing under the markings, which piqued my interest, so I started rubbing the surface with a strong solvent. The re-marking was very well done; in the end I had to scrape with a very sharp razor to get to the relevant bit of the original markings. Turns out the MX400 is really an MX200 in disguise.

I can understand faking high-end chips, but this is a first for me. Anyone experienced such low-end faking?

Attachment: GF2MX_fake.jpg (832.08 KiB, public domain)

Reply 1 of 8, by debs3759

Rank: Oldbie

It could have been a factory re-mark, if Nvidia ran out of MX400 chips and used MX200 chips that passed testing.

See my graphics card database at www.gpuzoo.com
Constantly being worked on. Feel free to message me with any corrections or details of cards you would like me to research and add.

Reply 2 of 8, by bloodem

Rank: Oldbie

Also, the chip is not even that important. The limiting factor with MX200 cards is the 64-bit memory bus, so that card is virtually unusable at higher resolutions.
You can check this with Everest/Aida32/Aida64: a typical MX/MX400 should have about 2.6 GB/s of memory bandwidth, while an MX200 (and even some cards that were marketed as MX400) has only about 1.3 GB/s of effective memory bandwidth.
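
For anyone curious where those two figures come from, here is a quick back-of-the-envelope sketch of the arithmetic (assuming the GF2 MX line's stock 166 MHz memory clock; the numbers are illustrative, not measured from any particular card):

# Theoretical memory bandwidth = clock * bus width in bytes * (2 if DDR).
def bandwidth_gb_s(clock_mhz, bus_bits, ddr=False):
    bytes_per_clock = bus_bits / 8
    transfers_per_clock = 2 if ddr else 1
    return clock_mhz * 1e6 * bytes_per_clock * transfers_per_clock / 1e9

print(bandwidth_gb_s(166, 128))  # MX/MX400, 128-bit SDR -> ~2.66 GB/s
print(bandwidth_gb_s(166, 64))   # MX200, 64-bit SDR     -> ~1.33 GB/s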

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 3 of 8, by quicknick

Rank: Oldbie

Well, the card is 64-bit, says so on the back (was there a MX400-64bit?) so I guess there's no need to check with Everest or Aida.
I think I also remember one of my "FX5500" cards having a 5200 chip under the heatsink...

Attachment: GF2MX_fake(back).jpg (319.45 KiB, public domain)

Reply 4 of 8, by bloodem

Rank: Oldbie

Well, yeah, there were - like the card in front of you, which was also marketed as an MX400 😁

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 5 of 8, by Doornkaat

Rank: l33t

What is it clocked at?
I mean, it's the same chip and they admit to the 64-bit bus. Even if it runs at 200 MHz it's more of an overclocked MX200 than an MX400, but technically I guess it's OK?
If it clocks in at 175 MHz, though, it's an outright scam. 😅
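
For a rough idea of what that clock difference is worth, here is the fill-rate arithmetic (a sketch only; it assumes the GeForce2 MX's two pixel pipelines with two TMUs each):

PIPES = 2           # pixel pipelines on the GeForce2 MX core
TMUS_PER_PIPE = 2   # texture units per pipeline

for core_mhz, label in [(175, "MX200 clock"), (200, "MX400 clock")]:
    mpixels = core_mhz * PIPES                  # Mpixels/s
    mtexels = core_mhz * PIPES * TMUS_PER_PIPE  # Mtexels/s
    print(f"{label}: {mpixels} Mpixels/s, {mtexels} Mtexels/s")

# Roughly a 14% difference; with a 64-bit bus, memory bandwidth is usually
# the bottleneck before fill rate anyway.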

Reply 6 of 8, by Katmai500

Rank: Member

quicknick wrote on 2020-10-12, 12:46:

Well, the card is 64-bit, says so on the back (was there a MX400-64bit?) so I guess there's no need to check with Everest or Aida.
I think I also remember one of my "FX5500" cards having a 5200 chip under the heatsink...

The FX 5200 / 5500 swap is not surprising at all. The FX 5500 was just a rebrand of the FX 5200's NV34 GPU late in the FX series run (March 2004). Some cards shipped with a die-shrunk version of the original NV34 chip, called NV34B, but it wasn't functionally improved. The FX 5500 had a 20 MHz core clock bump over the regular FX 5200 (270 MHz vs 250 MHz) but the same 200 MHz memory clock, and could be found with both 64-bit and 128-bit memory interfaces. So plenty of companies probably shipped FX 5500 cards that were really FX 5200 GPUs overclocked to 270 MHz.

The FX 5200 Ultra from a year earlier (March 2003) is significantly faster than any FX 5500, with its 325 MHz GPU clock and 325 MHz memory clock on a 128-bit interface. The FX 5500 was mostly a way to fill store shelves with cheap 256 MB graphics cards: the FX 5200 had become known as a slow card by 2004, so they added 300 to the name and, boom, everyone thinks it's halfway between an FX 5200 and an FX 5700.

Here's a period article from around the FX 5500's release: https://techreport.com/news/6347/innovision-a … -graphics-card/
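
For scale, here is the same kind of back-of-the-envelope arithmetic on the clocks quoted above (DDR memory and the 128-bit bus variants assumed for all three cards; illustrative only):

def mem_bw_gb_s(clock_mhz, bus_bits=128, ddr=True):
    # Theoretical bandwidth = clock * bus width in bytes * 2 for DDR.
    return clock_mhz * 1e6 * (bus_bits / 8) * (2 if ddr else 1) / 1e9

cards = {
    "FX 5200":       {"core": 250, "mem": 200},
    "FX 5500":       {"core": 270, "mem": 200},
    "FX 5200 Ultra": {"core": 325, "mem": 325},
}
for name, c in cards.items():
    print(f"{name:14} core {c['core']} MHz, memory ~{mem_bw_gb_s(c['mem']):.1f} GB/s")

# The FX 5500 is only an ~8% core bump over the FX 5200 with identical memory
# bandwidth, while the Ultra has ~60% more bandwidth than either.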

Reply 7 of 8, by frudi

Rank: Member

quicknick wrote on 2020-10-12, 12:46:

Well, the card is 64-bit, says so on the back (was there a MX400-64bit?) so I guess there's no need to check with Everest or Aida.

Depends on the memory type used, since GF2MX cards could use either SDR (128- or 64-bit) or DDR (64- or 32-bit) memory. So even if it is a 64-bit model, if it uses DDR memory it will have the same bandwidth and performance as the 128-bit SDR models.

Unfortunately, searching for the codes on the memory chips didn't reveal anything useful, so I can't say whether they're SDR or DDR. The only thing the markings tell us is that the chips are rated at 6 ns, which means 166 MHz, but that was the stock speed for both MX400 and MX200 models. If you want to be sure, install the card and check with GPU-Z or something similar.
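
A quick sanity check of both points above, as a sketch (the 6 ns-to-MHz conversion, and why a 64-bit DDR card keeps up with a 128-bit SDR one; it assumes the chips actually run at their rated speed):

ns_rating = 6
clock_mhz = 1000 / ns_rating  # 1 / 6 ns ~= 166.7 MHz
print(f"{ns_rating} ns chips -> ~{clock_mhz:.0f} MHz rated clock")

def bw_gb_s(clock_mhz, bus_bits, ddr):
    return clock_mhz * 1e6 * (bus_bits / 8) * (2 if ddr else 1) / 1e9

print(f"128-bit SDR: {bw_gb_s(clock_mhz, 128, False):.2f} GB/s")
print(f" 64-bit DDR: {bw_gb_s(clock_mhz, 64, True):.2f} GB/s")   # same figure
print(f" 64-bit SDR: {bw_gb_s(clock_mhz, 64, False):.2f} GB/s")  # half -> MX200-class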

Reply 8 of 8, by darry

Rank: l33t++

frudi wrote on 2020-10-12, 21:59:

Depends on the memory type used, since GF2MX cards could use either SDR (128 or 64-bit) or DDR (64 or 32-bit) memory. So even if it is a 64-bit model, if it's DDR memory, it'll have the same bandwidth and performance as 128-bit SDR models.

Unfortunately, searching the codes on the memory chips didn't reveal anything useful, so I can't say if they're SDR or DDR. Only thing the markings tell is the chips are 6ns, which means 166 MHz, but that was the stock speed for both MX400 and MX200 models. If you want to make sure, install the card and check through GPU-Z or something similar.

This makes me wonder if a 128-bit DDR version would have been technically possible (though probably frowned upon by Nvidia) and what performance it might have had.

EDIT: Maybe there were.
https://www.gpuzoo.com/GPU-Axle3D/GeForce2_MX … 28-bit_DDR.html
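
For what it's worth, here is what such a hypothetical 128-bit DDR board would deliver on paper at the stock 166 MHz memory clock (an estimate, not a measured figure):

def bw_gb_s(clock_mhz, bus_bits, ddr):
    return clock_mhz * 1e6 * (bus_bits / 8) * (2 if ddr else 1) / 1e9

print(f"128-bit DDR @ 166 MHz: ~{bw_gb_s(166, 128, True):.1f} GB/s")   # ~5.3 GB/s
print(f"128-bit SDR @ 166 MHz: ~{bw_gb_s(166, 128, False):.1f} GB/s")  # ~2.7 GB/s (stock MX400)

If memory serves, ~5.3 GB/s is roughly GeForce2 GTS territory, which may be exactly why Nvidia would have frowned on such a board.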