VOGONS


Matrox Roundup G200, G400, G400 MAX, G450 and G550


First post, by PhilsComputerLab

User metadata
Rank l33t++

In this project I am comparing popular AGP Matrox graphics cards with each other. Featured are the following cards:

G200
G400
G400 MAX
G450

I used PowerStrip to disable V-sync on all cards.

The system I used to benchmark these cards is my Slot 1 reference system, consisting of:

AOpen AX6BC
Latest BIOS
S370 to Slot 1 adapter
S370 100 MHz FSB Pentium III 1.1 GHz
128MB SDRAM
Windows 98SE

I used the latest Matrox driver.

Settings for benchmarking:

Forsaken: Patch 1.01, Generic 3D Accelerator, 1152 x 864, 32 bit colour, default details, v-sync off, Nuke demo
Quake II: 1024 x 768, default OpenGL render, default details, v-sync off, demo1.dm2
Q3A demo: 800 x 600, High details defaults, v-sync off, Demo 1
Incoming demo: 1024 x 768, 32 bit colour, gameindex.exe
Drakan: 800 x 600, 32 bit colour, default details + bumpmapping, v-sync off
Expendable: EMBM patch, 800 x 600, 32 bit colour, v-sync off, default details + hardware texture compression on + stencil buffers on + disable sound + trilinear filtering on + projected shadows on + bumpmapping on + detail level high, timedemo 1
MDK 2 demo: Default OpenGL driver, 1024 x 768, 32 bit colour, + trilinear, v-sync off
Incoming: Highest default settings, -timetest secret2.dem -width 1024 -height 768, v-sync off

Results for OpenGL:

pEsk8vIl.png

Results for DirectX:

mH9AqJOl.png

Findings:

- In OpenGL the G550 is quite decent, in DirectX it is very slow.
- You would think the G450 is faster than the G400, but it's not.
- The G400 MAX is faster than most cards, except in OpenGL where it loses to the G550.
- Matrox drivers don't have a setting for V-sync, so I used PowerStrip.
- Excellent VGA signal quality.
- Lots of games support EMBM (Environment-Mapped Bump Mapping), exclusively on Matrox cards.
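As a toy illustration of what EMBM does (hypothetical code, not any real driver or hardware API): the bump texture stores signed per-texel offsets that perturb the lookup coordinates into the environment map, which is what gives surfaces that rippled, reflective look.

```python
# Toy sketch of environment-mapped bump mapping (EMBM); all names are
# hypothetical. The bump texture holds signed (du, dv) offsets that
# perturb the lookup into the environment map. Real hardware also runs
# a 2x2 matrix transform on the offsets; this sketch just adds them.
def embm_sample(env_map, bump_map, u, v, scale=1):
    du, dv = bump_map[(u, v)]      # signed offsets from the bump texture
    return env_map[(u + scale * du, v + scale * dv)]  # perturbed env fetch

# Tiny "textures" as dicts keyed by (u, v):
bump = {(2, 3): (1, 1)}            # texel (2,3) pushes the lookup by (+1,+1)
env = {(3, 4): 0xFFFFFF}           # bright texel in the environment map
```

Here `embm_sample(env, bump, 2, 3)` fetches environment texel (3, 4) rather than (2, 3), because the bump map displaced the coordinates.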

YouTube, Facebook, Website

Reply 1 of 89, by obobskivich

User metadata
Rank l33t

Thanks for doing this - I've been wondering how the G550 stacks up to its older brothers for a while. Does EMBM actually work correctly on the 550? Aside from Expendable, what other games support or use EMBM?

Also, can you test something like 3DMark01 or something else that uses pixel shaders on the 550? I've read in some places that it supposedly can do DX8.1, but in others that it cannot - any info about that would be useful imho. 😊

Reply 2 of 89, by meljor

User metadata
Rank Oldbie

Nice work Phil! (as always)

What about 3D image quality compared to 3dfx/Nvidia? Any abnormal behaviour when it comes to shadows, blockiness or other glitches?

I always feel like Matrox "misses" something and the picture quality (and support) is better with the other brands... speed as well.

asus tx97-e, 233mmx, voodoo1, s3 virge ,sb16
asus p5a, k6-3+ @ 550mhz, voodoo2 12mb sli, gf2 gts, awe32
asus p3b-f, p3-700, voodoo3 3500TV agp, awe64
asus tusl2-c, p3-S 1,4ghz, voodoo5 5500, live!
asus a7n8x DL, barton cpu, 6800ultra, Voodoo3 pci, audigy1

Reply 3 of 89, by PhilsComputerLab

User metadata
Rank l33t++

Sorry guys, this is all the data I have. I did the benchmarks months ago and only now got around to putting it all together.

I haven't noticed any glitches / bugs / crashes with the latest drivers. I was pleasantly surprised by the experience.

YouTube, Facebook, Website

Reply 4 of 89, by soviet conscript

User metadata
Rank Oldbie

There's also the G250, which is an overclocked G200A made for OEMs only. I always wondered how much faster it was than the G200.

As for what other games support EMBM, see: Re: games that use EMBM

Reply 6 of 89, by swaaye

User metadata
Rank l33t++

G200 actually has some 3D quirks that one might not notice. It appears to have some subpixel precision issues for example. You might see textures wobble occasionally.

There is also an OpenGL bug in the latest drivers that breaks transparency. You can use the G400 driver package instead. It contains a newer G200 ICD with this bug fixed.

devius wrote:

Wasn't the G450 supposed to be faster than the G400? Or is this another example of meaningless model names?

As all the old reviews will tell ya, it has lower performance specifications than G400. It's basically a cost reduction with some feature improvements not related to gaming.

There are a number of G400 varieties, including some cheapened models that have 16MB SDRAM instead of 32MB SGRAM. So I see the G450 as a replacement for these. It's an updated budget card.

Reply 7 of 89, by Skyscraper

User metadata
Rank l33t

I just won a G550 with a ~5 Euro bid (+ ~3 Euro shipping).

This thread got me interested in seeing what the "late" Matrox cards can do 😀

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 8 of 89, by LunarG

User metadata
Rank Oldbie
devius wrote:

Wasn't the G450 supposed to be faster than the G400? Or is this another example of meaningless model names?

Everything after the G400 series, like the G450 and G550, up until the release of the Parhelia, was based on the G400 core, just with changes in manufacturing, clock speeds and so on.
The G450 was supposed to be a cost-reduced G400, with better clocks than the stock G400, but slower than the MAX.
The G550 was clocked higher, and was a bit of a stop-gap while waiting for the rumored "G800", which never showed up.

I've always had a soft spot for Matrox cards, but I generally prefer the slightly more "gamer" type models (where available), like the G400MAX over things like the G450 and G550 or the Mystique 220 over Millennium II.

WinXP : PIII 1.4GHz, 512MB RAM, 73GB SCSI HDD, Matrox Parhelia, SB Audigy 2.
Win98se : K6-3+ 500MHz, 256MB RAM, 80GB HDD, Matrox Millennium G400 MAX, Voodoo 2, SW1000XG.
DOS6.22 : Intel DX4, 64MB RAM, 1.6GB HDD, Diamond Stealth64 DRAM, GUS 1MB, SB16.

Reply 9 of 89, by sliderider

User metadata
Rank l33t++

G450 is not faster than G400 because Matrox thought that by going to DDR memory they could reduce the memory bus from 128-bit to 64-bit. This actually made the G450 slower in some tests.
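The bandwidth arithmetic behind this (illustrative clock speeds, not exact Matrox specifications): DDR moves two transfers per clock, so a 64-bit DDR bus at the same clock only matches, never beats, a 128-bit SDR bus on paper, and DDR's extra overhead costs it in practice.

```python
# Peak memory bandwidth in MB/s. The clock figure below is illustrative,
# not Matrox's exact specification.
def peak_bandwidth(mem_clock_mhz, bus_width_bits, transfers_per_clock=1):
    return mem_clock_mhz * transfers_per_clock * bus_width_bits / 8

g400_sdr = peak_bandwidth(166, 128, transfers_per_clock=1)  # 128-bit SDR
g450_ddr = peak_bandwidth(166, 64, transfers_per_clock=2)   # 64-bit DDR
# Both come out to 2656 MB/s on paper; DDR overhead then makes the
# narrower bus slower in real tests.
```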

Matrox was already well out of step with the mainstream by the time the G550 came around. The G550 is probably the only card I have seen that makes a SiS card look like a piece of high performance gaming gear.

http://techreport.com/review/3165/matrox-g550 … graphics-card/4

The writer goes on to say that out of the cards tested, the G550 has the best visuals with all game options turned up to the max, the problem is that the G550 does not have the horsepower required to pull that load.

Matrox has been a niche market player ever since the first generation GeForce was released because their designs failed to keep up with the competition. They tried to rectify some of the G550's shortcomings with the Parhelia, but it lagged behind the GeForce 4 Ti in performance despite matching its features, and when the Radeon 9700 was released just a few weeks later, it pretty much sealed the Parhelia's fate.

Reply 10 of 89, by LunarG

User metadata
Rank Oldbie
sliderider wrote:

G450 is not faster than G400 because Matrox thought that by going to DDR memory they could reduce the memory bus from 128-bit to 64-bit. This actually made the G450 slower in some tests.

Matrox was already well out of step with the mainstream by the time the G550 came around. The G550 is probably the only card I have seen that makes a SiS card look like a piece of high performance gaming gear.

http://techreport.com/review/3165/matrox-g550 … graphics-card/4

The writer goes on to say that out of the cards tested, the G550 has the best visuals with all game options turned up to the max, the problem is that the G550 does not have the horsepower required to pull that load.

Matrox has been a niche market player ever since the first generation GeForce was released because their designs failed to keep up with the competition. They tried to rectify some of the G550's shortcomings with the Parhelia, but it lagged behind the GeForce 4 Ti in performance despite matching its features, and when the Radeon 9700 was released just a few weeks later, it pretty much sealed the Parhelia's fate.

Ah yes, that's right about the DDR and the narrower memory bus, I'd forgotten that. I seem to remember there were some differences in the RAMDAC as well, to make it cheaper to manufacture, but I can't remember the details.
It was a shame that the Parhelia was so sub-par on performance when it came out... I think Matrox made a mistake in focusing on triple-head and such for a "gaming" oriented card. You can't sell a gaming card to professionals, and you can't sell a high-end workstation card to gamers. Matrox didn't wanna have to choose... Aim for both targets! That was never a good idea. They should've sacrificed triple-head and aimed for more pure performance and a lower overall price tag. Then they could've had a competitive card on their hands. I guess the consumer segment is very low profit compared to the professional segment where the buyers will pay much, much more for specialized products and a long-term support network.

WinXP : PIII 1.4GHz, 512MB RAM, 73GB SCSI HDD, Matrox Parhelia, SB Audigy 2.
Win98se : K6-3+ 500MHz, 256MB RAM, 80GB HDD, Matrox Millennium G400 MAX, Voodoo 2, SW1000XG.
DOS6.22 : Intel DX4, 64MB RAM, 1.6GB HDD, Diamond Stealth64 DRAM, GUS 1MB, SB16.

Reply 11 of 89, by swaaye

User metadata
Rank l33t++

I don't think Matrox was ever really gaming focused. It seemed like they just dabbled in the possibilities for a few years before returning to their niches as the development investment of a 3D accelerator skyrocketed.

There were some other GPUs in development at Matrox, the G800 and F800 for example. The G550 is supposedly an offshoot of the G800 (the budget version, I imagine). I think what happened was that ATI and NV were just too aggressive and had too much money. Matrox lost engineers to them too. This is why GPU companies folded en masse back then, and NV and ATI picked up a lot of scraps. Matrox had successful niches that NV and ATI weren't interested in for whatever reason, and thus survived.

Reply 12 of 89, by LunarG

User metadata
Rank Oldbie

The development costs of consumer 3D accelerators were high for sure. And yes, Matrox was never really gaming focused. They had a few cards that were, but only as long as it didn't interfere with their regular business-related stuff. The per-unit profit of a consumer graphics card, compared to a specialized solution like the ones Matrox makes today, is laughably low. High development costs and a relatively low price (a few hundred dollars) make the consumer graphics market quite tough, which is why there appears to only be room for two giants.

WinXP : PIII 1.4GHz, 512MB RAM, 73GB SCSI HDD, Matrox Parhelia, SB Audigy 2.
Win98se : K6-3+ 500MHz, 256MB RAM, 80GB HDD, Matrox Millennium G400 MAX, Voodoo 2, SW1000XG.
DOS6.22 : Intel DX4, 64MB RAM, 1.6GB HDD, Diamond Stealth64 DRAM, GUS 1MB, SB16.

Reply 14 of 89, by PhilsComputerLab

User metadata
Rank l33t++
feipoa wrote:

What is the reason why the G200 data is not included on the OpenGL results graph?

Doesn't support it.

YouTube, Facebook, Website

Reply 15 of 89, by Zup

User metadata
Rank Oldbie

It's funny that some HP servers (Gen 8) have an integrated Matrox G200. I don't think it's meant to run 3D games, but I wonder if their drivers have OpenGL support.

I have traveled across the universe and through the years to find Her.
Sometimes going all the way is just a start...

I'm selling some stuff!

Reply 16 of 89, by feipoa

User metadata
Rank l33t++
philscomputerlab wrote:
feipoa wrote:

What is the reason why the G200 data is not included on the OpenGL results graph?

Doesn't support it.

What do you mean the G200 doesn't support OpenGL? I was able to run Quake II in OpenGL without any noticeable problems. I used driver version 6.28 and Win98SE.

Plan your life wisely, you'll be dead before you know it.

Reply 17 of 89, by PhilsComputerLab

User metadata
Rank l33t++

G200's biggest problem was its OpenGL support. Throughout most of its life G200 had to get by, in popular games such as Quake II, with a slow OpenGL-to-Direct3D wrapper driver. This was a layer that translated OpenGL to run on the Direct3D driver. This hurt G200's performance dramatically in these games and caused a lot of controversy over continuing delays and promises from Matrox. In fact, it would not be until well into the life of G200's successor, G400, that the OpenGL driver would finally be mature and fast.

http://en.wikipedia.org/wiki/Matrox_G200

Reading this, I didn't bother testing it.
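Conceptually, such a wrapper exports OpenGL-style entry points and re-issues each call through the Direct3D driver underneath; a toy sketch with hypothetical names (nothing like Matrox's actual driver code):

```python
# Toy sketch of an API-translation ("wrapper") layer; all names are
# hypothetical. Every GL-style draw has to be re-expressed as one or
# more backend driver calls, which is where the extra overhead comes from.
class D3DBackend:
    def __init__(self):
        self.calls = []                      # record of issued driver calls
    def set_texture(self, tex):
        self.calls.append(("SetTexture", tex))
    def draw_primitive(self, n_verts):
        self.calls.append(("DrawPrimitive", n_verts))

class GLWrapper:
    def __init__(self, backend):
        self.backend = backend
        self.bound_texture = None
    def glBindTexture(self, tex):            # GL-style call: latches state
        self.bound_texture = tex
    def glDrawTriangles(self, verts):        # GL-style call: flush + draw
        self.backend.set_texture(self.bound_texture)
        self.backend.draw_primitive(len(verts))
```

In this sketch, every draw re-sends the texture state to the backend, whereas a native driver could skip redundant state work; that kind of per-call translation cost is one reason wrapper drivers were slow.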

YouTube, Facebook, Website

Reply 18 of 89, by devius

User metadata
Rank Oldbie
Zup wrote:

It's funny that some HP servers (Gen 8) have an integrated Matrox G200. I don't think it's meant to run 3D games, but I wonder if their drivers have OpenGL support.

The linux G200 driver has 3D acceleration using OpenGL. Not sure if that's what the HP servers are using.

Reply 19 of 89, by Zup

User metadata
Rank Oldbie

I guess the integrated G200 is only meant to display a basic desktop/console image (but at higher resolutions than the previous ATI ES1000). I don't know if it supports OpenGL at all (those servers have optional high-performance cards meant for GPU parallel computing), but it's funny that the old G200 is still in use.

I have traveled across the universe and through the years to find Her.
Sometimes going all the way is just a start...

I'm selling some stuff!