VOGONS


First post, by Babasha

Rank: Oldbie

A functional test of Matrox video cards, to remind myself (and others) of how they behave and how well they work. The goal of the test was not to squeeze out maximum framerates or to consider non-standard solutions (such as OpenGL-to-D3D wrappers), but simply to run on standard drivers and settings under Win98.

1) Compaq's Matrox Mystique PCI 4MB is one of the company's first attempts in the growing 3D graphics market. A breakthrough in speed, but not always in capabilities relative to competitors in the low-end sector (such as the S3 ViRGE). It is 2-3 times faster in 3D functions/games, and many acceleration functions are implemented better as well. The huge flaw is the missing bilinear filtering (although if you like good framerates with PlayStation 1-style pixels, it has its own charm).

2) Matrox G100 AGP 4MB - a sort of "Mystique as it should have been." Bilinear filtering has appeared, but alpha blending (texture transparency) works very strangely - it is implemented as a grid of skipped pixels. Then again, at some points it looks very old-school. In Tomb Raider 2 and Turok it is already quite comparable to the ATI Rage Pro in framerate and quality, with an interesting picture. The card does NOT support OpenGL (at least not without third-party wrappers).

3) Matrox G200 AGP 8MB - the first flawless 3D accelerator from Matrox. Visually and in terms of speed it can compete with the Riva TNT, and it is much more interesting than the Riva 128 or Voodoo 1 (for my taste, at least). There is OpenGL support, albeit without multitexturing, and it squeezes out its 40 frames per second even on my modest Celeron 300A.

4) Matrox G450 AGP 32MB - one of the company's best cards in terms of quality, and even an attempt to become a "trendsetter" with its proprietary EMBM technology. A competitor to the Riva TNT2. A high-quality implementation of 3D functions and more than decent OpenGL, now with multitexturing.

PS. A separate remark for the supporters of "your Matrox is bullsh@t" - no, it's not bullsh@t. The company had its own working concept for these products, which allowed it to outlive many competitors and stay alive until now. The cards were focused primarily on the multimedia VIDEO segment - working with daughter cards or integrated TV tuners and video capture, and outputting the image to a TV. Beyond that, excellent 2D quality, support for legacy systems and alternative operating systems from DOS (AutoCAD, etc.) and Win 3.1 up to Windows XP, plus workstations and medicine, are the areas where Matrox has been and remains traditionally strong.

PPS. Those who wish to discuss/participate are invited to this thread of the forum. On request I can also post pictures there and compare the cards in more detail.


Need help? Begin with photo and model of your hardware 😉

Reply 1 of 35, by chinny22

Rank: l33t++

I used to own a pair of PCI Matrox G450s that I grabbed from a parts box, not knowing what they were, with the intention of harvesting the heatsinks.
A quick Google turned up EMBM, so I put them back in my parts box to play around with another day.
Just recently I sold one, and it went very quickly, which surprised me, so I had another look at the specs. As you say, it's roughly a TNT2, and being PCI makes it quite an attractive alternative for boards lacking AGP.
I have the opposite problem - too many AGP systems - so I'm now thinking about getting an AGP G550, although needing the dongle is less than ideal.

I'm not sure how I feel about my Mystique.
I see it more as a DOS card, something like a Socket 7 build. However, even though the MSI version of Destruction Derby 2 looks better, the sound is so broken that I prefer software mode. The Screamer games are good, more so now that a fix has been found for the "out of range" issue. I know DOS compatibility isn't great, but for me only two games are affected: Jazz and Duke3D.

I'll still default to Nvidia cards - I don't really notice the image quality differences, and they have stronger DirectX support and better DOS compatibility. But if you're looking for something different, or for a second machine, I think Matrox is worth investigating.

Not sure what that last paragraph is about though. I can understand people who only know DirectX and team red or team green not knowing or understanding Matrox: they backed the wrong tech and never really broke into the gaming sector (which may have been a good thing, really), so no one took their 3D offerings seriously. But even posts from turn-of-the-century gamers show them pairing Voodoo 1 and 2 cards with Matrox because their 2D was so great.

S3, on the other hand, is famous for the ViRGE being the first to earn the name "3D decelerator", despite it arguably being a much more useful card for gaming - in DOS, anyway.

Reply 2 of 35, by swaaye

Rank: l33t++

Mystique was a great cheap Windows GUI card. GUI performance was in the realm of diminishing returns by that point. The 3D was functional but looked like software rendering. Matrox released the M3D, a PowerVR PCX2 card, to offer a stopgap quality 3D rendering product.

G200 was a very fast GUI card and again cheap compared to, say, the Millennium II. And its 3D capabilities are on the same page as a TNT, Voodoo3, Rage 128 or Savage3D. But it was Direct3D-only until well into the G400's lifetime, when they both got OpenGL support in early 2000.

Reply 3 of 35, by leileilol

Rank: l33t++

G100A was weird in general. The LOD calculation for mipmaps is very messed up and is probably the worst part (i.e. seeing one half of a character blurry). I find the 4x4 dither alpha interesting (and there's no bilinear for the alpha channel), and the VGA out is still Matrox-sharp, so it's a good Millennium/Mystique substitute. Also it gets along with PowerVR PCX2. 😀
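
To illustrate what that 4x4 dither alpha means in practice: it's essentially screen-door transparency - whole pixels are kept or dropped against a repeating threshold pattern instead of colours actually being mixed. A minimal sketch using the standard 4x4 Bayer matrix (purely illustrative, not how the chip is literally wired):

// Screen-door ("dithered") transparency with a 4x4 Bayer threshold matrix.
#include <cstdint>

static const int kBayer4x4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

// Instead of blending src over dst, keep or drop whole pixels so that on
// average the right fraction of the source shows through.
uint32_t DitherBlend(uint32_t dst, uint32_t src, uint8_t alpha, int x, int y)
{
    int level = (alpha * 16) / 255;                        // coverage 0..16
    return (level > kBayer4x4[y & 3][x & 3]) ? src : dst;  // no colour mixing
}

At 50% alpha that leaves every other pixel untouched, which is exactly the "pixel-blank grid" look described in the first post.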

Techland's GL->D3D wrapper's what I used to try G100 with OpenGL games many years ago (no MSI on this card)

This ol Midtown Madness shot I did is probably the best summary of the G100A.

Attachments

  • midtown.png (165.4 KiB)

long live PCem

Reply 4 of 35, by Scali

Rank: l33t
chinny22 wrote on 2023-05-02, 16:02:

I'm not sure how I feel about my Mystique.
I see it more as a DOS card, something like a Socket 7 build.

In DOS though, Matrox has a somewhat limited implementation of VESA BIOS Extensions. They only expose modes of 640x480 and higher, where you usually want 320x200 or 320x240 truecolour modes for software rendered games.
You'd have to use SciTech Display Doctor to try and squeeze more modes out of it.
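
If you want to check exactly which modes a card's VBE BIOS reports, a rough DJGPP sketch like the one below (real-mode INT 10h calls through the DOS transfer buffer) lists every mode with its resolution and bit depth - on the Matrox cards you'd expect nothing below 640x480 to show up. Purely illustrative, assuming DJGPP and a VBE-capable BIOS; running it again after loading SDD should show whatever extra modes it manages to add:

// List the modes the VESA BIOS exposes (DJGPP, real-mode INT 10h via DPMI).
#include <cstdio>
#include <cstring>
#include <dpmi.h>
#include <go32.h>
#include <sys/movedata.h>

int main()
{
    __dpmi_regs r;
    unsigned char info[512], mi[256];
    unsigned short modes[256];

    // VBE function 00h: controller info into the DOS transfer buffer (ES:DI).
    std::memset(&r, 0, sizeof(r));
    r.x.ax = 0x4F00;
    r.x.es = __tb >> 4;
    r.x.di = __tb & 0x0F;
    __dpmi_int(0x10, &r);
    if (r.x.ax != 0x004F) { std::puts("No VBE BIOS found"); return 1; }
    dosmemget(__tb, sizeof(info), info);

    // VideoModePtr sits at offset 14 as a real-mode offset:segment pair.
    unsigned long list = (*(unsigned short *)(info + 16) << 4)
                       +  *(unsigned short *)(info + 14);

    // Copy the mode list out before reusing the transfer buffer, since some
    // BIOSes build the list inside that very buffer.
    dosmemget(list, sizeof(modes), modes);

    for (int i = 0; i < 256 && modes[i] != 0xFFFF; i++) {
        // VBE function 01h: mode info block for this mode number.
        std::memset(&r, 0, sizeof(r));
        r.x.ax = 0x4F01;
        r.x.cx = modes[i];
        r.x.es = __tb >> 4;
        r.x.di = __tb & 0x0F;
        __dpmi_int(0x10, &r);
        if (r.x.ax != 0x004F) continue;
        dosmemget(__tb, sizeof(mi), mi);

        std::printf("mode %03Xh: %ux%u, %u bpp\n", (unsigned)modes[i],
                    (unsigned)*(unsigned short *)(mi + 18),   // XResolution
                    (unsigned)*(unsigned short *)(mi + 20),   // YResolution
                    (unsigned)mi[25]);                        // BitsPerPixel
    }
    return 0;
}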

Personally I used a Matrox Mystique in the era of Windows NT4. There was no D3D support on NT4 anyway, so the limited 3D acceleration wasn't an issue. For DirectDraw it was very fast and compatible, and it also offered some interesting overlay features for efficient decoding and upscaling of video. Most notably it could do YUV->RGB with bilinear filtered upscaling in hardware.
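
For the curious, this is roughly what that overlay path looks like from the application side: you create a packed-YUV (e.g. YUY2) overlay surface and point UpdateOverlay at a larger destination rectangle, and the hardware does the YUV->RGB conversion and the filtered upscale on output. A trimmed-down sketch against the DirectDraw API - error handling, window setup and the actual video decoding are left out, and the formats/sizes are only an example:

// Sketch: hardware YUV overlay with scaled output via DirectDraw.
#include <ddraw.h>

void ShowVideoOverlay(HWND hwnd)
{
    LPDIRECTDRAW dd = nullptr;
    DirectDrawCreate(nullptr, &dd, nullptr);
    dd->SetCooperativeLevel(hwnd, DDSCL_NORMAL);

    // The primary surface (the desktop) is what the overlay is displayed over.
    DDSURFACEDESC sd = { sizeof(sd) };
    sd.dwFlags = DDSD_CAPS;
    sd.ddsCaps.dwCaps = DDSCAPS_PRIMARYSURFACE;
    LPDIRECTDRAWSURFACE primary = nullptr;
    dd->CreateSurface(&sd, &primary, nullptr);

    // Overlay surface in a packed YUV format (YUY2), sized for 320x240 video.
    DDSURFACEDESC od = { sizeof(od) };
    od.dwFlags = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT | DDSD_PIXELFORMAT;
    od.ddsCaps.dwCaps = DDSCAPS_OVERLAY | DDSCAPS_VIDEOMEMORY;
    od.dwWidth  = 320;
    od.dwHeight = 240;
    od.ddpfPixelFormat.dwSize   = sizeof(DDPIXELFORMAT);
    od.ddpfPixelFormat.dwFlags  = DDPF_FOURCC;
    od.ddpfPixelFormat.dwFourCC = MAKEFOURCC('Y', 'U', 'Y', '2');
    LPDIRECTDRAWSURFACE overlay = nullptr;
    dd->CreateSurface(&od, &overlay, nullptr);

    // Decoded frames get Lock()ed and written into 'overlay' elsewhere; the
    // card converts YUV->RGB and scales 320x240 up to 640x480 on the fly.
    RECT src = { 0, 0, 320, 240 };
    RECT dst = { 0, 0, 640, 480 };
    overlay->UpdateOverlay(&src, primary, &dst, DDOVER_SHOW, nullptr);
}

Whether a given FOURCC and scaling factor is supported depends on the driver (GetCaps/GetFourCCCodes tell you), which is where cards like the Mystique stood out at the time.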

The G200 and G400 were released in an interesting period... I believe the G200 was actually among the fastest cards at release time... So it looked like Matrox was finally getting a foothold in the rat race that is 3D acceleration.
Then the G400 initially looked good as well, against the nVidia TNT2, both in terms of performance and features.
But the G400 was released at about the same time as the nVidia GeForce, which completely changed the game again (hardware T&L, cubemaps, dot3 per-pixel lighting etc... all sorts of things that Matrox couldn't do, and then there was the raw performance of these beasts). Basically everyone except ATi dropped out of the race at that time.
The G450 was I believe a slightly cheaper and slower version of the original G400 (which also had a G400MAX version that was clocked higher)... then they also released a G550 as another update... And then finally the Parhelia, before they pulled out of the consumer PC market.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 5 of 35, by dr.zeissler

Rank: l33t

The MGA Mystique is a really nice one. Drivers for lots of OSes (Win3x, VESA, OS/2, WinNT, Win9x, native Amithlon!). GUI acceleration is nice and fast, the drivers are really good, and image quality is excellent.
The G100 is a problematic one - I have some issues with it - but the G200/G400 are nice cards, with drivers still available for most OSes (as above). The G450 is weaker than the G400 and lost Win3x support, but still has OS/2; the G550 is nearly the same card but offers more performance than the G450.

Talking about 3D on the G4xx/G5xx, it depends on the game engine whether you're better off using OpenGL or D3D. For the most part D3D seems to be better/faster than OpenGL, but you don't get full control over the card, so switching on vsync is not possible; the Matrox tweaking tools offer very few options.

Doc

Retro-Gamer 😀 ...on different machines

Reply 6 of 35, by dr.zeissler

Rank: l33t

If we're talking about bilinear filtering... my opinion is that early cards that do not support AF tend to look very blurry with only bilinear.
To my eye, having a software-rendering-like image that is fast and smooth is a huge benefit; the only downside is the missing alpha blending, which ruins much more than the missing bilinear does. There are some early 3D games and demoscene prods that make use of it and therefore look ugly. But keep in mind that some very early 3D games support the Mystique directly, like "REVIL1+2".

Retro-Gamer 😀 ...on different machines

Reply 7 of 35, by Minutemanqvs

Rank: Member

I have a Mystique 2MB (missing the memory expansion) mainly used for testing purposes, a G400 and a G550.

Funny thing is the G400 and G550 have roughly the same performance, one just has a wide memory bus to slow memory while the other has a narrow memory bus to fast memory. I also had lots of issues with the latest G550 drivers provided by Matrox, Direct3D and OpenGL would not work. Going back to the G400-era drivers it works just fine.
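
To put rough numbers on that (the clock is purely illustrative, but it shows why the two end up so close): peak bandwidth is bus width / 8 x memory clock x transfers per clock, so a wide single-data-rate bus and a half-width double-data-rate bus at the same clock give the same figure.

// Peak memory bandwidth in bytes/s: width/8 * clock * transfers per clock.
double peak_bandwidth(int bus_bits, double clock_hz, int transfers_per_clock)
{
    return (bus_bits / 8.0) * clock_hz * transfers_per_clock;
}

// With an illustrative 166 MHz memory clock:
//   peak_bandwidth(128, 166e6, 1)  ~ 2.66e9 B/s  (wide bus, single data rate)
//   peak_bandwidth( 64, 166e6, 2)  ~ 2.66e9 B/s  (narrow bus, double data rate)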

And as I said elsewhere, the G200 is everywhere even nowadays. It’s the default « VGA card » implementation in about every server produced.

And I’m still curious to try a Parhelia « first gen », I have an AGP Parhelia-LX somewhere which is basically half a Parhelia-512.

Searching a Nexgen Nx586 with FPU, PM me if you have one. I have some Athlon MP systems and cookies.

Reply 9 of 35, by Minutemanqvs

Rank: Member

I’m on XP SP3; the version 6 drivers from 2008 behave strangely. Version 5.96 from 2006 (supporting the G400-G550) works fine.

Good ones: https://video.matrox.com/ct/apps/drivers/grap … /download?id=93
Bad ones, which you get by following the Matrox website: https://video.matrox.com/en/apps/drivers/grap … ist=12&osList=5

Searching a Nexgen Nx586 with FPU, PM me if you have one. I have some Athlon MP systems and cookies.

Reply 11 of 35, by acl

Rank: Oldbie
Minutemanqvs wrote on 2023-05-03, 20:17:

And I’m still curious to try a Parhelia « first gen », I have an AGP Parhelia-LX somewhere which is basically half a Parhelia-512.

It's an interesting card. I have a Parhelia AGP 256MB with box, accessories and documents.

Even though the card was said to be more "gaming oriented", they barely admit it on the box.
The back of the box heavily depicts business use, with only a very, very small picture of a triple-head setup running Unreal Tournament 2003. As if gaming were something to be ashamed of. (Pics here: https://retro.user-unfriendly.net/Parts ... /pictures/)

I never tried the triple screen at home because I don't own 3 monitors. But I tried the reef tech demo and it looks amazing.

I like matrox a lot in general and I'm trying to collect most of their cards.

So far I have collected:

  • Impression Plus (IS-Athena, their first consumer-oriented 3D-capable card, 1994)
  • Millennium with memory expansion (IS-Storm, 1995)
  • Mystique 220
  • Millennium II
  • G450
  • G550
  • Parhelia

G100/G200/G400 are generally easily available and I'm just waiting to find a good offer locally.
The original Mystique appears from time to time.
G400 Max is more difficult to find. I've seen some in cheap lots, but each time the seller didn't want to ship the items... Still waiting for this one.

I would love to see more tests and screenshots in this thread. I'll try to post some too.

"Hello, my friend. Stay awhile and listen..."
My collection (not up to date)

Reply 12 of 35, by dr.zeissler

Rank: l33t

If there is some interest, I can make a tutorial on setting up Amithlon with the MGA Mystique and show something like this, e.g. https://www.pouet.net/prod.php?which=59136

Retro-Gamer 😀 ...on different machines

Reply 13 of 35, by swaaye

Rank: l33t++

Parhelia's only advantages were the display outputs. But even there the initial chip had some hardware bugs that caused problems on the secondary outputs.

For games it's usually noticeably slower than a GeForce 3 or Radeon 8500. Radeon 9700 launched right after it did. So other than for that triple head thing, which it was barely fast enough for, it was definitely not a good card for games. It was also very expensive. Trying to get businesses to pay big bucks for it through brand strength and perception.

Also, G550 is the remnant of the aborted G800 project.

And on the topic of the newest drivers, Matrox dropped 3D acceleration support at some point so yeah you want old releases.

The server G200 is interesting but I suspect it's not related to the old G200. GUI acceleration is different these days.

Overall, Matrox didn't have the money to keep up with ATI and Nvidia.


Reply 14 of 35, by Scali

Rank: l33t
acl wrote on 2023-05-03, 21:24:

Even though the card was said to be more "gaming oriented", they barely admit it on the box.

The Parhelia is very quirky.
It supports Vertex Shader 2.0, and I believe it was actually the first card on the market to do so.
But it 'only' supports Pixel Shader 1.3, which means it's not a 'full' Shader Model 2.0/DX9-level card. It's like some transitional thing between DX8 and DX9-level shaders.
PS1.3 is GeForce4-level, not even as advanced as the PS1.4 you get on a Radeon 8500.
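
The practical effect shows up when a game queries the device caps: it sees VS 2.0 but only PS 1.3, so any render path gated on "full" SM2.0 support drops back to its DX8-level shaders. A rough sketch of that check against the D3D9 caps structure (illustrative only):

// Sketch: what a game sees when it queries shader caps through D3D9.
#include <d3d9.h>
#include <cstdio>

void ReportShaderSupport(IDirect3D9 *d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // On a Parhelia this reports VS 2.0 but only PS 1.3.
    std::printf("VS %u.%u, PS %u.%u\n",
                (unsigned)D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                (unsigned)D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion),
                (unsigned)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                (unsigned)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));

    bool full_sm2 = caps.VertexShaderVersion >= D3DVS_VERSION(2, 0) &&
                    caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0);
    std::printf("Full SM2.0 path usable: %s\n", full_sm2 ? "yes" : "no");
}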

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 16 of 35, by Minutemanqvs

Rank: Member
swaaye wrote on 2023-05-03, 21:53:

Parhelia's only advantages were the display outputs. But even there the initial chip had some hardware bugs that caused problems on the secondary outputs.

For games it's usually noticeably slower than a GeForce 3 or Radeon 8500. Radeon 9700 launched right after it did. So other than for that triple head thing, which it was barely fast enough for, it was definitely not a good card for games. It was also very expensive. Trying to get businesses to pay big bucks for it through brand strength and perception.

Also, G550 is the remnant of the aborted G800 project.

And on the topic of the newest drivers, Matrox dropped 3D acceleration support at some point so yeah you want old releases.

The server G200 is interesting but I suspect it's not related to the old G200. GUI acceleration is different these days.

Overall, Matrox didn't have the money to keep up with ATI and Nvidia.

Uh I didn't know they dropped 3D support at one point...that's a pretty major thing and should be better noted on the driver download page...

Here is the output of lspci on one of my servers. The G200 "core" is embedded in the HPE iLO; it's not a discrete chip.

c1:00.1 VGA compatible controller: Matrox Electronics Systems Ltd. MGA G200eH3 (rev 02) (prog-if 00 [VGA controller])
DeviceName: Embedded Device
Subsystem: Hewlett Packard Enterprise iLO5 VGA
Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr+ Stepping- SERR+ FastB2B- DisINTx-
Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx-
Latency: 0, Cache Line Size: 64 bytes
Interrupt: pin B routed to IRQ 54
NUMA node: 0
IOMMU group: 16
Region 0: Memory at cc000000 (32-bit, prefetchable) [size=16M]
Region 1: Memory at cdba0000 (32-bit, non-prefetchable) [size=16K]
Region 2: Memory at cd000000 (32-bit, non-prefetchable) [size=8M]
Expansion ROM at 000c0000 [virtual] [disabled] [size=128K]
Capabilities: [a8] Power Management version 3
Flags: PMEClk- DSI+ D1- D2- AuxCurrent=0mA PME(D0-,D1-,D2-,D3hot-,D3cold-)
Status: D0 NoSoftRst- PME-Enable- DSel=0 DScale=0 PME-
Capabilities: [b0] MSI: Enable- Count=1/1 Maskable- 64bit+
Address: 0000000000000000 Data: 0000
Capabilities: [c0] Express (v2) Legacy Endpoint, MSI 00
DevCap: MaxPayload 128 bytes, PhantFunc 0, Latency L0s unlimited, L1 unlimited
ExtTag- AttnBtn- AttnInd- PwrInd- RBE+ FLReset-
DevCtl: CorrErr- NonFatalErr+ FatalErr+ UnsupReq-
RlxdOrd- ExtTag- PhantFunc- AuxPwr- NoSnoop-
MaxPayload 128 bytes, MaxReadReq 128 bytes
DevSta: CorrErr+ NonFatalErr- FatalErr- UnsupReq+ AuxPwr- TransPend-
LnkCap: Port #0, Speed 5GT/s, Width x1, ASPM L0s, Exit Latency L0s <4us
ClockPM- Surprise- LLActRep- BwNot- ASPMOptComp-
LnkCtl: ASPM Disabled; RCB 64 bytes, Disabled- CommClk+
ExtSynch- ClockPM- AutWidDis- BWInt- AutBWInt-
LnkSta: Speed 5GT/s (ok), Width x1 (ok)
TrErr- Train- SlotClk+ DLActive- BWMgmt- ABWMgmt-
DevCap2: Completion Timeout: Range ABCD, TimeoutDis+ NROPrPrP- LTR-
10BitTagComp- 10BitTagReq- OBFF Not Supported, ExtFmt- EETLPPrefix-
EmergencyPowerReduction Not Supported, EmergencyPowerReductionInit-
FRS-
AtomicOpsCap: 32bit- 64bit- 128bitCAS-
DevCtl2: Completion Timeout: 50us to 50ms, TimeoutDis- LTR- OBFF Disabled,
AtomicOpsCtl: ReqEn-
LnkSta2: Current De-emphasis Level: -3.5dB, EqualizationComplete- EqualizationPhase1-
EqualizationPhase2- EqualizationPhase3- LinkEqualizationRequest-
Retimer- 2Retimers- CrosslinkRes: unsupported
Capabilities: [100 v2] Advanced Error Reporting
UESta: DLP- SDES- TLP- FCP- CmpltTO- CmpltAbrt- UnxCmplt- RxOF- MalfTLP- ECRC- UnsupReq- ACSViol-
UEMsk: DLP- SDES- TLP- FCP- CmpltTO- CmpltAbrt- UnxCmplt- RxOF- MalfTLP- ECRC- UnsupReq- ACSViol-
UESvrt: DLP- SDES- TLP+ FCP- CmpltTO- CmpltAbrt- UnxCmplt- RxOF- MalfTLP- ECRC- UnsupReq- ACSViol-
CESta: RxErr- BadTLP- BadDLLP- Rollover- Timeout- AdvNonFatalErr+
CEMsk: RxErr+ BadTLP+ BadDLLP+ Rollover+ Timeout+ AdvNonFatalErr+
AERCap: First Error Pointer: 00, ECRCGenCap- ECRCGenEn- ECRCChkCap- ECRCChkEn-
MultHdrRecCap- MultHdrRecEn- TLPPfxPres- HdrLogCap-
HeaderLog: 00000000 00000000 00000000 00000000
Kernel driver in use: mgag200
Kernel modules: mgag200

Searching a Nexgen Nx586 with FPU, PM me if you have one. I have some Athlon MP systems and cookies.

Reply 17 of 35, by Garrett W

Rank: Oldbie
swaaye wrote on 2023-05-03, 21:53:

Parhelia's only advantages were the display outputs. But even there the initial chip had some hardware bugs that caused problems on the secondary outputs.

For games it's usually noticeably slower than a GeForce 3 or Radeon 8500. Radeon 9700 launched right after it did. So other than for that triple head thing, which it was barely fast enough for, it was definitely not a good card for games. It was also very expensive. Trying to get businesses to pay big bucks for it through brand strength and perception.

Hey, it does have that rather interesting fragment AA though! When it works, it's actually fairly impressive and can exceed the GF3 and Radeon 8500 at comparable IQ with whatever AA they use (I believe the 8500 is SSAA only?).
It's obviously no match in raw performance.

Reply 18 of 35, by Babasha

Rank: Oldbie
dr.zeissler wrote on 2023-05-03, 21:34:

If there is some interest, I can make a tutorial on setting up Amithlon with the MGA Mystique and show something like this, e.g. https://www.pouet.net/prod.php?which=59136

Hi!

Please make a manual. I've got a Mystique and other Matrox cards.

Need help? Begin with photo and model of your hardware 😉

Reply 19 of 35, by Scali

Rank: l33t

In other news:
Matrox has just announced that they will be making 'regular' videocards again (albeit still with 4 outputs, so aimed at multihead). This time they are using Intel GPUs:
https://video.matrox.com/en/products/graphics … rds/luma-series

(They had used AMD GPUs in the past, and their D-series uses nVidia).

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/