VOGONS


Reply 20 of 35, by swaaye

User metadata
Rank l33t++
Minutemanqvs wrote on 2023-05-04, 06:31:

Here is the output of an lspci on one of my servers. The G200 "core" is embedded in the HPE iLO, it's not a discrete chip.

https://www.phoronix.com/news/Matrox-G200-DRM-Driver

Actually it sounds like that server G200e may be very similar to the original G200. Just did a search for that mgag200 kernel module.

Reply 21 of 35, by swaaye

Rank l33t++
Garrett W wrote on 2023-05-04, 07:10:

Hey, it does have that rather interesting fragment AA though! When it works, it's actually fairly impressive and can exceed GF3 and Radeon 8500 with comparable IQ and whatever AA they use (I believe 8500 is SSAA only?).
It's obviously no match in raw performance.

Yeah the Parhelia's fragment AA is much higher quality than what Radeon 8500 and GeForce 3 can do. It also has the option for 4X SSAA, and it has good anisotropic filtering support.

I seem to remember these settings disappearing in Parhelia drivers though. Like they basically stopped targeting game support after the first year.

Reply 22 of 35, by Scali

Rank l33t

Yea, 8500 is only SSAA, and GF3/4 only has some hackish not-quite SSAA solutions ('quincunx') that didn't really perform much better than straight SSAA (which reminds me of a long and tiresome """discussion""" I once had with some fanboy who insisted it was multisample AA, because it took multiple samples... Yea, but it's not MSAA as we know it).

I'm not entirely sure about the workings of FAA, but it seems to be basically the same as the MSAA we've come to know in the DX9 era, first introduced with the Radeon 9700.
That is, you multisample your z-buffer, which allows you to detect polygon edges. Where there are no edges, you run the pixel shader only once per pixel, because all samples belong to the same polygon anyway. Only at the edges do you run the shader for every sample, and then blend the results with the pixels already in the buffer from other polygons.
Since only a small fraction of pixels are near edges, that's where the huge performance savings come from (and which 'quincunx' couldn't do... afaik there was only an optimization in texture fetches, but the full shader was still run for each sample... therefore it was a bit of a hack, and the results were somewhat blurry).
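To make the saving concrete, here's a toy Python model of that edge-detect idea. This is purely illustrative (not Matrox's or ATi's actual hardware logic): each pixel stores one hypothetical polygon id per sample, and we count shader invocations assuming interior pixels are shaded once while edge pixels are shaded per sample.

```python
# Toy model of edge-detect multisample AA (illustrative only, not real hardware).
# Each pixel carries a list of per-sample polygon ids. If all samples agree,
# the pixel is "interior": shade once and replicate. Otherwise it straddles
# an edge: shade every sample.

def shader_invocations(sample_poly_ids, samples_per_pixel=4):
    """sample_poly_ids: list of pixels, each a list of per-sample polygon ids."""
    invocations = 0
    for pixel in sample_poly_ids:
        if len(set(pixel)) == 1:          # interior pixel: one shade for all samples
            invocations += 1
        else:                             # edge pixel: shade each sample separately
            invocations += samples_per_pixel
    return invocations

# A 4-pixel scanline where only one pixel straddles a polygon edge:
scanline = [[7, 7, 7, 7], [7, 7, 7, 7], [7, 7, 3, 3], [3, 3, 3, 3]]
print(shader_invocations(scanline))  # 3 interior + 1 edge pixel * 4 samples = 7
```

With brute-force supersampling the same scanline would cost 16 invocations, so the fewer edge pixels there are, the closer the cost gets to non-AA rendering.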

Here's some background info and benchmarks:
https://www.anandtech.com/show/911/7
https://www.anandtech.com/show/936/14

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 23 of 35, by Garrett W

Rank Oldbie

Unfortunately, it does miss a lot of surfaces, so maybe there's more to it. Swaaye has a great video up on his channel playing various titles and you can see the card in action. I seem to remember telephone booths in Half-Life 2 not getting the AA treatment.

Reply 24 of 35, by swaaye

Rank l33t++

Quincunx uses 2X MSAA combined with (I think) a full-image filter later in the display pipeline. This was quite a popular thing for Xbox and PS3 games. It is blurrier, but it has benefits similar to SSAA while being the same speed as plain 2X MSAA. On the PC, if you crank the resolution up it can be quite nice. Actually, that might be comparable to what NVidia's DSR does these days, with its optional "smoothness" filtering.
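The filter part of quincunx is usually described as a five-tap blend: the pixel's own sample gets half the weight and four diagonal neighbours share the rest. A small Python sketch, using those commonly cited weights (the exact hardware sample positions and weights are an assumption here):

```python
# Sketch of a quincunx-style resolve over a 2D grayscale image.
# Assumed weights: own sample 1/2, four diagonal neighbours 1/8 each.
# Border pixels clamp to the nearest valid sample.

def quincunx_resolve(img):
    h, w = len(img), len(img[0])

    def px(y, x):  # clamped fetch
        return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    out = []
    for y in range(h):
        row = []
        for x in range(w):
            v = 0.5 * px(y, x) + 0.125 * (
                px(y - 1, x - 1) + px(y - 1, x + 1) +
                px(y + 1, x - 1) + px(y + 1, x + 1))
            row.append(v)
        out.append(row)
    return out

flat = [[8.0] * 4 for _ in range(4)]
print(quincunx_resolve(flat)[1][1])  # flat areas stay unchanged: 8.0
```

Since the weights sum to 1, flat areas pass through unchanged while any detail finer than the kernel gets averaged away, which is exactly where the characteristic blur comes from.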

Reply 25 of 35, by Scali

Rank l33t

Well, from what I can make out based on nVidia's documentation, they constantly talk about texture samples. Not about pixel shader results. Which I interpret as the pixel shader being run for every sample, and the optimization being that texture fetches don't have to be done every time (which made more sense in the early days, as shaders were simple, and texture sampling took up most of the time... as opposed to SM2.0, with much more complicated shaders, and using more arithmetic instead of texture lookups even for simple operations like normalizing a vector).
Another thing that is indeed 'multisample' is that it has a multisample buffer in which it stores the results, which it resolves on-the-fly (from what I understood, in the RAMDAC itself), as opposed to classic SSAA, where the whole framebuffer is rendered at a higher resolution and then downsampled as a whole.


Reply 26 of 35, by Minutemanqvs

Rank Member

Today I finally got a Matrox Parhelia 512 in my collection. I ran a couple of benchmarks on my Athlon MP 2200+ system with 2GB of RAM, running Windows 2000.

IMG-0478.jpg
IMG-0479.jpg

Matrox G400 - 1999, drivers 5.96 with 3D support:
3DMark 2000: 2921
3DMark 2001: 1183
Quake 3 1280x1024: 14 fps

Matrox Millennium P750 (Parhelia LX) - 2003, drivers 2.13 (BSOD in DX9), drivers 1.13 (ok):
3DMark 2000: 4549
3DMark 2001: 4421
Quake 3 1280x1024: 50 fps

Matrox Parhelia 512 128MB, 2003, drivers 1.13:
3DMark 2000: 8248
3DMark 2001: 6732
Quake 3 1280x1024: 94 fps

Radeon 9600 XT - 2003, drivers Catalyst 5.13:
3DMark 2000: 9542
3DMark 2001: 9533
Quake 3 1280x1024: 143 fps

Some notes:
- The Parhelia LX is basically "half" a Parhelia 512 but is easily available
- The most recent "new branch" Parhelia drivers give me a BSOD under 3DMark 2001
- A simple Radeon 9600 (non pro/XT) is probably what matches a full Parhelia 512 the best, with proper DX9 support
- I had to set AGP Aperture Size to a minimum of 128MB in my BIOS for the P750 to be happy under 3DMark 2001, otherwise it would give me a "not enough video memory error"
- I didn't test my Matrox G550, it has basically the same performance as the G400

Still, it is a fun card to have and I'll keep it in that system.

Searching a Nexgen Nx586 with FPU, PM me if you have one. I have some Athlon MP systems and cookies.

Reply 29 of 35, by Minutemanqvs

Rank Member
agent_x007 wrote on 2023-06-19, 17:38:

Parhelia 512 may get firmware updates from Matrox (not sure if they help in your case)
Also, there are two revisions :
Early one with AGP 3.3V support, and later one with AGP 1.5V only.

Yep, firmware updates were the first thing I did on all the Matrox cards I got my hands on. And yes, I specifically wanted the first Parhelia generation, which supports both 3.3V and 1.5V AGP (the LX is 1.5V only).


Reply 30 of 35, by dr.zeissler

Rank l33t

The cooler is very loud on my Parhelia... I never found a smaller, better aftermarket cooling solution...
Putting a big heatsink on it ruins the design of the card.

Retro-Gamer 😀 ...on different machines

Reply 31 of 35, by Minutemanqvs

Rank Member

I wonder why Matrox is the only "mass-market" manufacturer that actually released firmware updates to the public for just about all their cards. Either their cards were extremely buggy (I don't recall that being the case) or they have better support than usual.


Reply 32 of 35, by Scali

Rank l33t
Minutemanqvs wrote on 2023-06-19, 17:56:

I wonder why Matrox is the only "mass-market" manufacturer that actually released firmware updates to the public for just about all their cards. Either their cards were extremely buggy (I don't recall that being the case) or they have better support than usual.

My guess is that it's because Matrox is also the only manufacturer that makes the entire card: GPU, board, cooler, memory, everything.
So they know the exact specs of their hardware, and as such there's no risk that a firmware won't work on certain boards because they have a different design, different spec of memory, different cooler or whatever.
In that sense it's more similar to a BIOS update for a motherboard, which is very common.


Reply 33 of 35, by TeaMonster

Rank Newbie

I am testing a Mystique that a friend has loaned me. The code on the back is MYST/481. I was going to use it for capturing old VHS tapes in my collection. The card I have does NOT have the Rainbow Runner daughter board. Will it still be able to capture video? I have drivers for Win2K and it shows up and installs, but I can't get any capture software to see it. Am I wasting my time?

Reply 34 of 35, by Babasha

Rank Oldbie
TeaMonster wrote on 2023-12-09, 20:43:

I am testing a Mystique that a friend has loaned me. The code on the back is MYST/481. I was going to use it for capturing old VHS tapes in my collection. The card I have does NOT have the Rainbow Runner daughter board. Will it still be able to capture video? I have drivers for Win2K and it shows up and installs, but I can't get any capture software to see it. Am I wasting my time?

Without the Rainbow Runner it's useless for capture.

Need help? Begin with photo and model of your hardware 😉

Reply 35 of 35, by dr.zeissler

Rank l33t

I recently took out my G550 to replace it with a G450. The G450 has better compatibility (EMBM in Expendable works).
The G450 is best at 800x600 16-bit; if you switch to 32-bit the framerate drops by up to 35%. That would be no problem, because 16-bit looks really good, but there is one issue: you get black wobbling lines on texture edges. I don't know if this is due to lighting or something else. In 32-bit it seems to be gone, at the cost of much lower framerates.

To me the most important things are image quality and driver support, therefore Matrox is my favorite, even if the 3D performance and feature set (no vsync in OGL) are much worse than the competitors'.
