VOGONS


First post, by leileilol

Rank: l33t++

Does anyone have those late Matrox cards? All I ever read about them is how they can play HL2 but they're "not fast STAY AWAY youl'l slow in counter strick!!!" - the general enthusiast opinion about them. Nothing about the drivers or their flexibility toward legacy compatibility.

No, I don't have them, but I wonder if someone here does and wouldn't mind running the usual '96-'02 stuff (Hellbender/MTM, LucasArts games, FF7, I76/BZ, Falcon 4, etc.) through them.

long live PCem

Reply 1 of 19, by swaaye

Rank: l33t++

The problem is they were expensive for years while also sucking. I wouldn't mind having one. They are definitely an interesting gaming curiosity.

Reply 2 of 19, by sprcorreia

Rank: Oldbie

I have a Parhelia right here. It was one of my choices for my 2K2 machine, but the Radeon 9700 took the spot.

I can do a bit of testing when I'm finished with my 1996 machine. It should take no more than 2 or 3 days.

Reply 3 of 19, by ProfessorProfessorson

Rank: Member

I could be wrong, but I seem to remember this card being marketed not so much for being fast, but more for its image quality and multi-monitor support in games. Basically Matrox's Voodoo 5.

Reply 4 of 19, by feipoa

Rank: l33t++

I own a few different styles of Matrox cards. They are listed below.

Matrox Millennium G200 PCI
Matrox Millennium G400 MAX AGP
Matrox Millennium G450 PCI
Matrox Millennium G550 AGP
Matrox Parhelia 128 AGP
Matrox Parhelia P256 PCI-X (64-bit/66 MHz, but also works in a PCI slot)
Matrox P650 AGP

From my readings, it seems the Matrox Parhelia 128 AGP is really the next big leap up from the G400 MAX, and is supposed to be vastly superior to anything else they have produced. Matrox decided not to develop Win9x drivers for it though; it will only work with W2K and above. It is keyed such that it should work in older AGP 2X ports, however when I plugged the card into my 1 GHz Asus PV34X, the system beeped a few times and the screen never came up. I have a few other AGP 2X ports I've been meaning to test it in. The G550 and P650 both worked fine in the AGP 2X ports I tested them in.

The Parhelia P256 PCI-X card is an odd-ball. Surprisingly, it didn't work in the PCI-X slot I hoped it would. It turns on and can be used, but it crashes when you try to install the drivers. Matrox was unable to assist. I read in a forum that either Intel (whose motherboard/BIOS I'm using) or Matrox needs to correctly address this issue with a new BIOS; neither party has been willing to cooperate. The card will, however, work fine with drivers in a non-PCI-X slot.

The only other ones missing that I'd consider owning are the P690 (PCI) and P750 (AGP). Both are the latest-production Matrox PCI/AGP cards.

I'll throw in some test results if I can ever get a board which works with the Parhelia 128 AGP. I don't have an AGP 4X/8X board though (too new for my interest).

Plan your life wisely, you'll be dead before you know it.

Reply 5 of 19, by SquallStrife

Rank: l33t

All I remember about Parhelia is that one of the Aussie tech mags did a write up on how great gaming could be on multiple monitors, but not on such a monumentally sucky card. They included a few photos of a test rig running Neverwinter Nights on three screens, which at the time was OMG HOLY CRAP stuff.

There seem to be a few of the 256MB variety floating around on flea-bay:

http://www.ebay.com.au/itm/270688243961

VogonsDrivers.com | Link | News Thread

Reply 6 of 19, by sliderider

Rank: l33t++

What I remember is only a few weeks after Parhelia was released, ATi released the Radeon 9700 and that was it for Matrox. Any chance they had at remaining competitive after that went out the window. They bet their future on that card and lost. The only things Parhelia had going for it were 2D image quality (like every Matrox card) and triple head. Other than that it was slower than Radeon 9700 and the shaders turned out not to be fully compatible with the DX9 standard in spite of Matrox' early claims that they were. Between the new Radeon 9700 and the still popular GeForce4 Ti, the Parhelia couldn't find a market. Matrox was relegated to scratching out a space in whatever niche markets it could after that. Parhelia cards are used a lot in the medical profession where 2D image quality is the most important thing.

Reply 7 of 19, by SquallStrife

Rank: l33t
sliderider wrote:

Matrox was relegated to scratching out a space in whatever niche markets it could after that. Parhelia cards are used a lot in the medical profession where 2D image quality is the most important thing.

Matrox makes an 8-head video card too, for... stock brokers I guess?

VogonsDrivers.com | Link | News Thread

Reply 8 of 19, by swaaye

Rank: l33t++

Their many-output products seem to be their niche. They certainly don't seem to be the #1 choice for OpenGL or D3D, and DVI solved signal quality issues 10 years ago.

sliderider wrote:

What I remember is only a few weeks after Parhelia was released, ATi released the Radeon 9700 and that was it for Matrox. Any chance they had at remaining competitive after that went out the window. They bet their future on that card and lost.

Post G400, they began to lose many engineers to NVIDIA and ATI. Matrox did some suing over that. My impression, while following the MURC forums over those years, was that Matrox had little interest in trying to compete with NVIDIA and ATI.

Reply 9 of 19, by feipoa

Rank: l33t++
SquallStrife wrote:

There seem to be a few of the 256MB variety floating around on flea-bay:

Unfortunately the 256 MB variety of the Parhelia is not keyed to work with 3.3 V AGP, only 1.5 V. The 128 MB variety is keyed for both 1.5 V and 3.3 V, so I figured it would have the best possibility of working in an AGP 2X system. I am going to test it out in a variety of boards. My hope was to put it in my Matrox/Cyrix MII-433GP Super7 dream machine; otherwise I'll be stuck with the next best thing: the G400 MAX.

EDIT: I tried the double-slotted Parhelia 128 AGP in an SS7 board, the FIC PA-2013, which has a 2X slot. The system does turn on, but Speedsys crashes 10 seconds into the memory test; the G550, however, works fine.

From Wikipedia, "Some cards incorrectly have dual notches", and I believe this may be the case for the Parhelia 128 AGP, which would explain why the Parhelia 256 AGP is keyed for 1.5 V systems only.

EDIT 2: I also tried the Parhelia 128 AGP in another popular SS7 board, an Asus P5A-B, and the screen did not come up. On the other hand, I tested the card in one final SS7 board, a HOT-591P rev. 3.1, and the Parhelia seemed to function properly, at least for the duration of the Speedsys test. If this really is a 1.5 V graphics card, then running it at 3.3 V should eventually damage the Parhelia and some components on the motherboard. Perhaps the HOT-591P is a bit more tolerant of the higher AGP voltage? Or perhaps the Parhelia 128 is indeed 3.3 V tolerant but few older 2X motherboards support it for whatever reason?

Plan your life wisely, you'll be dead before you know it.

Reply 10 of 19, by Putas

Rank: Oldbie

I put a Parhelia APVe in my parents' computer. Played some C&C Generals on it; it did just fine. It is more than enough for '90s games. I might try more stuff next time.

Reply 11 of 19, by sliderider

Rank: l33t++
swaaye wrote:

Their many-output products seem to be their niche. They certainly don't seem to be the #1 choice for OpenGL or D3D, and DVI solved signal quality issues 10 years ago.

sliderider wrote:

What I remember is only a few weeks after Parhelia was released, ATi released the Radeon 9700 and that was it for Matrox. Any chance they had at remaining competitive after that went out the window. They bet their future on that card and lost.

Post G400, they began to lose many engineers to NVIDIA and ATI. Matrox did some suing over that. My impression, while following the MURC forums over those years, was that Matrox had little interest in trying to compete with NVIDIA and ATI.

It sounds like that was the cause of their downfall, then. Not caring what the competition is doing isn't very smart. They must have been interested in competing at some point, because the G200/G400 weren't that bad for gaming. When they dropped the 128-bit memory bus for 64-bit in the G450 to save money (resulting in lower performance than the G400 in spite of using DDR memory) and then released the DX 8.1 Parhelia right as the DX9 era was about to kick off, they pretty much lost any fans they had as far as gaming is concerned. Which is sad, really, because the G400 tech demo and the Coral Reef demo for Parhelia were both pretty damn impressive, and Parhelia brought the possibility of ultra-widescreen gaming with its triple head technology years before ATi. I'm pretty sure Parhelia is still the only triple head option for AGP motherboards.

Parhelia triple head demonstration
http://www.youtube.com/watch?v=Td0TB421M9Q&feature=related

G400 TechDemo in HD
http://www.youtube.com/watch?v=P_YiZzi9PkU

Parhelia Coral Reef Demo HD
http://www.youtube.com/watch?v=qFkZFMQn_wo

Reply 12 of 19, by Putas

Rank: Oldbie

The G400 was a nice chip, but keeping up with early-'99 products was not the hard part. It was the GeForce leap that sorted the competitors from the losers. Hard to say if Matrox had a chance at that point. The disappointment was not the more efficient G400 follow-up (well, I still chuckle when I remember Head Casting), but the lack of a high-end part. The fans got trolled by G800 strings in some driver release and endless silence about the future. When Parhelia came, three years after the G400 (!), Matrox certainly tried to appeal to gamers as well. DirectX 9 was not going to be relevant that soon. And they could still boast some quality advantage, as they were the first (IIRC) with 10 bits per channel.

Reply 13 of 19, by swaaye

Rank: l33t++

There are a few posters at Beyond3D who followed Matrox's post G400 years very closely and know some details from insiders.

Nappe1 wrote:

Head Casting was in the G550. It was the only software that ever used the G550's vertex shader (and the only thing that did materialize from the G800 projects).

No one knows (or those who know do not talk about) what the two mysterious G400 successors in development were: the drivers knew them as Fusion and F800.

The years after the G400 were quite chaotic at Matrox, as Matrox had a case going against Nvidia over some workers switching companies, and pretty wild rumours were flying around about what made Matrox sue Nvidia. The case never went to the end but was most likely settled with money.

Parhelia again was a different beast. Its eventual fate came down to being 10 months late and (the revision from the foundry must have come back totally dead) missing its planned clocks by a lot. AFAIK, Parhelia was planned for 275-300 MHz; it started shipping at 200 to 220 MHz. Later on they did an update for it, which shipped at 250 MHz but, according to a few users, was definitely rock solid at 300 MHz. Yet this happened without any press releases, 1.5 years after launch, and as everyone had gone full DX9 by then, Matrox faded away.

Nevertheless, Parhelia is still the only card providing 3 displays with 2 of them having independent hardware video layer support with full acceleration features for video. That's why its video siblings, the X100 and X1, were and I think still are quite popular in real-time WYSIWYG video editing. Another thing is that the mainstream gaming card business moves waaayyy too fast for video specialists. They need to be sure that the hardware they get is still 100% supported years ahead; not just a card with non-optimized drivers included in packs available 6-12 times a year, but real support and problem solving.

http://forum.beyond3d.com/showthread.php?p=12 … 800#post1219017
http://forum.beyond3d.com/showthread.php?t=92 … t=parhelia+g800
http://www.bluesnews.com/cgi-bin/finger.pl?id … =20020627230700

Reply 14 of 19, by sliderider

Rank: l33t++
Putas wrote:

The G400 was a nice chip, but keeping up with early-'99 products was not the hard part. It was the GeForce leap that sorted the competitors from the losers. Hard to say if Matrox had a chance at that point. The disappointment was not the more efficient G400 follow-up (well, I still chuckle when I remember Head Casting), but the lack of a high-end part. The fans got trolled by G800 strings in some driver release and endless silence about the future. When Parhelia came, three years after the G400 (!), Matrox certainly tried to appeal to gamers as well. DirectX 9 was not going to be relevant that soon. And they could still boast some quality advantage, as they were the first (IIRC) with 10 bits per channel.

The point I was trying to make, though, was that high-end gamers who wanted the latest technologies would have waited for the Radeon 9700 to come out. Those who were happy with DX8-generation features and wanted the best framerates would have opted for some variant of the GeForce4 Ti, so Parhelia was being squeezed by two strong competitors and never really had much of a chance. Triple head was an appealing feature, but who could have afforded 3 monitors back then? Most people were still using CRTs, and three of them would have taken up too much space and drawn too much power.

Reply 15 of 19, by F2bnp

Rank: l33t

That coral reef demo is amazing. Hadn't seen it before!

Reply 16 of 19, by Putas

Rank: Oldbie

I see, it sounds feasible. So the G800 existed but could not be completed because important people went to Nvidia. The next-generation Parhelia came out too late and did not reach its clock targets.
Sure, there was a small group of gamers who appreciated triple head and CSAA, those serious about flight simulators for example.

Reply 17 of 19, by feipoa

Rank: l33t++
swaaye wrote:

Interesting links to read thru. As an oversimplified conclusion, it seems the Parhelia 128 AGP had the ballpark performance of an ATI 8500 or an Nvidia GeForce 3/4, give or take some, and application-dependent.

Plan your life wisely, you'll be dead before you know it.

Reply 18 of 19, by sliderider

Rank: l33t++
feipoa wrote:
swaaye wrote:

Interesting links to read thru. As an oversimplified conclusion, it seems the Parhelia 128 AGP had the ballpark performance of an ATI 8500 or an Nvidia GeForce 3/4, give or take some, and application-dependent.

John Carmack's blog post was the most telling concerning how disappointing Parhelia really was.

June 25, 2002
-------------
The Matrox Parhelia Report:

The executive summary is that the Parhelia will run Doom, but it is not performance competitive with Nvidia or ATI.

Driver issues remain, so it is not perfect yet, but I am confident that Matrox will resolve them.

The performance was really disappointing for the first 256 bit DDR card. I tried to set up a "poster child" case that would stress the memory subsystem above and beyond any driver or triangle level inefficiencies, but I was unable to get it to ever approach the performance of a GF4.

The basic hardware support is good, with fragment flexibility better than GF4 (but not as good as ATI 8500), but it just doesn't keep up in raw performance.

With a die shrink, this chip could probably be a contender, but there are probably going to be other chips out by then that will completely eclipse this generation of products.

None of the special features will be really useful for Doom:

The 10 bit color framebuffer is nice, but Doom needs more than 2 bits of destination alpha when a card only has four texture units, so we can't use it.

Anti aliasing features are nice, but it isn't all that fast in minimum feature mode, so nobody is going to be turning on AA. The same goes for "surround gaming". While the framerate wouldn't be 1/3 the base, it would still probably be cut in half.

Displacement mapping. Sigh. I am disappointed that the industry is still pursuing any quad based approaches. Haven't we learned from the stellar success of 3DO, Saturn, and NV1 that quads really suck? In any case, we can't use any geometry amplification scheme (including ATI's truform) in conjunction with stencil shadow volumes.

Reply 19 of 19, by swaaye

Rank: l33t++

In some games, it can match a Ti 4400/4600. The game needs to utilize enough texture layers and leverage the 4x4 (4 TMUs per pipe) design of the GPU.
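
Just to illustrate what "enough texture layers" means (a rough sketch of my own, not code from any particular game; it assumes an OpenGL 1.3+ header/loader and four valid texture IDs): a pass that binds several textures at once is exactly the kind of draw call that keeps all four TMUs per pipe busy.

    /* Illustrative only: one fixed-function pass using four texture layers. */
    #include <GL/gl.h>

    void draw_quad_with_four_texture_layers(const GLuint tex[4])
    {
        /* Bind one texture to each of the first four fixed-function units. */
        for (int i = 0; i < 4; ++i) {
            glActiveTexture(GL_TEXTURE0 + i);
            glEnable(GL_TEXTURE_2D);
            glBindTexture(GL_TEXTURE_2D, tex[i]);
            glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
        }

        /* One screen-aligned quad; every unit gets the same coordinates here. */
        static const float st[4][2] = { {0, 0}, {1, 0}, {1, 1}, {0, 1} };
        glBegin(GL_QUADS);
        for (int v = 0; v < 4; ++v) {
            for (int i = 0; i < 4; ++i)
                glMultiTexCoord2f(GL_TEXTURE0 + i, st[v][0], st[v][1]);
            glVertex2f(st[v][0] * 2.0f - 1.0f, st[v][1] * 2.0f - 1.0f);
        }
        glEnd();
    }

A game that only ever binds one or two layers per pass leaves most of those TMUs idle, which is presumably part of why the chip only shines in a handful of titles.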

I think Doom3's advanced rendering techniques exposed a lot of flaws and inefficiencies in the chip that older games didn't, simply because they made lighter demands on the GPU.