VOGONS


First post, by 386SX

Rank: l33t

Hi,
how many of you were expecting this card to be a great return to the 3D gaming industry? I remember expecting so much from the first reviews, and I still like this card and its "complexity".
Maybe expectations were as high as with the Savage2000 chip and the Rage Fury Maxx.
Bye

Reply 1 of 8, by Logistics

Rank: Oldbie

I have a 128 AGP, and I have always wanted to see one of these output 10-bit color. These are some of the earliest examples I can think of that brought 10-bit output to consumer-grade cards.
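
For what it's worth, here is a minimal Direct3D 9 sketch (my own illustration, not Matrox sample code) of how an application can probe whether the driver exposes a 10-bit-per-channel (A2R10G10B10) back buffer. Whether the Parhelia driver actually reports it will depend on the driver version:

// Minimal sketch: check for a 10-bit (A2R10G10B10) back buffer under Direct3D 9.
// Build against the DirectX 9 SDK and link with d3d9.lib.
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        printf("Direct3D 9 is not available\n");
        return 1;
    }

    // D3D9 only accepts a 10-bit back buffer in full-screen mode, hence bWindowed = FALSE.
    HRESULT hr = d3d->CheckDeviceType(D3DADAPTER_DEFAULT,
                                      D3DDEVTYPE_HAL,
                                      D3DFMT_A2R10G10B10,  // display (adapter) format
                                      D3DFMT_A2R10G10B10,  // back buffer format
                                      FALSE);              // full-screen only

    printf("10-bit back buffer: %s\n", SUCCEEDED(hr) ? "supported" : "not supported");

    d3d->Release();
    return 0;
}

If the check succeeds, the same format goes into D3DPRESENT_PARAMETERS as the BackBufferFormat when creating the device.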

Reply 2 of 8, by dionb

Rank: l33t++

In fact it was a great card, with features that did what they were supposed to and refreshingly few driver issues. Just three problems killed it commercially:
- it arrived significantly later than the cards it was supposed to out-compete.
- it was significantly slower than promised (bad yields and scaling), which prevented it from taking on the successors to those cards.
- at release it was competitive with the Radeon 8500 in performance terms, but it failed to beat the GeForce 4 series and cost more than both. Then, within a few weeks, ATi dropped the 9700 bomb, killing it stone dead.

None of those competitors offered triple-head gaming, and none offered 10-bit HDR, but neither did any mainstream games at the time (or for a long time afterwards), so those features didn't compensate for the lackluster performance and excessive price. It was the G400 all over again, with the difference that the G400 may have been overpriced and nobody may have used its unique features (DualHead and bump mapping), but at least the G400 MAX was, at time of release, the fastest video card on the planet, at least for Direct3D. Let's not mention its OpenGL drivers...

Reply 3 of 8, by The Serpent Rider

Rank: l33t++

It was a flawed GPU from the beginning, one that was not designed for games: an excessive 4 TMUs per pixel pipe, a low clock speed, poor vertex shader performance, and no HSR or Z-buffer compression at all.

It was the G400 all over again

Not really. The G400 was clearly designed with gaming in mind and was comparable to its competitors. Drivers were somewhat lacking, but the hardware potential was there.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 4 of 8, by swaaye

Rank: l33t++

It was also released with some significant hardware flaws. Late, incomplete, slow and very expensive.

I think the most interesting aspect to it is the fragment anti-aliasing.

Reply 5 of 8, by Putas

Rank: Oldbie

The Serpent Rider wrote:

It was a flawed GPU from the beginning, one that was not designed for games: an excessive 4 TMUs per pixel pipe, a low clock speed, poor vertex shader performance, and no HSR or Z-buffer compression at all.

I consider it more a result of limited development possibilities. We don't know how manufacturing affected the clock speed. How appealing are bandwidth-saving techniques when you have that much bandwidth to begin with? The TMU ratio is really strange, but could they use that multiplicity outside of games?

Reply 6 of 8, by dionb

Rank: l33t++

The Serpent Rider wrote:

It was a flawed GPU from the beginning, one that was not designed for games: an excessive 4 TMUs per pixel pipe, a low clock speed, poor vertex shader performance, and no HSR or Z-buffer compression at all.

The low clock speed was most definitely not by design; it was supposed to launch with a 50% higher clock. The lack of advanced features was supposed to be compensated for by raw bandwidth, which is a valid strategy. Of course, if your yields are crap and you can't launch at anything near the expected speeds, that strategy totally backfires.

It was the G400 all over again

Not really. The G400 was clearly designed with gaming in mind and was comparable to its competitors. Drivers were somewhat lacking, but the hardware potential was there.

In terms of performance versus the rest of the market it was in a very similar position. The causes of the disappointing performance relative to expectations were different, though.

Spot-on timing given this topic: yesterday I *finally* (after over a month of frustrating haggling and miscommunication) managed to reach an agreement with someone selling a pile of AGP cards, including a beautiful Parhelia 512 128MB. It should arrive on Saturday or so, although I won't be able to even open the package until a week later, and won't have time to do anything with it for another week after that. Too much travel. But if it works it's getting pride of place in one of my systems; the great thing about retro computing is that you don't need to care about the price/performance ratio the way you did when the card was new 😉

Reply 7 of 8, by cxm717

Rank: Member

8 or 9 months ago I got two Matrox Parhelia cards and a set of three 1280x1024 19" panels. One is a 128MB retail card (207MHz, AGP 4x) and the other is a newer 256MB card (250MHz, AGP 8x). I benchmarked every driver version in a number of games and with a bunch of CPUs. Its performance is all over the place: sometimes it's around a Radeon 8500 or GeForce3 (Comanche, Giants; those were the games it ran worst in compared to the Radeon and GeForce), and other times it was as fast as a higher-clocked GeForce4 Ti (Halo, Serious Sam SE). Game performance also varied a lot between drivers; MDK2 is over 30% faster with driver 105_107. The Parhelia seems to really like fast CPUs, and it took a Core 2 at 2.66GHz to max it out.

I also found that most (maybe all) drivers have a bug that causes a hard lock or a BSOD on a fast dual-core or dual-CPU system. This didn't seem to happen with slower dual cores, and it happened more often with faster ones (I tested up to a Core 2 at 3.4GHz). If you disable a CPU core at boot, the card works fine though (see the boot.ini sketch below). Also, I was really surprised the card ran FEAR, STALKER SoC and Half-Life 2 pretty well.
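
For anyone who wants to try the same workaround, this is roughly what the relevant boot.ini entry looks like on XP. The ARC path and partition numbers shown are just the common defaults and will differ per system; the /onecpu switch is what restricts Windows to a single processor:

[boot loader]
timeout=5
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP (single CPU for the Parhelia)" /fastdetect /onecpu

/numproc=1 works too, and keeping a second, unmodified entry in [operating systems] makes it easy to boot with both cores again when the Parhelia isn't in use.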

Overall I think it is a really nice card. I have it in an XP system for playing UT99/2k4, Quake 3 and Need for Speed (Hot Pursuit 2 and Underground).