VOGONS


First post, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t

Well, I'm talking about the 16x Fragment Anti-Aliasing originally used by the Matrox Parhelia. The method takes only the "fragment pixels" (pixels on the edge of an object) and collects 16 sub-pixel samples for each of them for AA purposes.

Theoretically, this method means a much bigger sample size with less fill-rate penalty, especially since fragment pixels typically account for less than ten percent of the total number of pixels in the scene.

So theoretically, the Fragment AA method allows a bigger sample size with a smaller performance penalty.
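
To put rough numbers on the idea, here is a minimal C sketch of the sample budget, assuming the roughly-ten-percent edge-pixel figure above. The resolution and the is_edge_pixel classifier are made up for illustration; a real rasterizer knows edge coverage from the triangles themselves, and none of this is Matrox's actual implementation.

[code]
#include <stdio.h>

/* Minimal sketch of the edge-only ("fragment") AA sample budget.
   Only pixels flagged as lying on a polygon edge get the 16 coverage
   samples; every other pixel is shaded once. */

#define WIDTH            1024
#define HEIGHT            768
#define EDGE_SAMPLES       16
#define INTERIOR_SAMPLES    1

/* Hypothetical classifier: here we simply pretend that about 10% of
   the pixels sit on a polygon edge. */
static int is_edge_pixel(int x, int y)
{
    return (x * 7 + y * 13) % 10 == 0;
}

int main(void)
{
    long long faa = 0, ssaa = 0;

    for (int y = 0; y < HEIGHT; y++)
        for (int x = 0; x < WIDTH; x++) {
            faa  += is_edge_pixel(x, y) ? EDGE_SAMPLES : INTERIOR_SAMPLES;
            ssaa += EDGE_SAMPLES;   /* 16x supersampling touches every pixel */
        }

    printf("Avg samples/pixel, 16x fragment AA : %.2f\n",
           (double)faa  / (WIDTH * HEIGHT));   /* about 2.5  */
    printf("Avg samples/pixel, 16x supersample : %.2f\n",
           (double)ssaa / (WIDTH * HEIGHT));   /* exactly 16 */
    return 0;
}
[/code]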

Judging from the picture below, this method produces razor-sharp, very clean edges. I think this is because sample size is always more important than sampling pattern when it comes to AA quality, and edge AA allows a much bigger sample size (16x compared to the typical 4x or 8x used in conventional FSAA methods).

PARHELIA_AA.jpg
(image copied from the First Look: Matrox's Parhelia-512 graphics processor article on Tech Report ([url=http://www.techreport.com/reviews/2002q2/parhelia/index.x?pg=9]Page 9[/url])).

The question is..... why have neither ATI nor nVidia adopted this AA method? Today's video cards have much more processing power than those of the Parhelia's era, so imagine the beauty of 64x Fragment AA or the like. Or how about enabling AA with very little performance penalty?

What are the problems with this 16x Fragment AA method that have kept both ATI and nVidia from adopting it? Okay, granted, Fragment AA does not eliminate texture shimmering (unlike conventional FSAA, which eliminates both edge aliasing and texture crawling), but there is Anisotropic Filtering for that purpose.

I have to admit that I missed the Parhelia the first time it came around, but did anyone here ever have that card? Based on your experience, did you find so many problems with Fragment AA that it's actually not worth it?

Reply 1 of 4, by Reckless

User metadata
Rank Oldbie

Only a diehard Matrox supporter would have purchased a Parhelia for gaming! I did consider it... I had been a long-time Matrox user up until then... but given the price it simply wasn't worth it.

I [still] don't use any kind of AA in games, as I don't really like the soft feel everything gets. Perhaps Matrox's technology would have been useful... if they'd had a powerful enough card at a reasonable price 😀

Reply 2 of 4, by eL_PuSHeR

User metadata
Rank l33t++

I think Anisotropic Filtering may remove most of the blurriness, but at a high performance penalty. Obviously, your mileage may vary.

On the other hand, by the time the Parhelia made its debut, there were already faster and less expensive cards out there.

Intel i7 5960X
Gigabyte GA-X99-Gaming 5
8 GB DDR4 (2100)
8 GB GeForce GTX 1070 G1 Gaming (Gigabyte)

Reply 3 of 4, by Snover

User metadata
Rank l33t++

If this technology really is as good as it claims, my guess is that it's patent-encumbered and therefore can't be implemented.

Yes, it’s my fault.

Reply 4 of 4, by Sol_HSA

User metadata
Rank Member

The primary reason is most likely patent issues, although I don't know for sure.

I can think of a couple of possible downsides to fragment AA:

a) Antialiasing low-resolution textures. If you want to make, let's say, a HUGE starship and texture it without bilinear interpolation to give it that "Star Wars" look (at least on one texture layer), you probably want to antialias the edges of the texels as well. (I don't really know if anyone's doing this, though.. =)

b) Somewhat related: antialiasing of complex shaders inside polygons (see the sketch after this list).

c) Nowadays games tend to have zillions of polygons, and as such it's a bit hard to say which pixels should be given special care on the AA front.
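
To illustrate point (b), here is a minimal C sketch with a made-up high-frequency stripe "shader" (the names stripe and PIXEL_WIDTH are invented for this example). An interior pixel far from any polygon edge gets no fragment-AA samples, so a single sample snaps to black or white and shimmers as the camera moves, while brute-force 16x supersampling averages it out, at full cost on every pixel:

[code]
#include <stdio.h>
#include <math.h>

#define PIXEL_WIDTH 0.01   /* made-up pixel footprint in shader space */

/* Hypothetical high-frequency "shader": hard black/white stripes. */
static double stripe(double x)
{
    return sin(x * 200.0) > 0.0 ? 1.0 : 0.0;
}

int main(void)
{
    /* An interior pixel, far from any polygon edge, so edge-only AA
       leaves it alone: one sample at the pixel centre picks up
       whichever stripe it happens to land on. */
    double x = 0.123;
    double one_sample = stripe(x);

    /* Brute-force 16x supersampling averages sub-samples across the
       pixel footprint and smooths the pattern, but it pays 16x the
       shading cost on every pixel, edge or not. */
    double sum = 0.0;
    for (int i = 0; i < 16; i++)
        sum += stripe(x + ((i + 0.5) / 16.0 - 0.5) * PIXEL_WIDTH);

    printf("single sample : %.2f\n", one_sample);   /* hard 0 or 1     */
    printf("16x average   : %.2f\n", sum / 16.0);   /* in-between grey */
    return 0;
}
[/code]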

http://iki.fi/sol - my schtuphh