ATI does not 'cheat' here; their R200 architecture simply cannot apply trilinear and anisotropic filtering at the same time: enable anisotropic filtering and the mip transitions fall back to bilinear. The result is a slight loss in filtering quality, but nothing major. They did 'cheat' with overall texture quality in some drivers (the Quack debacle), but those 'optimizations' were quickly rolled back.
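To make the limitation concrete, here's a toy sketch in C++ (entirely my own model of the behavior, not ATI's driver code): ask R200-class hardware for trilinear plus anisotropic, and the mip filter silently degrades to bilinear.

#include <cstdio>

// Toy model of R200-class filter-state resolution: trilinear and
// anisotropic are mutually exclusive on this hardware.
enum class MipFilter { Bilinear, Trilinear };

struct FilterState {
    MipFilter mip;
    int       maxAniso;   // 1 = anisotropic filtering off
};

// What the application asks for vs. what the hardware can actually do.
FilterState resolveR200(MipFilter requestedMip, int requestedAniso)
{
    FilterState out{requestedMip, requestedAniso};
    if (requestedAniso > 1 && requestedMip == MipFilter::Trilinear) {
        // Keep the AF, drop to bilinear between mip levels: this is the
        // "slight loss" (visible mip transition bands) mentioned above.
        out.mip = MipFilter::Bilinear;
    }
    return out;
}

int main()
{
    FilterState s = resolveR200(MipFilter::Trilinear, 16);
    std::printf("effective: %s + %dx aniso\n",
                s.mip == MipFilter::Trilinear ? "trilinear" : "bilinear",
                s.maxAniso);
    return 0;
}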
That article is full of shit wrt the R300, by the way. I was around at the time of the R300 anisotropic filtering optimizations, and those optimizations used adaptive levels of filtering depending on surface angle (full quality was kept on horizontal and vertical surfaces); the resulting IQ difference was negligible. Regardless, they quickly added a toggle to the drivers to disable that optimization! The fact that the website couldn't produce any meaningful screenshot comparisons is testament to how small the difference was.
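For anyone who wasn't around then, here's a rough toy model of what angle-dependent adaptive AF means. This is entirely my own sketch; the angles and the falloff curve are assumptions for illustration, not R300 hardware logic. The idea: the full anisotropy degree is only spent when the texture footprint lines up with the preferred angles, and a reduced degree is used in between.

#include <algorithm>
#include <cmath>
#include <cstdio>

// Toy model of angle-dependent ("adaptive") anisotropic filtering.
// dudx/dvdx/dudy/dvdy are the texture-coordinate derivatives of a pixel.
int adaptiveAnisoDegree(float dudx, float dvdx, float dudy, float dvdy,
                        int maxDegree)
{
    // Pick the major axis of the pixel's footprint in texture space.
    float lenX = std::hypot(dudx, dvdx);
    float lenY = std::hypot(dudy, dvdy);
    float mu = (lenX >= lenY) ? dudx : dudy;
    float mv = (lenX >= lenY) ? dvdx : dvdy;

    // Angle of that axis, folded into 0..45 degrees off-axis.
    float angle = std::fabs(std::atan2(mv, mu)) * 180.0f / 3.14159265f;
    float a = std::fmod(angle, 90.0f);
    float offAxis = std::fmin(a, 90.0f - a);             // 0 = axis-aligned

    // Assumed falloff: full degree at 0/45/90 degrees, down to half of it
    // at 22.5 degrees. The real curve is unknown to me; this is the idea.
    float offPref = std::fmin(offAxis, 45.0f - offAxis); // 0..22.5
    float scale   = 1.0f - 0.5f * (offPref / 22.5f);
    return std::max(1, (int)std::lround((float)maxDegree * scale));
}

int main()
{
    // An axis-aligned surface keeps the full 16x...
    std::printf("axis-aligned: %dx\n",
                adaptiveAnisoDegree(0.9f, 0.0f, 0.0f, 0.05f, 16));
    // ...while a surface at an awkward angle gets a reduced degree.
    std::printf("22.5 degrees: %dx\n",
                adaptiveAnisoDegree(0.83f, 0.34f, -0.02f, 0.05f, 16));
    return 0;
}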
Saying ATI 'cheated' with those adaptive filtering optimizations is like saying EVERYONE cheats today with adaptive anti-aliasing algorithms, tessellation optimizations, etc. Did nVidia cheat with their shitty anti-aliasing as well, when 4x nVidia AA was merely equivalent to 2x ATI AA? Especially when you had no 'Quality' setting at all?
https://www.pcper.com/reviews/Graphics-Cards/ … liasing-Quality
We can see that 2x anti-aliasing on the Radeon 9700 is as good as 4x anti-aliasing on the NVIDIA cards.
No, they simply optimized differently. No card has PERFECT IQ.
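Here's the sample-grid arithmetic behind that, as a little sketch. The sample positions are illustrative assumptions on my part, not the exact hardware patterns, but the geometry of the argument holds: on a near-vertical edge, only the distinct horizontal sample offsets produce coverage steps, and a 2x rotated grid has just as many of them as a 4x ordered grid.

#include <cstdio>
#include <set>
#include <utility>
#include <vector>

// One AA pattern: a named set of sub-pixel sample positions.
struct Pattern {
    const char* name;
    std::vector<std::pair<float, float>> samples;
};

// On a near-vertical edge, coverage can only change where a sample column
// sits, so the distinct x offsets set the number of gradient steps.
static int distinctX(const Pattern& p)
{
    std::set<float> xs;
    for (const auto& s : p.samples) xs.insert(s.first);
    return (int)xs.size();
}

int main()
{
    std::vector<Pattern> patterns = {
        // 4x ordered grid (2x2): four samples, but only two columns/rows.
        {"4x ordered grid", {{0.25f, 0.25f}, {0.75f, 0.25f},
                             {0.25f, 0.75f}, {0.75f, 0.75f}}},
        // 2x rotated/sparse grid: two samples, two columns AND two rows.
        {"2x rotated grid", {{0.25f, 0.75f}, {0.75f, 0.25f}}},
    };

    for (const auto& p : patterns)
        std::printf("%-16s: %d distinct x offsets on a near-vertical edge\n",
                    p.name, distinctX(p));
    return 0;
}

Same story for near-horizontal edges with the y offsets, which is why the rotated/sparse grid gets away with half the samples on exactly the edges you actually notice.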
Before you post clickbait articles from Mr. Nobody websites, here's a link from Anandtech:
https://www.anandtech.com/show/970/14
Quoting the review: "It's important to note that in most cases (such as the one above), you won't be able to tell any difference between ATI's performance and quality anisotropic filtering settings."