I agree with Putas and GeorgeMan - Scali's posts in this thread are cherry-picked and very biased towards nVidia. I would go as far as saying he's shilling for the GeForce 900 series. He has also cherry-picked numbers and data to build various strawman and other fallacious arguments (e.g. "people would not..." "people will..." - which people, where, when, how, based on what?). All of the "think about it HMMM" stuff implying AMD "ripped off" Fermi, "ripped off" Mantle, etc is entirely unsourced and unsubstantiated as well. It's a lot of empty conjecture and postulation, some of which borders on conspiracy theory. On top of that, he has gotten incredibly defensive when questioned about it, often falling back on "I am Scali, therefore I am right" ("I am a developer READ WHAT I WRITE") - that's not valid evidence or sourcing for many of the arguments he's attempted to put forth in this thread. Even setting aside the large blocks of [Citation Needed] in his claims and assertions, simply insisting he's right because he's right is not good enough. Scali, I would ask you to please use the multi-quote and edit features in future replies, as your multi-posting is very hard to read and follow - but it largely doesn't matter, as you're going on my ignore list after this post; I'm tired of the consistently abusive and elitist attitude you've displayed on these forums over the last few months.
A few things to untangle:
- The 5900XT was released as a competitor for the 9600Pro and XT, and they were under $200; I bought mine for $174 (I don't know why I remember that number, I just always have). It is based on NV35, like the Ultra, but it is not the same card as the Ultra (it has 128MB RAM, is clocked lower, etc). If memory serves, much like the 6800XT and 7900GS that came after it, they were readily on sale for around $150 at many times. It's cherry-picking to quote on-sale prices for the 9600s (at $100): release SRP for the 9600Pro was $169-$199 (source: TechReport), and the 9600XT was around the same or somewhat higher (source: VR-Zone, listing it at 150 GBP, and TweakTown, listing it "in the same range as 9600Pro"). Performance-wise, it isn't fair to say the "5900XT was slaughtered" imho - focusing specifically on Half-Life 2 is not the entire story, especially since Half-Life 2 came out a year later (by which time GeForce 6 and Radeon X were available). Here are some reviews of the 5900XT/SE ("SE" is the same card - some vendors, like eVGA, just went with SE instead of XT; I think it's a regional thing), linked to the first page of benchmarks for convenience:
http://techreport.com/review/5990/nvidia-gefo … x-5900-xt-gpu/5
http://hothardware.com/reviews/aopen-geforce- … t-review?page=3
Interpret the data as you will. From owning a 5900XT "back in the day" (I had a 9600Pro and 9700np too - does that mean I get into the special decoder ring club?), it was perfectly fine for the games I was actually playing in 2002-2004. By the time Half-Life 2 rolled around in late '04 it was increasingly outclassed, and I replaced it with a 6800GT. The Radeon cards were also fine, but their drivers at the time were a weak point (especially on multi-monitor systems), which was my reasoning for going with the 6800 instead of an X800. From more recent experience with my 9550, 9800, and X850s (both the 9600 and 9700 died early, as many R3xx cards seem to do; the 5900XT survives to this day), the drivers have improved quite a bit since Catalyst 3.x, which is welcome.
- GCN supports DX12. That isn't just Fury, and it isn't just the 285 - all GCN parts support DX12. The few benchmarks I've seen from Futuremark indicate very good things even for the older GCN parts in terms of efficiency, and suggest that nVidia likely has some catching up to do on their drivers (in some benchmarks the R9 270 ends up faster than the GTX 980 - that's likely driver-related, as the GTX 980 should be faster). This will all probably be sorted out by next month after Windows 10 has launched, and if it isn't, it will likely be sorted by the time we actually see DX12 games.
Sources:
http://www.legitreviews.com/amd-says-gcn-prod … l-coming_137794
http://hexus.net/gaming/news/pc/67721-microso … tx-12-gdc-2014/
This may be why the claim is made that GCN "doesn't support DX12":
http://www.computerbase.de/2015-06/directx-12 … level-12-0-gcn/
That article has been turned into clickbait (e.g. on Guru3D, WCCFTech, etc (example: http://www.overclock.net/t/1558938/wccftech-a … -on-gcn-1-1-1-2)) with titles like "GCN does not fully support DX12", when in reality it's that GCN does not support DX12.1. Also worth noting: the evidence for this comes from an nVidia PR presentation, which takes me back to "Scali is shilling for the GeForce 900 series" - the "GCN isn't DX12, nVidia has market dominance" line is almost verbatim from nVidia's PR presentation.
For further clarity, nVidia have also listed Fermi and up as being DX12 compatible. If I'm not mistaken, this leaves TeraScale (Radeon 5000/6000) as the only DX11 cards that won't be supported in DX12.
For DX12 benchmarks, here's an example of what I'm referencing:
http://www.pcper.com/reviews/Graphics-Cards/3 … X12-Performance
It is important to note this is NOT real-world application testing, it is an API overhead performance test. I looked for the one with the lower-spec parts but could not find it (it's based on the same Futuremark benchmark).
- I've yet to find anything about DX12.1 that isn't from nVidia, so it's either an nVidia-specific extension to DX12 (e.g. like DX9a) or a minor addendum (e.g. like DX10.1). Either way, the GeForce 900 series appears to be the only thing that supports it, and if that's the case, it's unlikely to be very important in the long run, as obscure/narrowly supported features tend to be passed over (like DX9a or 10.1, or vendor features like TruForm or UltraShadow). Of course history may prove this assumption wrong, but that's my guess. The Overclock.net link above includes slides from an nVidia PR presentation that show a few features for DX12 and 12.1; perhaps others can find more about this.
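To make the "supports DX12" vs "supports feature level 12_1" distinction concrete, here's a minimal sketch (Windows-only, requires the Windows 10 SDK; the exact output values are illustrative, not tested here) of how an application actually asks the driver about this. Creating a DX12 device only requires feature level 11_0 - which is why Fermi and all of GCN qualify - and the highest supported level (12_0 vs 12_1) is queried separately:

```cpp
// Sketch: create a D3D12 device at the minimum feature level, then ask
// the driver for the highest feature level it actually supports.
#include <windows.h>
#include <d3d12.h>
#include <stdio.h>
#pragma comment(lib, "d3d12.lib")

int main() {
    // Any card with a DX12 driver (feature level 11_0 hardware or better)
    // should succeed here - this is what "supports DX12" means.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        printf("No DX12-capable device found\n");
        return 1;
    }

    // Now ask which of these levels the hardware/driver actually reaches;
    // this is where 12_0 (GCN) and 12_1 (GeForce 900) diverge.
    D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = ARRAYSIZE(requested);
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                &levels, sizeof(levels));

    // e.g. 0xC000 for 12_0, 0xC100 for 12_1
    printf("Max feature level: 0x%x\n", levels.MaxSupportedFeatureLevel);
    device->Release();
    return 0;
}
```

In other words, a "DX12 card" and a "feature level 12_1 card" are checked by two different calls, which is exactly the nuance the clickbait headlines flatten.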
- As far as DX9 on NV3x/R300 goes - Half-Life 2 is a bad example; it was optimized heavily for the R3xx architecture. Other early DX9 games generally run comparably between the FX 5800/5900 and Radeon 9700/9800 - Halo, Tomb Raider: AoD, The Sims 2, and Gun Metal, for instance (some of this is visible in the 5900XT links I provided above). That said, neither series of cards is what I'd consider competent for DX9-era games (late 2004 into 2005 and beyond) - I'd really rather see an SM3.0 part with higher performance, like a Radeon X1800/1900 or GeForce 7800/7900. The Radeon 9 and X aren't bad cards by any means, but they're much better suited to things from the early 2000s, like games based on Quake 3 and Unreal Engine 2.x. That said, from the perspective of building a retro machine, the GeForce FX (and GeForce 4) have some additional advantages, like being universal AGP cards (there are universal Radeon 9 cards, but not all Radeon 9 cards are universal), supporting palettized textures, and having a working fog table in Windows 9x. This doesn't mean the Radeon 9700/9800 wasn't the latest-and-greatest in 2003, but with the benefit of hindsight we don't have to settle for a 9800 for Half-Life 2 or Doom 3 or whatever - we can get something much faster (like the HD 2900XT that started this thread). This leaves the Radeon 9/X in kind of a weird position for a retro build imho. As far as the VIVO thing - afaik it was up to the vendor whether or not to implement it, and I can tell you my 9800 and X850XTP both feature it, but my 9550 does not (I don't remember my 9700 offering it, but I know my 9600Pro had the cables for it - never tested it though). Here's the link to my 9800's product page from Asus:
http://www.asus.com/Graphics-Cards/A9800PROTVD256M/ (and as far as "is it a Pro or an XT?" -> it will identify itself as R360 clocked at something like 400MHz, the board says 9800Pro but many software applications will say XT, and beyond that I don't know)