kithylin wrote:
Ah... sort of. I've been exploring this for a long time, and in my eyes "native DX9" cards go up to the 400 series. Even though they technically also support DX10 and DX11, the 400 series was the last GPU from nVidia to have a fast DX9 core; starting with the 500 series and onwards, the newer cards got progressively slower at DX9 in general. At least that's what my friends report back to me from their 700 and 900 series cards.
The 500 series uses the same "core" as the 400 series - both are nVidia Fermi parts. 😊
Quick'n'dirty web search, and here's an example of the GTX 580 and 480 in a DX9 game; the 580 is (as expected) the faster card:
http://www.guru3d.com/articles_pages/geforce_ … _review,10.html
And other results:
http://www.hardwareheaven.com/2010/11/gigabyt … -sli-review/10/
http://www.hardwareheaven.com/2010/11/gigabyt … -sli-review/12/
http://www.legitreviews.com/nvidia-geforce-gt … d-review_1461/9
http://www.techpowerup.com/reviews/NVIDIA/GeF … GTX_580/19.html
http://www.techpowerup.com/reviews/NVIDIA/GeF … GTX_580/21.html
While I don't have a high-end Fermi card, I can tell you that my GTX 660 (which is Kepler-based) has zero problems with any DX9 game I've thrown its way, and is consistently faster than my HD 4870X2. The gap is bigger in DX10+ tasks, because that's where a lot of its optimizations lie, but it still does better in DX9 too. But I've never really thought about whether an older (or newer) card could get me closer to a gazillion FPS in Quake 3 or Half-Life 2 or what-have-you; if I can turn up most/all of the settings and not have lag, I'm happy. 😎
Darkman wrote:
well, the 7950GX2 still had issues; eventually, after an hour, it crashed, gave me a BSOD, and proceeded with the problems from before - even worse, actually, as the glitches were worse, and Windows would boot even less often.
Eventually I thought, "well, if this thing is broken, then I might as well try something a little more drastic; the worst that happens is that the card breaks even more".
So I disassembled the card and reapplied new thermal paste to both GPUs. The old paste wasn't turned to powder, but it was caked on and rather dry (certainly not a paste, and it wasn't a good spread); I just used Arctic Silver 5.
So far it works (30 minutes beforehand it didn't), but I don't know - maybe it's a heat issue? I will have to test it further. MSI Afterburner gives me temperatures of 58 and 55 for GPUs 1 and 2 respectively while idle, and 72ish for both under load. I will give it a day to see if it crashes.
In other news, I got a 300GB Seagate Cheetah 10k SCSI hard drive, along with an Adaptec 2100S SCSI controller card (it had 32MB of ECC SDRAM as cache; I replaced it with 64MB ECC). This will be going into my Athlon 1400 to replace the multiple drives I was using before.
On both of my GX2s, they didn't have thermal paste but thermal "pads" (my understanding is it's some sort of foam/cloth that's saturated with paste) - replacing it with AS5 greatly reduced temperatures, and neither seems to run as consistently hot as yours (roughly 10°C lower; my ambient temperature is around 16-18°C though, so that may be a factor).
One thing I've found is that they need very good "side" ventilation - the exhaust direction from their fans doesn't go towards the back of the case (and if you have an OEM card with metal "spines" it will run hotter - clip that metal out and it should cool down). With that blocked I can get my cards running at 70-75°C too, but open that up (or even better, have an exhaust fan right over it) and they cool down pretty well.
They're also notoriously finicky as to what systems they'll work in - I don't think nVidia's HCL is all that accurate either; in my A8N-SLI Premium I still get random screen blackouts on start-up and when opening the drivers, and that board is on the HCL. (I also know this isn't my card, because aside from having two of them that behave exactly the same, they don't do it on my non-HCL'd RDX200CF-DR - yeah, figure that one: the ATi chipset board does better than the nForce.)
Driver-wise, I've tried the original 93.xx that did QuadSLI, the slightly older 9x.xx that just supports the single GX2, and a very new 29x.xx build; the original QuadSLI drivers seemed the most stable/consistent in terms of minimizing random blackouts and such on the A8N, for whatever that's worth.
Of course it's entirely possible your board has something wrong with it, but I just figured I'd share what I've observed with mine. They're very odd cards indeed.
Oh, and to be more on-topic:
I received a 6800 Ultra of my very own today, but have only done very minimal testing (it booted up and ran Superscape and ASTRA); seems to work. Now just waiting on the motherboard to put it in (should arrive next week). 😀
@ RetroFanatic: If it's an EVGA card, the EE part numbers are N346, while normal Ultras are N345. If it's another brand (I don't know if any other OEM did an EE), I don't know about the serial numbers. The "vanilla" Ultra is 400MHz core, while the EE is 450MHz, so you can check once you're in Windows with GPU-Z or some other application.
On the 9800GX2 - it will probably be faster in earlier games, but more shader-heavy games will probably favor the 280. The 280 also has the advantage of not being SLI, so in games where SLI doesn't help or causes problems, it won't suffer. On the other hand, the 9800GX2 should do SLI-AA, which the 280 cannot (at least by itself); based on the 7950GX2, it should do the 8x and 16x modes (the 32x mode requires 4 GPUs, afaik). Give this a look too: www.tomshardware.com/charts/2010-gaming ... d[4522]=on
Heat/power aside, it looks like a winner unless you're unlucky enough to hit upon a game that has problems with SLI. 😵