VOGONS

Reply 40 of 51, by Old Thrashbarg

Rank: Oldbie

I've never seen the ones that use two molex connectors... but then again, I've always made my own adapters for such things.

Without getting into a long explanation, I know why they make 'em like that... it's to split the load over more than one wire. But the reasoning behind it is flawed, since both molex connectors would be fed by the same wire coming from the PSU anyway. The short of it is: if you get one of those types of adapters, just hook up one molex connector and tape off the other one... it won't cause a problem to do so.

Reply 41 of 51, by swaaye

Rank: l33t++

I have only seen the dual molex -> PCIe 6-pin variety.

I believe they use dual connections because of connector resistance. The HDD Molex plug is rated for 11A per pin at 12V, though, and a PCIe 6-pin connector tops out at 75W. But perhaps they are worried about corroded connectors.
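Back-of-the-envelope, the per-pin rating alone leaves plenty of headroom over the 6-pin spec, which is why contact resistance or corrosion is the more plausible worry. A quick sketch of the arithmetic, using only the figures quoted above (treat them as approximate, not authoritative):

```python
# Sanity check of the figures quoted above.
MOLEX_PIN_RATING_A = 11.0    # quoted rating per Molex pin
RAIL_VOLTAGE_V = 12.0        # the 12V rail that feeds the PCIe connector
PCIE_6PIN_LIMIT_W = 75.0     # PCIe spec ceiling for one 6-pin connector

max_power_per_pin_w = MOLEX_PIN_RATING_A * RAIL_VOLTAGE_V   # 132 W
print(f"One 12V Molex pin at its rating: {max_power_per_pin_w:.0f} W")
print(f"PCIe 6-pin ceiling:              {PCIE_6PIN_LIMIT_W:.0f} W")
print(f"Headroom factor:                 {max_power_per_pin_w / PCIE_6PIN_LIMIT_W:.2f}x")
```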

Reply 42 of 51, by Old Thrashbarg

Rank: Oldbie

I looked around a bit, and it does indeed appear the double-molex version is the most common type. I've just never paid much attention... like I said, I've always made my own.

But... as per the PCIe spec, a card that only requires one 6-pin power connector can draw an absolute maximum of 150W. Theoretically about half of the power should be coming from the PCIe slot and the other half from the auxiliary power connector, but not all cards divide the load evenly. Even so, I wouldn't figure on more than 100W ever coming through the auxiliary power connector, which works out to about 8.3A... a single molex will handle that without any problem.
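The same point in code, a sketch only, using the generous 100W worst-case figure from above rather than any measured draw:

```python
# Worst-case current through the auxiliary connector, per the estimate above.
WORST_CASE_AUX_W = 100.0     # generous upper bound from the post; the spec per 6-pin is 75 W
RAIL_VOLTAGE_V = 12.0
MOLEX_PIN_RATING_A = 11.0    # per-pin rating quoted earlier in the thread

aux_current_a = WORST_CASE_AUX_W / RAIL_VOLTAGE_V            # ~8.33 A
print(f"Current at {WORST_CASE_AUX_W:.0f} W on 12 V: {aux_current_a:.2f} A")
print(f"Fits a single pin's {MOLEX_PIN_RATING_A:.0f} A rating: {aux_current_a < MOLEX_PIN_RATING_A}")
```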

Reply 43 of 51, by Rekrul

Rank: Member

I've been reading the customer reviews of the various cards on Amazon and NewEgg and I still don't know what to get.

I have two questions:

1. I know this might be a silly question, but I'm used to dealing with video cards that don't even need a fan on them: is there even the slightest chance that any of these cards could actually burn out and catch fire? All the talk of overheating makes me a little nervous, since I normally leave my system on 24/7, even when I'm not home. I just don't want to take any chances.

2. I've heard that ATI cards/drivers don't have the best backwards compatibility. I also recall hearing that they didn't handle some graphic effects as well as NVidia cards. Is this still true, or have they gotten to a point where it really doesn't matter which company you choose? An old example that comes to mind is in the game Thief II, where it was said that ATI cards didn't handle fog properly. On the other hand, NVidia's later drivers didn't always handle older games all that well either. I've had issues with Lithtech games, games based on the original Unreal engine, graphical glitches in the Thief games, etc. I don't want to get an ATI card and then find out that it won't handle things that an equivalent NVidia card would have.

Reply 44 of 51, by swaaye

Rank: l33t++

Temperature:
Modern 3D cards get hot when working hard on a complex 3D game, but consume little power otherwise because of various power management techniques. They also have overheat protection features so a total meltdown isn't going to happen.

You can still find fanless cards. Newegg has a way to specifically find them in their advanced search. These cards typically have large heatpipe-equipped heatsinks and of course need at least some case airflow.

Fans on modern cards tend to be reliable in my experience. I have not seen a motor die in a long time. I have had cards with fan motors that get noisy, however.
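If you do leave the machine running 24/7 and want some reassurance, you can keep an eye on the temperature yourself. A minimal sketch, assuming an NVIDIA card with the nvidia-smi utility (installed alongside the driver) on the PATH; the threshold is just an illustrative number, not a vendor figure:

```python
# Poll the GPU core temperature once a minute and warn above a threshold.
# Assumes nvidia-smi is available; stop with Ctrl+C.
import subprocess
import time

WARN_AT_C = 90  # arbitrary example threshold

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    temp_c = int(out.stdout.strip().splitlines()[0])
    if temp_c >= WARN_AT_C:
        print(f"GPU is running hot: {temp_c} C")
    time.sleep(60)
```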

Compatibility:
When both NV and AMD went into the DX10+ era, they dumped a lot of backward compatibility features. Both companies' cards will have various problems with games using DirectX 6 and older.

NVIDIA cards tend to still have the best game developer support, but AMD isn't at an obvious disadvantage anymore. I run both NV and AMD cards. There have, however, been occasions where AMD has completely dropped the ball. For example, RAGE was almost unplayable on AMD cards for about two months because of OpenGL driver problems that weren't resolved before the game's release.

Also, if you want to use any apps that use CUDA, obviously you want NV. There's also NVIDIA PhysX to consider, but frankly none of that stuff has ever impressed me, so I don't worry about it.

Reply 45 of 51, by Rekrul

Rank: Member

swaaye wrote:

Modern 3D cards get hot when working hard on a complex 3D game, but consume little power otherwise because of various power management techniques. They also have overheat protection features so a total meltdown isn't going to happen.

Good to know. Thanks.

swaaye wrote:

When both NV and AMD went into the DX10+ era, they dumped a lot of backward compatibility features. Both companies' cards will have various problems with games using DirectX 6 and older.

So I'm probably going to have problems if I want to re-play older games like the original Half-Life, SiN, Thief, the original Aliens vs. Predator, Descent 3, Requiem: Avenging Angel, etc? (actually, I never got around to playing Descent 3)

Reply 46 of 51, by swaaye

Rank: l33t++

Problems are likely. One of the most annoying things though is that every game that only supports 16-bit color depth will suffer from extreme color banding because the modern cards don't dither.
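For a sense of why the banding is so severe: 16-bit modes are typically RGB565, so each channel keeps only 5 or 6 bits, and without dithering a smooth gradient collapses into a few dozen visible steps. A toy illustration of the precision loss (not anything the driver actually does):

```python
# Quantize an 8-bit channel down to 5 bits, as a 16-bit RGB565 mode does for
# red and blue, and count how many distinct levels survive.
def quantize(value_8bit: int, bits: int) -> int:
    """Keep only the top `bits` bits of an 8-bit channel value."""
    return (value_8bit >> (8 - bits)) << (8 - bits)

ramp = list(range(256))                      # a smooth 8-bit gradient
ramp_5bit = [quantize(v, 5) for v in ramp]   # red/blue precision in RGB565
print("levels in the 8-bit ramp:", len(set(ramp)))        # 256
print("levels after 5-bit cut:  ", len(set(ramp_5bit)))   # 32 -> visible bands
```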

However, the communities for some old games have created hacks to force higher color depth and other things. I know there is such a fix for Dark engine games like Thief and System Shock 2.

Reply 47 of 51, by ratfink

Rank: Oldbie

swaaye wrote:

One of the most annoying things though is that every game that only supports 16-bit color depth will suffer from extreme color banding because the modern cards don't dither.

Slightly off topic, but what would be the last generations to support older games properly [i.e. they look right, with all the relevant DirectX or OpenGL features supported]? On the NVIDIA side I thought it used to be the FX series, but then I recall the GeForce 6 and 7 drivers got fixed for at least some issues. Did it ever go beyond that?

Reply 48 of 51, by swaaye

Rank: l33t++

Radeon cards are usually not a good choice when it comes to backwards compatibility back to pre-2k games. They are also not a good choice for many OpenGL games because these games were sometimes written like only NV mattered (ie Bioware game engines).

GF6 is a good choice and it supports W9x. I don't know about GF7, although it is very similar to GF6 architecturally.

Reply 49 of 51, by nforce4max

Rank: l33t

swaaye wrote:

Radeon cards are usually not a good choice when it comes to backwards compatibility back to pre-2k games. They are also not a good choice for many OpenGL games because these games were sometimes written like only NV mattered (ie Bioware game engines).

GF6 is a good choice and it supports W9x. I don't know about GF7, although it is very similar to GF6 architecturally.

GeForce 6 and 7 are almost entirely the same, aside from some minor upgrades and general improvements in how the units scale. I liked the GeForce 7 for being easy to work with and for its picture-in-picture support for editing work, but for older games I'm doubtful about the drivers.

GeForce 6 is pretty much a sure thing 😀 I wish I had bought a 512MB 6800 Ultra for my collection. 😢

On a far away planet reading your posts in the year 10,191.

Reply 50 of 51, by Old Thrashbarg

Rank: Oldbie

Interesting, I never even knew they made a 512MB version of the 6800 Ultra. I bet that had a pretty hefty price tag on it when it was new... 😮

Reply 51 of 51, by sliderider

Rank: l33t++

nforce4max wrote:
swaaye wrote:

Radeon cards are usually not a good choice when it comes to backwards compatibility back to pre-2k games. They are also not a good choice for many OpenGL games because these games were sometimes written like only NV mattered (ie Bioware game engines).

GF6 is a good choice and it supports W9x. I don't know about GF7, although it is very similar to GF6 architecturally.

GeForce 6 and 7 are almost entirely the same, aside from some minor upgrades and general improvements in how the units scale. I liked the GeForce 7 for being easy to work with and for its picture-in-picture support for editing work, but for older games I'm doubtful about the drivers.

GeForce 6 is pretty much a sure thing 😀 I wish I had bought a 512MB 6800 Ultra for my collection. 😢

GeForce 7 doesn't have a Win9x driver, though. Even though the GeForce 7 may only be an incremental improvement on GeForce 6, the lack of the Win9x driver makes a huge difference for some people.

And yeah, the 6800 Ultra with 512MB comes as a surprise to me, too. I know some board makers did 512MB versions of the GS and GT after the GeForce 6 was no longer current, but I never saw an Ultra with that much memory. I have one of the limited edition 6800 Ultra Extremes from eVGA here, and even that only has 256MB on it.