ratfink wrote: Is this just the 2D signal quality being comparatively poor on Nvidias, or is it because I'm using a VGA rather than DVI connection, or some such?
If you are using DVI-D (the actual digital interface, not a DVI-to-VGA converter), then you won't have this problem.
It's a digital interface, so the quality of the cables and output circuit do not correspond directly to image quality. If the signal is bad, then the monitor will just have trouble synchronizing to it, and you may experience dropouts in the image.
If you have to use VGA for some reason, then you could indeed try a DVI-to-VGA converter on the DVI port instead of the card's dedicated VGA output. I've had cards with multiple outputs where one output had noticeably more problems than the other.
The problem of blurring/ghosting tends to get worse as the signal frequency goes up. So you may want to try lower resolutions and/or lower refresh rates to see if that improves the situation.
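To put some rough numbers on that (a back-of-the-envelope sketch of my own, not anything from your card's specs): the analog bandwidth the RAMDAC and filter have to pass scales with the pixel clock, which is roughly active pixels times blanking overhead times refresh rate. The blanking factors below are assumptions in the ballpark of typical VESA timings.

```python
def approx_pixel_clock_mhz(width, height, refresh_hz,
                           h_blank_factor=1.35, v_blank_factor=1.05):
    """Very rough pixel clock estimate in MHz; the blanking factors are
    assumed values, close to typical VESA timings but not exact."""
    h_total = width * h_blank_factor   # active width plus horizontal blanking
    v_total = height * v_blank_factor  # active height plus vertical blanking
    return h_total * v_total * refresh_hz / 1e6

# Dropping the resolution and/or refresh rate cuts the frequency a lot:
print(approx_pixel_clock_mhz(1600, 1200, 85))  # ~230 MHz - hard on a poor filter
print(approx_pixel_clock_mhz(1024, 768, 60))   # ~67 MHz - much easier to keep sharp
```

So if the image cleans up noticeably at a lower mode, a weak analog output stage is the likely culprit.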
On older cards it was possible to remove part of the low-pass filter on the VGA output (basically just desoldering a handful of resistors and capacitors). That could greatly improve sharpness on bad cards, since the problem was usually a poorly designed filter and/or poor-quality parts used to build it.
But I think your card is too new for that. At some point, GPU vendors started integrating the output circuitry, including the filter, into the GPU itself, so that third-party board manufacturers could no longer mess up the image quality and give them a bad name.