First post, by VileR
(yes, I know, tl;dr, wall of text... man up or move on) 😀
Since our Intel/AMD/Samsung/Dell/etc. overlords are going to phase out VGA over the next few years, and make way for HDMI/DisplayPort, the thread's title probably isn't the best choice of words. If there was a "war" at all, it's already over, and that's the way things are gonna go down whether we like it or not.
But the question almost asks itself: is there really some overwhelming advantage to digital video over analog? Are we in for a brave new world of heavenly video and celestial media experience and all that, or is it all just marketing noise, with the real intent to ensure that everyone has a nice, obedient device with DRM and content protection built into the interface?
When Joe Blow hears today's magic word "HD", he thinks "well then I'm gonna need HDMI, right?", but VGA is perfectly capable of handling high-definition content, and the added fidelity of a digital signal is barely noticeable in most cases, if at all. People fail to realize that digital isn't automatically "better" or more capable than analog - it's a signal, and it degrades. Digital just degrades along a different curve: error tolerance gives it more headroom, but as noise increases there's always a point where the signal simply drops off and dies a sudden death, whereas analog degrades gradually.
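To make that "different curve" concrete, here's a toy simulation - not a model of actual VGA or TMDS signaling, just an illustration. It sends a value over a noisy channel two ways: raw analog level vs. an 8-bit word transmitted bit by bit and decoded with a threshold (the encoding and threshold are illustrative assumptions). At low noise the digital path decodes perfectly while analog already shows small errors; crank the noise up and digital errors arrive abruptly and are large, because bit flips jump the value around.

```python
import random

random.seed(42)

def analog_error(noise_std, trials=2000):
    """Mean absolute error of sending a fixed level over an analog channel."""
    return sum(abs(random.gauss(0, noise_std)) for _ in range(trials)) / trials

def digital_error(noise_std, trials=2000):
    """Mean absolute error of sending 8-bit value 128, one thresholded bit at a time."""
    total = 0.0
    for _ in range(trials):
        received = 0
        for bit in range(8):
            sent = (128 >> bit) & 1               # each bit goes out as 0.0 or 1.0
            level = sent + random.gauss(0, noise_std)
            received |= (1 if level > 0.5 else 0) << bit
        total += abs(received - 128) / 255        # normalize to a 0..1 scale
    return total / trials

# Low noise: digital decodes perfectly, analog already drifts a little.
# High noise: digital falls off a cliff (bit flips), analog keeps degrading smoothly.
for sigma in (0.05, 0.2, 0.5):
    print(f"sigma={sigma}: analog={analog_error(sigma):.3f} digital={digital_error(sigma):.3f}")
```

The cliff is the point where noise starts crossing the decision threshold; below it, digital looks flawless, which is exactly why the difference at your desk is usually invisible.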
There also seems to be this widespread notion that VGA is just too limited for demanding video applications, because of bandwidth issues, or some impractical limit of resolution / framerate / whatever. Not to put too fine a point on it, but that's bullshit. You aren't going to run into any such bandwidth wall, unless you're trying to lay a VGA cable across the Atlantic or something. There's a limit out there, but nobody's going to hit it through practical use any time soon.
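Some back-of-the-envelope arithmetic supports this. The pixel clock a mode needs is roughly width x height x refresh, plus blanking overhead; the 1.25x factor and the 400 MHz RAMDAC figure below are ballpark assumptions (real timings vary, and late-era cards commonly advertised RAMDACs in the 350-400 MHz range):

```python
# Rough pixel-clock estimates for analog VGA modes. The ~1.25x factor
# approximates horizontal/vertical blanking overhead; exact timings vary.
BLANKING_OVERHEAD = 1.25
RAMDAC_MHZ = 400  # ballpark rating for a late-era VGA RAMDAC

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock (MHz) needed for a given mode."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1920, 1080, 60), (1600, 1200, 85), (2048, 1536, 75)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "within" if clk <= RAMDAC_MHZ else "beyond"
    print(f"{w}x{h}@{hz}Hz needs ~{clk:.0f} MHz ({verdict} a {RAMDAC_MHZ} MHz RAMDAC)")
```

Even 2048x1536 at 75 Hz lands under 300 MHz by this estimate - nowhere near any wall an ordinary user would hit.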
- I've never taken a really hard look at DVI, even though I use it myself for my dual-monitor setup; it's only because video cards have commonly come with one VGA port and one DVI port (and that's about the extent of the use most people seem to give it). And guess what: there's absolutely no discernible difference in picture quality between the two monitors, whether DVI is carrying pure digital or analog DVI-A (through a DVI-to-VGA adapter).
The only difference seems to be this: over analog VGA, my GeForce GT240 can send out any refresh rate I'd practically want, as high as my LCD panel can handle. But over digital DVI (same card, same drivers, same monitor at the same resolution), the "superior" interface offers me a single choice of... wait for it... a blistering, face-melting 60 Hz! Ain't I just basking in the advantage of digital?
- As for HDMI, I've never used it myself. But when my friend connected his brand-new HDMI-capable video card to his brand-new HDTV (through his brand-new HDMI audio system), I could swear the picture was worse than on his VGA-connected PC monitor at the same resolution. And yes, we tried to optimize every setting; there just seems to be some form of crappy filtering going on that cannot be disabled. I'm pretty sure I've read very similar reports from other people on these forums, so I know I wasn't just imagining things.
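For what it's worth, there may be a real ceiling behind that 60 Hz offering over DVI: single-link DVI caps the pixel clock at 165 MHz per the DVI 1.0 spec. Using the same rough ~1.25x blanking assumption as above (an approximation; reduced-blanking timings squeeze out more), the achievable refresh at high resolutions works out to about 60 Hz - though an LCD's EDID only advertising 60 Hz is at least as likely an explanation:

```python
# Single-link DVI is specced at a 165 MHz pixel clock (DVI 1.0).
# The ~1.25x blanking factor is an approximation; reduced blanking does better.
DVI_SINGLE_LINK_MHZ = 165
BLANKING_OVERHEAD = 1.25

def max_refresh_hz(width, height):
    """Rough max refresh rate over single-link DVI at a given resolution."""
    return DVI_SINGLE_LINK_MHZ * 1e6 / (width * height * BLANKING_OVERHEAD)

for w, h in [(1280, 1024), (1680, 1050), (1920, 1200)]:
    print(f"{w}x{h}: ~{max_refresh_hz(w, h):.0f} Hz max over single-link DVI")
```

At 1280x1024 there's headroom for 85 Hz and beyond, but around 1920x1200 the estimate dips below 60 Hz, which is why such modes lean on reduced blanking - and why 60 Hz ends up being the only choice on offer.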
Sure, having both video and audio over a single interface is nice, for the average user anyway. But the advantages don't seem to amount to much, and having built-in HDCP is already a huge disadvantage that dwarfs them all (I won't bother getting into why I feel that way; you either get it or you don't... and yes, it's just going too far).
- Then there's DisplayPort, and I have a grand total of zero experience with it. Any thoughts from those of you who use it? The features look great on paper, but looky here... this one blesses us with both DPCP and HDCP! Do we deserve our good fortune?!
Yeah, this has been a bit of a rant, and probably not the most informed one in the world. Just feel free to educate me and share your thoughts...