VOGONS

First post, by VileR

(yes, I know, tl;dr, wall of text... man up or move on) 😀

Since our Intel/AMD/Samsung/Dell/etc. overlords are going to phase out VGA over the next few years, and make way for HDMI/DisplayPort, the thread's title probably isn't the best choice of words. If there was a "war" at all, it's already over, and that's the way things are gonna go down whether we like it or not.

But the question almost asks itself: is there really some overwhelming advantage to digital video over analog? Are we in for a brave new world of heavenly video and celestial media experience and all that, or is it all just marketing noise, with the real intent to ensure that everyone has a nice, obedient device with DRM and content protection built into the interface?

When Joe Blow hears today's magic word "HD", he thinks "well then I'm gonna need HDMI, right?", but VGA is perfectly capable of handling high definition content, and the added fidelity of a digital signal is barely noticeable in most cases, if at all. People fail to realize that digital isn't automatically "better" or more capable than analog - it's a signal, and it degrades. Digital just degrades along a different curve (it's got error protection and more tolerance, but as you increase noise, there's always a point where your signal will just drop off and die a sudden death, whereas analog degrades gradually).
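To make that degradation-curve idea concrete, here's a toy simulation (just a sketch in Python; the noise levels and sample count are arbitrary, and it doesn't model any specific video standard). The average error of an analog level grows steadily with noise, while the bit error rate of a binary digital symbol stays near zero and then falls off a cliff:

    import random
    random.seed(0)

    SAMPLES = 10_000

    def analog_error(noise_sigma):
        # average absolute error of a 0..1 analog level after Gaussian cable noise
        return sum(abs(random.gauss(0, noise_sigma)) for _ in range(SAMPLES)) / SAMPLES

    def digital_bit_error_rate(noise_sigma):
        # a '1' bit sent as +1.0 and decided by sign at the receiver, after the same noise
        flipped = sum(1 for _ in range(SAMPLES) if random.gauss(1.0, noise_sigma) < 0)
        return flipped / SAMPLES

    for sigma in (0.1, 0.3, 0.5, 1.0, 2.0):
        print(f"noise={sigma:.1f}  analog error={analog_error(sigma):.3f}  "
              f"digital BER={digital_bit_error_rate(sigma):.4f}")

At low noise the digital column sits at 0.0000 while the analog error is already creeping up; past a certain point, though, the digital link collapses all at once.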

There also seems to be this widespread notion that VGA is just too limited for demanding video applications, because of bandwidth issues, or some impractical limit of resolution / framerate / whatever. Not to put too fine a point on it, but that's bullshit. You aren't going to run into any such bandwidth wall, unless you're trying to lay a VGA cable across the Atlantic or something. There's a limit out there, but nobody's going to hit it through practical use any time soon.

  • I've never taken a really hard look at DVI, even though I use it myself for my dual-monitor setup, if only because it has been common for video cards to come with one VGA port and one DVI port (and that's the extent of the use most people seem to give it). And guess what: there's absolutely no discernible difference in picture quality between the two monitors, whether the DVI port is carrying pure digital or analog DVI-A (through a DVI-to-VGA adapter).

    The only difference seems to be this: over analog VGA, my GeForce GT 240 can send out any refresh rate I'd practically want, as high as my LCD panel can handle. But over digital DVI (with the same card, same drivers, and the same monitor at the same resolution), the "superior" interface only offers me a single choice of... wait for it... a blistering, face-melting 60 Hz! Ain't I just basking in the advantage of digital?
  • As for HDMI, I've never used it myself. But when my friend connected his brand-new HDMI-capable video card to his brand-new HDTV (through his brand-new HDMI-audio system), I could swear the picture was worse than on his VGA-connected PC monitor at the same resolution. And yes, we've tried to optimize every setting; there just seems to be some form of crappy filtering happening that cannot be disabled. I'm pretty sure I've read very similar reports from other people on these forums, so I know I wasn't just imagining things.

    Sure, having both video and audio over a single interface is nice, for the average user anyway. But the advantages don't seem to amount to much, and having built-in HDCP is already a huge disadvantage that dwarfs them all (I won't bother getting into why I feel that way, but you either get it or you don't... and yes - it's just going too far).
  • Then there's DisplayPort, and I have a grand total of zero experience with it. Any thoughts from those of you who use it? The features look great on paper, but looky here... this one blesses us with both DPCP and HDCP! Do we deserve our good fortune?!

Yeah, this has been a bit of a rant, and probably not the most informed one in the world. Just feel free to educate me and share your thoughts...

Reply 1 of 29, by MaxWar

Ok, here's my two cents ...

1. As far as LCD displays are concerned, I like DVI. I do notice the sharper image / more accurate colors / less noise vs. VGA.

2. To my knowledge, DVI also contains the analog (VGA) component; this is what allows us to use the DVI/VGA adapter. As long as this continues to be the case, I'm happy with my video card only having DVI outputs on it. As for the 60Hz-only thing you talk about, I don't know. It sounds bad the way you put it, but I wonder if it isn't just the digital connection somehow detecting that it's connected to a digital display and matching its refresh rate to the screen's - almost all LCDs won't do more than 60Hz. The test to do would be: connect my GeForce 460 through a VGA adapter to a CRT display and try to set it to whatever RR you want.

3. HDMI... now, anybody with an ounce of common sense will acknowledge the ONLY reason HDMI exists is HDCP. Otherwise HDMI is simply DVI + SPDIF - DVI-A. The utility of having all the signals bundled together is questionable; I personally prefer them separated, so I can send video to my screen and sound to my sound system. The only reason for HDMI is content control. It is another attempt by the media corporations at controlling what you are doing. I personally hate HDMI and will avoid it at all costs. It also has possible input lag, likely due to the HDCP decoder. It's a load of crap.

Reply 2 of 29, by VileR

yeah, "analog DVI" is what I referred to as "DVI-A". They seem to want to phase out DVI as well, and there are already different pinout configurations for DVI, some of which lack the analog pins - at least judging by wikipedia...

About the refresh rate issue: luckily my LCDs support 70Hz and handle it very nicely, but only over analog VGA (tested with Supaplex and other smooth-scrolling VGA games in DOSBox).
Over DVI, the display is indeed detected as digital, but the detected monitor model is the same, and the weird 60Hz limitation persists even if I uncheck "Hide modes that this monitor cannot display" in XP's display settings. So it doesn't look like 60Hz is due to any LCD limitation.

HDMI - I'm tending to agree, and I forgot all about that input lag issue too. I wouldn't personally want all the signals on one cable either (that's why I wrote it could be nice "for the average user") 😀
Glad to see some agreement about the whole content-protection debacle with HDMI. Personally I plan to avoid it like the plague as long as it's still possible.

Reply 3 of 29, by MaxWar

I'm pretty confident most here will agree about the HDMI issue.

To continue on the DVI refresh rate subject, there is also the fact that DVI, unlike VGA, has precise limits on the amount of data it can transmit. I think the maximum a single-link DVI connection can carry is 1920x1200x32bit @ 60Hz.
Anything more and you need a dual-link DVI setup (video card + cable + display); then you have twice as much bandwidth. Maybe single-link DVI is the reason for your RR cap? Worth investigating - see the rough numbers below. On my part, however, I know my monitor is capped at 60Hz. I don't mind, I can't see it blink like a CRT 😜
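For what it's worth, here are rough numbers (a quick Python sketch; the hard limit is the 165 MHz single-link TMDS pixel clock, and the horizontal/vertical totals below are approximate CVT / reduced-blanking figures, not values read from any particular monitor's EDID):

    # Single-link DVI tops out at a 165 MHz pixel clock; the clock has to cover
    # blanking intervals too, so total timings matter, not just the visible pixels.
    SINGLE_LINK_MAX_MHZ = 165.0

    def pixel_clock_mhz(h_total, v_total, refresh_hz):
        return h_total * v_total * refresh_hz / 1e6

    modes = {
        "1920x1200 @ 60Hz (reduced blanking)": (2080, 1235, 60),
        "1920x1200 @ 60Hz (standard blanking)": (2592, 1245, 60),
        "1280x1024 @ 70Hz": (1688, 1066, 70),
    }

    for name, (h, v, hz) in modes.items():
        clk = pixel_clock_mhz(h, v, hz)
        verdict = "fits single link" if clk <= SINGLE_LINK_MAX_MHZ else "needs dual link"
        print(f"{name}: {clk:.1f} MHz -> {verdict}")

So 1920x1200@60 only squeezes into single link with reduced blanking, while a 1280x1024 @ 70Hz mode is nowhere near the limit - which suggests the 60Hz cap comes from the monitor/EDID or the driver rather than raw bandwidth.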

Reply 4 of 29, by RogueTrip2012

I just got my first LCD monitor last week. I quickly jumped ship to DVI on it to get less noise and distortion; I especially notice the difference when watching videos (or cartoons) with limited color palettes.

The nice thing about digital is that the content inside the computer is already digital, so why spend extra time with DACs and ADCs getting the data from point to point? They've been trying to drop analog from audio processing too, obviously. This leads to fewer reproduction issues.
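To illustrate the kind of reproduction issue a conversion round trip can add, here's a toy model (just a sketch, not any real video pipeline; the 5 mV cable-noise figure is an arbitrary assumption, only the 0-0.7 V VGA signal swing is the real spec):

    import random
    random.seed(1)

    FULL_SCALE_MV = 700.0   # nominal VGA video swing: 0 to 0.7 V per color channel

    def vga_round_trip(code_8bit, noise_mv=5.0):
        # DAC on the video card: 8-bit code -> analog level in millivolts
        millivolts = code_8bit / 255 * FULL_SCALE_MV
        # noise/reflections picked up along the cable (assumed Gaussian)
        millivolts += random.gauss(0, noise_mv)
        # ADC in the LCD: sample the level back to an 8-bit code
        clamped = max(0.0, min(FULL_SCALE_MV, millivolts))
        return round(clamped / FULL_SCALE_MV * 255)

    errors = [abs(vga_round_trip(v) - v) for v in range(256)]
    print(f"worst-case error: {max(errors)} LSB, average: {sum(errors) / len(errors):.2f} LSB")

A pure digital link skips both conversions, so (short of the noise cliff mentioned in the first post) the codes arrive exactly as they were sent.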

I love my 4:3 monitors and CRTs as much as the next person, but the tech has been dropped for Hollywood's inferior demands.

The LCD refresh rate isn't a big issue like it is on CRTs, since there are no guns constantly redrawing the screen. I actually hate the 120Hz/240Hz modes on LCD TVs, as they cause judder and jerky panning on the screen. I always turn it off for others just to show them how bad it really is.

EDIT: I guess the point is that VGA is an analog signal. They aren't mass-producing analog sets (CRTs) anymore. Everything is digital displays now, so it only makes sense to support digital output, which, as I was saying, has fewer reproduction issues and also reduced input lag.

HDMI is a discussion of its own.

> W98SE . P3 1.4S . 512MB . Q.FX3K . SB Live! . 64GB SSD
>WXP/W8.1 . AMD 960T . 8GB . GTX285 . SB X-Fi . 128GB SSD
> Win XI . i7 12700k . 32GB . GTX1070TI . 512GB NVME

Reply 5 of 29, by VileR

I don't think I'm hitting the DVI bandwidth limit... I'm set to 1280x1024x32bit - native resolution (these are somewhat crappy monitors too, TN rather than IPS). When both monitors are connected in analog (VGA or DVI-A - aka VGA-over-DVI) I can set both to 70Hz without a hitch.

It's true that an LCD @ 60Hz isn't a problem like a CRT @ 60Hz, since there's no flicker. However, I do like to play my old DOS games and still get smooth scrolling, so I need more than that. I'm sure that supporting ancient, rotting games is the last thing that hardware makers care about, but being limited to a single refresh rate isn't an advancement, it's a regression.

It's a damn shame that SED and FED display technologies don't seem to be going forward - apparently they combine the respective advantages of CRTs and LCDs... that would be a different topic, though.

Reply 6 of 29, by MaxWar

VileRancour wrote:

I'm sure that supporting ancient, rotting games is the last thing that hardware makers care about, but being limited to a single refresh rate isn't an advancement, it's a regression.

But it could also be a step toward CGA compatibility! 🤣

Reply 8 of 29, by MaxWar

Chuck Norris can run his Hercules @ 60hz.

Ok, sry about that, just could not resist 🤣

Btw, are you using specific drivers for your 70Hz LCD? I just had a thought: since most LCDs can't run at more than 60Hz, it might actually not be a good thing to try to force a higher refresh rate down their throats, hence your generic display driver / Windows will prevent you from doing so.

I am also wondering about standard VGA in DOS; it usually sets itself up at 70Hz. Could be interesting to verify this. Maybe use the vertical retrace test from the CGA tester? 😁

Reply 9 of 29, by WolverineDK

Also, control over the HDCP protocol has already been lost, since HDCP has been cracked by crafty crackers. So fuck the powers that be: they have gained nothing by trying to control it and lost everything in their surge for power.

Reply 10 of 29, by BigBodZod

But you hit the nail on the head with HDMI (High-Definition Multimedia Interface): it was designed specifically for the movie studios, so that they could supposedly control which devices could be connected to the interface, and hence which devices could play back said media, such as movies on Blu-ray discs.

The interface can carry a huge amount of bandwidth and sends both video and audio data at the same time over a single cable.

As for me, I do prefer viewing my PC through a DVI-to-HDMI cable on my Samsung HDTV.

Since I have one of the nicer 120Hz panels, I have never seen Dragon Age look so great 😀

No matter where you go, there you are...

Reply 11 of 29, by gulikoza

According to one thread, DVI itself is capable of higher refresh rates, but it has to be supported by the monitor (that is, it has to be supported on the monitor's DVI interface 😀).

Otherwise I'm all for digital. Yes, the standards could be better (somehow we PAL folks just got upped to 60Hz for no apparent reason and lost teletext support), more compatible, and with less DRM crap. DisplayPort is (electrically) incompatible with DVI and HDMI and requires active, expensive adapters... but the picture really is that much better compared to VGA. I could very clearly see VGA signal degradation on my old 19" CRT; I spent quite some time (& money) buying expensive, good cabling, which proved necessary for getting a good, clear picture - and that was only at 1280x960@85. I actually connected my retro rig to the VGA input on my current 24" Samsung LCD and the picture was all crap. Sometimes I get (job-related) calls from people having a bad picture on their crappy office LCDs. If they happen to have a DVI input on both the monitor and the card, a simple cable swap will almost always solve all the problems.

http://www.si-gamer.net/gulikoza

Reply 12 of 29, by jwt27

I prefer VGA over everything else. It is analog RGBHV, but in one cable, and it supports just about any resolution and refresh rate you will ever need.
DVI is technically better, and backwards compatible with VGA, but mostly only LCD panels have a DVI connector, so I never use it.

I really don't understand all the fuss about HDMI. Why would you want audio signals going to your TV?? Why would you want someone else to decide what content you can and cannot stream to which equipment? Why would you want to use a tiny, flimsy mobile-phone-style connector without any locking mechanism? (Same with SATA, I HATE those things.)

Thanks to all this HD crap, some odd number with a "p" or "i" behind it is a measurement of video quality now (see YouTube). I don't get it: I have a 1200p screen, yet I can't watch 'HD' movies at 1080p??

Reply 14 of 29, by jwt27

Yes I'm aware of that, but what I don't understand is how this '1080p' can be a measurement of image quality. All it tells you is the vertical resolution and scanning method, which say nothing at all about video quality.

Reply 15 of 29, by MaxWar

It's a bit like megapixels for digital cameras; it means nothing without good optics.

Anyway, I don't understand why you said: "I have a 1200p screen, yet I can't watch 'HD' movies at 1080p??"
I have a 1200p display; you just end up with narrow black bars at the top/bottom, unless you do some kind of stretching/upscaling, which I would not do.

Reply 16 of 29, by jwt27

MaxWar wrote:

It's a bit like megapixels for digital cameras; it means nothing without good optics.

Anyway, I don't understand why you said: "I have a 1200p screen, yet I can't watch 'HD' movies at 1080p??"
I have a 1200p display; you just end up with narrow black bars at the top/bottom, unless you do some kind of stretching/upscaling, which I would not do.

What I was trying to say is, the 1080p number tells you nothing about the horizontal resolution. As if that's not important.

My screen is 1600x1200, and it does progressive scanning, so in 'HD terms' it would classify as 1200p. But 1080p (at 1920 pixels wide) won't fit 😉

Reply 17 of 29, by Sune Salminen

The RR cap is a monitor limitation.

My monitor (Samsung 943BX) can run 1280x1024 at 75Hz via DVI on both Windows and OS X.

One thing that sucks about DVI is the lack of VESA modes. Over VGA you can go all the way up to the max resolution of your monitor, but on many cards, when using the DVI port, the highest available VESA resolution is 1280x1024. This limitation goes away when using a DVI-VGA adapter or the VGA port on the same video card.

Reply 18 of 29, by VileR

Kinda weird for a monitor to have an RR cap over DVI even though it handles higher rates over analog, but then again, my panels are a pair of old LGs from back when "digital" wasn't such a big deal yet, so I guess they just didn't put much thought into that part of the design.

@Sune: not sure what you mean by mentioning "VESA" specifically (I thought that didn't have much meaning these days, when the video drivers worry about the standards, or the lack thereof).
But since we're talking about digital panels and a digital interface, the only resolution worth considering is your monitor's native resolution (yet another limitation compared to analog CRT technology, but that's a different story). So if a card cannot support your digital monitor's native resolution over a digital connection, well... let's just be nice and call it useless.

All this talk of 1080p and widescreen reminds me: the whole widescreen thing seems to be kind of a ripoff too, in a way.

CRT was always limited to 4:3 or something very close, since the design has limitations and it's tough/expensive to produce anything wider. But when CRTs were on the way out, flat panel manufacturers realized that they didn't have to stick to the old standards anymore... and that a widescreen monitor gives you less screen area than a 4:3 monitor with the same diagonal measurement! (Do the math - see below.)
So looking at it that way, it's simply yet another opportunity for hardware manufacturers to short-change the customer. Why not just call it "shortscreen" (referring to the vertical) rather than "wide" ;)
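Here's that math, for the skeptical (a quick Python sketch; the 19" diagonal is just an arbitrary example size):

    from math import hypot

    def screen_area_sq_in(diagonal_in, aspect_w, aspect_h):
        # scale the aspect ratio so its diagonal equals the quoted screen diagonal
        scale = diagonal_in / hypot(aspect_w, aspect_h)
        return (aspect_w * scale) * (aspect_h * scale)

    for aspect in ((4, 3), (16, 10), (16, 9)):
        area = screen_area_sq_in(19, *aspect)
        print(f'19" {aspect[0]}:{aspect[1]} panel: {area:.0f} sq. in.')

    # prints roughly 173, 162 and 154 sq. in. - the wider the ratio,
    # the less glass you get for the same quoted diagonal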

Reply 19 of 29, by swaaye

HDMI problems on TVs are related to the TVs doing post-processing on the image. Sometimes (most of the time, maybe) you can't disable all of it, and that's why it looks terrible compared to straight VGA, which is intended for a PC connection. The TV manufacturers assume you are going to be connecting some sort of video playback device to their HDMI inputs, and they add that post-processing in order to fool people into thinking their TV has magical image-enhancing stuff. This is also where input lag comes from.

DVI and HDMI are compatible standards if you ignore the audio part so there's nothing inherently wrong with HDMI.

Widescreen monitors are great for movies, and for TV shows from about the past decade; that's why they are popular. 4:3 is not so hot anymore for that, unless you like your old-school TV shows that are 4:3. Monitor prices are pretty much lower than ever outside of really high-quality panels, so I'm not sure there's any ripping off going on.

Pumping VGA into a digital monitor like an LCD means the image goes through two extra rounds of processing: the video card pumps it through its DAC, and the LCD runs it through its ADC to get back the digital signal it wants. Not so great. You also have to deal with the LCD trying to auto-calibrate to the analog signal, because every video card puts out a slightly different signal and it's very unlikely to end up as perfectly sharp as a digital signal.

BTW, this web page is really useful for getting an LCD to calibrate its VGA input as well as possible:
http://www.techmind.org/lcd/phasing.html