First post, by Davros
You will soon need an active adapter if you want to run a CRT or an LCD that has only a DSub input.
http://newsroom.intel.com/community/intel_new … sing-out-analog
Guardian of the Sacred Five Terabytes of Gaming Goodness
Not a big surprise. With HDCP becoming required for digital content in the future, I'm sure companies want to phase out analog video as soon as they can get away with it.
Heh. Remember a couple of years ago when they were saying PCI was being phased out?
My main rig does not even have a single PCI slot, and it's 3 years old 😀
New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.
Not surprised, since I had to buy a VGA+audio-to-HDMI converter to connect my netbook to my new LG LED TV. The majority of current laptops are HDMI-only, so it's time to say goodbye to that bulky VGA cable.
I am surprised, however, to find more support in newer displays for composite than other analog inputs.
I guess people still use VCRs?
Composite is the king of low-budget devices like "Plug & Play TV Arcade Joystick" game systems, cheap digital video cameras and such. Being the lowest common denominator for video also means that cheapskate manufacturers usually won't include more than a composite video cable even if the device supports higher quality outputs.
You realize you are linking to an article dated December 8, 2010 as if it were recent news, right? It was always inevitable that the VGA port would be phased out, because it isn't a secure connection. DVI doesn't have long to live either, because it isn't secure enough. The content providers are pushing for this as a means to reduce piracy. Microsoft already has code in place in Windows to disallow pushing high-def content out to a recording device through an insecure connection; eliminating the connection itself is the next logical step. Everything is going to be HDMI and DisplayPort soon.
wrote:Heh. Remember a couple of years ago when they were saying PCI was being phased out?
Is there really any reason to keep PCI around any longer? Even a PCIe x1 slot has several times the bandwidth of PCI. It's time to let it go.
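For a rough sense of the gap, here is a sketch comparing the usual quoted theoretical peaks (real-world throughput is lower, and the exact figures here are the commonly cited maxima, not measurements):

```python
# Back-of-the-envelope comparison of legacy PCI vs a single PCIe lane.
# PCI bandwidth is shared by every device on the bus; PCIe bandwidth
# is per lane, per direction, per device.

PCI_32BIT_33MHZ_MBPS = 133   # 32 bits * 33.33 MHz ≈ 133 MB/s, shared
PCIE_1_0_X1_MBPS = 250       # 2.5 GT/s with 8b/10b encoding ≈ 250 MB/s each way
PCIE_2_0_X1_MBPS = 500       # 5.0 GT/s with 8b/10b encoding ≈ 500 MB/s each way

for name, bw in [("PCI (shared)", PCI_32BIT_33MHZ_MBPS),
                 ("PCIe 1.0 x1", PCIE_1_0_X1_MBPS),
                 ("PCIe 2.0 x1", PCIE_2_0_X1_MBPS)]:
    print(f"{name:14s} ~{bw} MB/s  ({bw / PCI_32BIT_33MHZ_MBPS:.1f}x PCI)")
```

And that is only a single lane; a x4 or x16 slot multiplies the PCIe numbers accordingly.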
wrote:Is there really any reason to keep PCI around any longer?
Not that I know of. But a lot of things seem to persist for far longer than they are useful.
I remember seeing a lot of PCIe only boards but now the PCI slots are back 😀
I still managed to find a mainboard with a floppy port as well, so products like that should be around for a long time.
The TV and video card I got last year both have VGA. So does the notebook.
wrote:DVI doesn't have long to live, either, because it isn't secure enough.
Care to explain why it is not secure enough? DVI has content protection just like HDMI does.
DVI will live in applications (such as medical devices) that do not require audio or content protection because it is licence free (cheaper) and has a more robust connector.
wrote:wrote:DVI doesn't have long to live, either, because it isn't secure enough.
Care to explain why it is not secure enough? DVI has content protection just like HDMI does.
DVI will live in applications (such as medical devices) that do not require audio or content protection because it is licence free (cheaper) and has a more robust connector.
How old is DVI? It came out in the 90s. Do you really think it's still secure enough for high-def content? Another thing: there are only so many ports that they can put on a video card or monitor, and since DVI is the oldest of those currently in use, as well as the one that takes up the most space, it is only logical to assume that DVI will be the next to go when the next big thing in monitor interfaces comes out.
wrote:How old is DVI....? It came out in the 90's. Do you really think it's still secure enough for high def content?
DVI carries precisely the same content as HDMI. The only difference is the pinout. HDMI-to-DVI adapters are passive devices.
That said, given the simplicity and reduced size of an HDMI connector, there's not much reason to use a DVI connector instead – unless one wants to be able to plug in a DVI-to-DSub adapter. (You'll note the OP refers to DVI-I, the variant that specifically allows those adapters to be plugged in.)
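To make the "passive adapter" point concrete, here is an illustrative sketch of what an HDMI-to-DVI-D adapter actually does: nothing but re-route wires, since both connectors carry the same three TMDS data pairs, a TMDS clock pair, and the DDC lines. (Signal names are descriptive, not official pin numbers.)

```python
# Illustrative signal map for a passive HDMI -> DVI-D (single link) adapter.
# Every electrical signal either passes straight through or is dropped;
# nothing is converted, which is why these adapters need no power.

HDMI_TO_DVI_D = {
    "TMDS Data2+/-":   "TMDS Data2+/-",
    "TMDS Data1+/-":   "TMDS Data1+/-",
    "TMDS Data0+/-":   "TMDS Data0+/-",
    "TMDS Clock+/-":   "TMDS Clock+/-",
    "DDC SCL/SDA":     "DDC SCL/SDA",    # I2C bus for EDID (and HDCP handshakes)
    "Hot Plug Detect": "Hot Plug Detect",
    "+5V":             "+5V",
    "CEC":             None,             # HDMI-only extra; simply dropped
}

unmapped = [sig for sig, dst in HDMI_TO_DVI_D.items() if dst is None]
print("Signals with no DVI-D counterpart:", unmapped)
```

Audio doesn't appear in the map because HDMI embeds it inside the TMDS data stream itself; a DVI sink just ignores those data islands.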
wrote:How old is DVI....? It came out in the 90's. Do you really think it's still secure enough for high def content?
The DVI 1.0 spec was released in April 1999; HDCP was started as a DVI add-on that same year, and HDCP 1.0 was released in February 2000. HDMI 1.0 wasn't released until December 2002, as a simple expansion of DVI to include audio and colorspaces other than RGB. The basic stuff has not changed since.
So yes, I do think that DVI is just as secure as HDMI.
I do agree that DVI is not of much use in consumer devices any more. The cheapest ones still have VGA only, no digital connectors.
All I can say is that Dsub and DVI are gonna be hot favourite topics in a few years time, right here in Vogons. Just like the beige cases and CRT monitors. 😁
The manufacturers have buried the CRTs, and now are digging the graves for the DVIs, I guess.
My AMD Radeon R9 290 GPUs are lacking analog outputs, so the article was right about AMD phasing it out in 2013.
wrote:My main rig does not even have a single PCI slot, and its 3 years old 😀
My main rig doesn't have a single pci-e slot... and it is four years old. 😉
-- Regards, Joe
Expect out of life, that which you put into it.
Latest Radeons already dropped analog video... I was considering getting one until I discovered that. Nvidia cards still have proper DVI-I ports though 😀
PCI slots are becoming less common, and even if you can find a board with one it's most often located right below the first x16 slot.. which is where you'll want your dual-slot graphics card.
PCIe-to-PCI adapters/extenders are relatively common and cheap, though. They turn an x1 PCIe slot into 2 PCI slots. However, I'm not sure how reliable they are, or whether there are driver-related oddities.
This isn't surprising. I use DVI, and the quality is vastly superior to VGA.
With a digital video card it makes sense to use digital straight through instead of Digital->DAC->ADC->Display
That said, I think HDCP is evil to the core
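The Digital -> DAC -> ADC -> Display chain mentioned above can be illustrated with a toy simulation: each analog hop adds a little noise, and the re-quantization at the ADC can push a pixel into the wrong value. The noise level here is an arbitrary assumption purely for illustration, not a measurement of any real cable.

```python
import random

def dac_adc_roundtrip(pixel, noise=1.5, rng=random.Random(42)):
    """Send one 8-bit pixel value through a simulated analog hop."""
    analog = pixel + rng.gauss(0, noise)      # DAC output plus cable/analog noise
    return min(255, max(0, round(analog)))    # ADC re-quantizes back to 8 bits

pixels = list(range(0, 256, 16))
errors = sum(1 for p in pixels if dac_adc_roundtrip(p) != p)
print(f"{errors}/{len(pixels)} sample pixels changed value after the analog round trip")
```

A pure digital link (DVI/HDMI) skips both conversions, so as long as the TMDS link is error-free the pixel values arrive bit-exact.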