VOGONS


First post, by Davros

User metadata
Rank l33t

You will soon need an active adapter if you want to run a CRT, or an LCD that only has a D-sub input.
http://newsroom.intel.com/community/intel_new … sing-out-analog

Guardian of the Sacred Five Terabytes of Gaming Goodness

Reply 1 of 18, by laxdragon

User metadata
Rank Member

Not a big surprise. With HDCP becoming required for digital content in the future, I'm sure companies want to phase out analog video as soon as they can get away with it.

laxDRAGON.com | My Game Collection | My Computers | YouTube

Reply 3 of 18, by Skyscraper

User metadata
Rank l33t

My main rig does not even have a single PCI slot, and it's 3 years old 😀

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 4 of 18, by Norton Commander

User metadata
Rank Member

Not surprised, since I had to buy a VGA+audio->HDMI converter to connect my netbook to my new LG LED TV. The majority of current laptops are HDMI only, so it's time to say goodbye to that bulky VGA cable.

I am surprised, however, to find more support in newer displays for composite than other analog inputs.

I guess people still use VCRs?

Reply 5 of 18, by 133MHz

User metadata
Rank Oldbie

Composite is the king of low-budget devices like "Plug & Play TV Arcade Joystick" game systems, cheap digital video cameras and such. Being the lowest common denominator for video also means that cheapskate manufacturers usually won't include more than a composite video cable even if the device supports higher quality outputs.

http://133FSB.wordpress.com

Reply 6 of 18, by sliderider

User metadata
Rank l33t++

You realize you are linking to an article dated December 8, 2010 as if it were recent news, right? It was always inevitable that the VGA port was going to be phased out because it isn't a secure connection. DVI doesn't have long to live, either, because it isn't secure enough. It is the content providers that are pushing for this as a means to reduce piracy. Microsoft already has code in place in Windows to refuse to push high-def content out to a recording device over an insecure connection; eliminating the connection itself is the next logical step. Everything is going to be HDMI and DisplayPort soon.

Jorpho wrote:

Heh. Remember a couple of years ago when they were saying PCI was being phased out?

Is there really any reason to keep PCI around any longer? Even a single PCIe lane has several times the bandwidth of PCI. It's time to let it go.
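
For rough numbers, assuming plain 32-bit/33 MHz PCI and a first-generation PCIe lane:

PCI: 33 MHz x 4 bytes ≈ 133 MB/s, shared by every device on the bus
PCIe 1.x, one lane: ≈ 250 MB/s per direction (≈ 500 MB/s on PCIe 2.0), dedicated to the card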

Last edited by sliderider on 2013-10-02, 21:43. Edited 1 time in total.

Reply 7 of 18, by Jorpho

User metadata
Rank l33t++
sliderider wrote:

Is there really any reason to keep PCI around any longer?

Not that I know of. But a lot of things seem to persist for far longer than they are useful.

Reply 8 of 18, by Mau1wurf1977

User metadata
Rank l33t++

I remember seeing a lot of PCIe-only boards, but now the PCI slots are back 😀

I still managed to find a mainboard with a floppy port as well, so there should be products like that around for a long time.

The TV and video card I got last year both have VGA, and so does the notebook.

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 9 of 18, by Jepael

User metadata
Rank Oldbie
sliderider wrote:

DVI doesn't have long to live, either, because it isn't secure enough.

Care to explain why it is not secure enough? DVI has content protection just like HDMI does.

DVI will live on in applications (such as medical devices) that do not require audio or content protection, because it is licence-free (cheaper) and has a more robust connector.

Reply 10 of 18, by sliderider

User metadata
Rank l33t++
Jepael wrote:
sliderider wrote:

DVI doesn't have long to live, either, because it isn't secure enough.

Care to explain why it is not secure enough? DVI has content protection just like HDMI does.

DVI will live on in applications (such as medical devices) that do not require audio or content protection, because it is licence-free (cheaper) and has a more robust connector.

How old is DVI? It came out in the '90s. Do you really think it's still secure enough for high-def content? Another thing: there are only so many ports that they can put on a video card or monitor, and being the oldest of those currently in use, as well as the one that takes up the most space, it is only logical to assume that DVI will be the next to go when the next big thing in monitor interfaces comes out.

Reply 11 of 18, by Jorpho

User metadata
Rank l33t++
sliderider wrote:

How old is DVI? It came out in the '90s. Do you really think it's still secure enough for high-def content?

DVI carries precisely the same content as HDMI. The only difference is the pinout. HDMI-to-DVI adapters are passive devices.

That said, given the simplicity and reduced size of an HDMI connector, there's not much reason to use a DVI connector instead – unless one wants to be able to plug in a DVI-to-D-sub adapter. (You'll note the OP refers to DVI-I, the variant that specifically allows those adapters to be plugged in.)

Last edited by Jorpho on 2013-10-04, 15:21. Edited 1 time in total.

Reply 12 of 18, by Jepael

User metadata
Rank Oldbie
sliderider wrote:

How old is DVI? It came out in the '90s. Do you really think it's still secure enough for high-def content?

The DVI 1.0 spec was released in April 1999, HDCP started as a DVI add-on in 1999, and HDCP 1.0 was released in February 2000. HDMI 1.0 wasn't released until December 2002, as a simple extension of DVI to include audio and colorspaces other than RGB. The basic stuff has not changed since.

So yes, I do think that DVI is just as secure as HDMI.

I do agree that DVI is not of much use in consumer devices any more. The cheapest ones still have VGA only, with no digital connectors.

Reply 13 of 18, by Malik

User metadata
Rank l33t

All I can say is that D-sub and DVI are gonna be hot favourite topics in a few years' time, right here in VOGONS. Just like beige cases and CRT monitors. 😁

The manufacturers have buried the CRTs, and now are digging the graves for the DVIs, I guess.

SB DOS Drivers

Reply 14 of 18, by Kamerat

User metadata
Rank Oldbie

My AMD Radeon R9 290 GPUs lack analog outputs, so the article was right about AMD phasing it out in 2013.

DOS Sound Blaster compatibility: PCI sound cards vs. PCI chipsets
YouTube channel

Reply 15 of 18, by JoeCorrado

User metadata
Rank Member
Skyscraper wrote:

My main rig does not even have a single PCI slot, and it's 3 years old 😀

My main rig doesn't have a single PCIe slot... and it is four years old. 😉

-- Regards, Joe

Expect out of life, that which you put into it.

Reply 16 of 18, by jwt27

User metadata
Rank Oldbie

The latest Radeons have already dropped analog video... I was considering getting one until I discovered that. Nvidia cards still have proper DVI-I ports, though 😀

PCI slots are becoming less common, and even if you can find a board with one, it's most often located right below the first x16 slot... which is where you'll want your dual-slot graphics card.

Reply 17 of 18, by Maeslin

User metadata
Rank Member

PCIe --> PCI adapters/extenders are relatively common and cheap, though. They turn an x1 PCIe slot into 2 PCI slots. However, I'm not sure how reliable they are or whether there are driver-related oddities.

Reply 18 of 18, by smeezekitty

User metadata
Rank Oldbie

This isn't surprising. I use DVI and the quality is vastly superior to VGA.
With a digital video card it makes sense to go digital straight through instead of Digital->DAC->ADC->Display.

That said, I think HDCP is evil to the core.