Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI, can X8xx with VGA+DVI show boot as DVI?

Discussion about old graphics cards, monitors and video related things.

Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI, can X8xx with VGA+DVI show boot as DVI?

Postby ruthan » 2019-3-16 @ 16:16

Hello,
What I find annoying is that some old graphics cards use D-SUB as the primary video output at boot even when they have DVI. Is there some solution for that, like a BIOS mod?
Last edited by ruthan on 2019-4-11 @ 20:28, edited 3 times in total.
Im old goal oriented goatman, i care about facts and freedom, not about egos+prejudices. Hoarding=sickness. If you want respect, gain it by your behavior. I hate stupid SW limits, SW=virtual world, everything should be possible if you have enough HW.
User avatar
ruthan
Oldbie
 
Posts: 987
Joined: 2013-3-07 @ 04:01
Location: Schwarz Wald-from France to Ukraine, from Denmark to Austria. Celts+German+Slavs melting pot.

Re: Old videocards change primary video output from D-SUB to DVI?

Postby agent_x007 » 2019-3-16 @ 18:56

DVI-I to VGA adapter ?
[attached image: DVI-I to VGA adapter]
User avatar
agent_x007
Oldbie
 
Posts: 1115
Joined: 2016-1-19 @ 11:06

Re: Old videocards change primary video output from D-SUB to DVI?

Postby ruthan » 2019-3-16 @ 19:33

agent_x007 wrote:DVI-I to VGA adapter ?

I need exactly the opposite. VGA to DVI needs an expensive active adapter, and even then the picture would be worse than a pure digital signal.

Re: Old videocards change primary video output from D-SUB to DVI?

Postby Ozzuneoj » 2019-3-16 @ 20:50

I would recommend checking the small components around the DVI port. I had a couple of cards that appeared to behave this way, but they were actually doing it only because they were damaged. I think the inductors (usually small black rectangles) and capacitors near the outputs are very susceptible to damage, because they get chipped off by the back plates of other cards when they are stacked or heaped in a box unsafely.
Time Machine = FIC PA-2013 2.1 - K6-2 500 - 256MB PC-100 - TNT2 Pro 16MB AGP - Labway Yamaha YMF719-E - Midiman MM401
User avatar
Ozzuneoj
Oldbie
 
Posts: 1530
Joined: 2016-3-16 @ 21:33

Re: Old videocards change primary video output from D-SUB to DVI?

Postby ruthan » 2019-3-16 @ 22:00

So you are saying that there is no such thing as D-SUB being the primary output by design, and that this is just my wrong assumption?

I always thought that it is a feature by design. At least my newer cards with a zillion outputs need some time to discover which output can be used for the BIOS screen and, if more outputs are connected, which one is primary for the BIOS. I don't think it is even a documented feature on newer cards; some send the picture at boot to all or multiple outputs, some don't.
I don't care about it too much if the outputs are digital, since you can convert one to the other, but with analog it is a problem. I also thought it might be vendor specific.

I mainly have this problem with Radeon X300-X800 cards, GeForce 6 cards, Radeon X1300 etc. Even GeForce 6200 and 6600 cards behave differently. There could also be some dependency on the monitor vendor; I have HP 2475W and ZR24W and Dell 2007FP screens. I know that sometimes the problem is an HDMI switch, but when I suspect such a situation I try a direct connection, and sometimes even that does not work.

Does this theory suppose that only the boot output is broken on the card, or that DVI will not work at all? When I can't activate the second output from a modern OS, I always consider the card defective, but in my head an output that doesn't work at boot is something different. I could be wrong.

I'm not saying it can't be the result of card damage; I never had a brand new GeForce 5/6/7 or Radeon Xxxx card.

Re: Old videocards change primary video output from D-SUB to DVI?

Postby havli » 2019-3-16 @ 22:50

I'm not sure I understand what your problem is.

You have two monitors connected (one DVI, one VGA) and you only see the BIOS on the VGA output? That is most likely by design and you can't change it. It is also possible that different cards have different output priorities. For example, on modern GeForce cards DP has lower priority than DVI or HDMI, so in my case I can only see the BIOS on the DVI monitor, and my DP monitor wakes up after Windows boots.
Radeons, on the other hand, clone the image to all outputs until the OS boots. Some of them, at least.

If you have just one screen connected, then of course it should display the BIOS and everything else, no matter how it is connected.
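The priority behaviour described above can be sketched as a simple lookup. This is purely illustrative: the real logic lives in the card's undocumented VBIOS, and the priority order below is an assumption for the sake of the example, not a documented one.

```python
# Hypothetical priority order, highest first (assumption: DVI/HDMI
# preferred over DP, as reported for some GeForce cards above).
PRIORITY = ["DVI", "HDMI", "VGA", "DP"]

def pick_boot_output(connected):
    """Return the connector that would receive the BIOS/POST image."""
    for port in PRIORITY:
        if port in connected:
            return port
    return None  # no display detected at all

print(pick_boot_output({"DP", "DVI"}))  # DVI wins over DP
print(pick_boot_output({"DP"}))         # only DP connected -> DP gets the image
```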
HW museum.cz - my collection of PC hardware
User avatar
havli
Oldbie
 
Posts: 771
Joined: 2014-11-07 @ 16:51
Location: Czech Republic

Re: Old videocards change primary video output from D-SUB to DVI?

Postby ruthan » 2019-3-17 @ 00:42

Multiple displays is a different problem. I have cards with 15-pin D-SUB and DVI where, when I connect a monitor only to DVI, I get no picture at all, but when I connect it to VGA I get one. So I always thought some cards have D-SUB (VGA) as the default and only connection when just one monitor is attached.

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI?

Postby havli » 2019-3-17 @ 09:30

Well, if only one monitor is connected, then the card should autodetect where it is and send the signal to that port. From what I remember, that worked for me most of the time; both GF3 Ti and GF4 Ti worked, and most other cards too. But I remember one GF 6800 GS which didn't work with a DVI monitor at all. I consider that a defect, not a feature.

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI?

Postby Tronix » 2019-3-17 @ 11:10

It depends on what you mean by "old video cards". For example, many 486/Pentium-era video cards have a feature connector. I don't know enough of the technical details beyond the pinout, but it is possible that digital R, G, B, Hsync and Vsync signals go to it.
User avatar
Tronix
Newbie
 
Posts: 56
Joined: 2015-4-26 @ 13:39
Location: Moscow, Russia

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI?

Postby ruthan » 2019-3-17 @ 12:59

I only care about GeForce 5 / Radeon 9200 and more modern cards; if I'm not wrong, these are the first cards with DVI. Matrox had something too. As for 3dfx, I never saw a Voodoo 4/5 with a digital output, and Voodoo 3 was still analog for sure.
I looked at some GeForce 6 cards. I saw that some manufacturers put 2x DVI on the GeForce 6600, so no such problem there, but most have DVI+VGA. The GeForce 79xx has 2x DVI too, but there are some low-end cards like the GeForce 7300 which still have analog + DVI. For the Radeon X8xx I only have cards with analog + DVI; I'm not sure whether there were 2x digital variants. 2x DVI would solve my problem too, but I would need to replace the card, if this is not a BIOS-related problem.

The truth is I never really tested a dual monitor setup on such cards, so when DVI did not work for boot, I used analog, and I'm not sure if DVI worked at all; I will test it. I have, for example, a GeForce 5600 which boots with DVI, but the picture is broken (some additional lines, and in Windows part of the picture is missing, so it is defective for sure), while analog is fine. Other cards give me no DVI picture at boot at all; that is the problem.

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI?

Postby ruthan » 2019-4-07 @ 12:09

I have tested other X700/X800 cards with VGA+DVI from various manufacturers, and I really think my theory is right: I never got a boot picture on the DVI port. Can someone confirm that it works for them? It could be some incompatibility with my monitors; I don't believe all these cards have dead DVI ports, there are too many of them.

I know that there are some high-end Radeons with 2x DVI, but that is only a workaround.

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI, can X800 with VGA+DVI show boot as DVI?

Postby Tiido » 2019-4-07 @ 21:15

It depends on the particular BIOS used on the video card. Some of my cards show the same image on both outputs, some show a picture on only one of the outputs, and on some the VGA output is a fixed resolution scaled to the attached display's "native resolution", making things look blurry. Nearly all my ATI cards show an image on both outputs, while my nVidia ones generally show it on only one.
User avatar
Tiido
Oldbie
 
Posts: 818
Joined: 2018-1-14 @ 04:40
Location: Estonia

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI?

Postby spiroyster » 2019-4-08 @ 09:44

ruthan wrote:I have tested other X700/X800 cards with VGA+DVI from various manufacturers, and I really think my theory is right: I never got a boot picture on the DVI port.

If that were the case, no one would be able to access the BIOS unless they had a VGA monitor plugged in. I haven't used VGA on a modern setup for a while, and I can't remember ever being unable to access the BIOS.
User avatar
spiroyster
Oldbie
 
Posts: 508
Joined: 2015-10-12 @ 12:26

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI, can X800 with VGA+DVI show boot as DVI?

Postby swaaye » 2019-4-08 @ 17:51

X800 definitely outputs DVI on boot if it's connected. These single-link DVI cards can be picky about monitors though. There were compliance problems with the upper limits like 1920x1080/1200. Sometimes you don't get an image until the display driver loads. Seems like the display driver is much more capable of negotiating with the monitor than the card's firmware is.
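The arithmetic behind those upper limits is easy to check: single-link DVI tops out at a 165 MHz pixel clock, and 1920x1200 at 60 Hz only fits under that with reduced blanking. A rough sketch, using approximate CVT total (active + blanking) timings as the assumption:

```python
SINGLE_LINK_DVI_MHZ = 165.0  # single-link TMDS pixel clock ceiling

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame times refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 1920x1200@60 with standard CVT blanking (~2592 x 1245 total pixels)
standard = pixel_clock_mhz(2592, 1245, 60)
# 1920x1200@60 with CVT reduced blanking (~2080 x 1235 total pixels)
reduced = pixel_clock_mhz(2080, 1235, 60)

print(f"standard blanking: {standard:.1f} MHz")  # well above the 165 MHz limit
print(f"reduced blanking:  {reduced:.1f} MHz")   # fits under 165 MHz
```

So a monitor that offers 1920x1200 only with full blanking simply cannot be driven over single-link DVI, which is one reason negotiation at these limits gets picky.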
User avatar
swaaye
Moderator
 
Posts: 7417
Joined: 2002-7-22 @ 21:24
Location: WI, USA

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI, can X800 with VGA+DVI show boot as DVI?

Postby havli » 2019-4-08 @ 18:48

Btw - X800 doesn't work with HDMI monitors (using DVI->HDMI adapter). But this is not related to this issue.

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI, can X800 with VGA+DVI show boot as DVI?

Postby ruthan » 2019-4-08 @ 22:28

havli wrote:Btw - X800 doesn't work with HDMI monitors (using DVI->HDMI adapter). But this is not related to this issue.

For one of my monitors I'm using a DVI-to-HDMI cable into an HDMI switch, and from the switch an HDMI-to-DVI cable to a Dell 1600x1200 monitor (I use one monitor for multiple testbenches/machines); the second monitor is a direct DVI connection to a 24" HP 1920x1200. I can test a direct connection for the first monitor, but I doubt it will work; I would at least expect some out-of-range signal message, but the input is completely ignored when it is connected to the X800.

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI, can X800 with VGA+DVI show boot as DVI?

Postby swaaye » 2019-4-08 @ 23:12

I tested an X800 XT AGP which I have this evening, and it does output DVI and VGA at the same time. I was able to see POST and access the BIOS. Both outputs ran at the same resolution, so they must be linked until the driver loads.

HDMI could be problematic. You may need to use a DVI EDID emulator to hide the HDMI information from the Radeon. I have had success with this setup.
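For background on why hiding the HDMI information helps: an HDMI-capable display carries a CEA-861 extension block after its 128-byte base EDID, while a plain DVI EDID emulator typically presents only the base block. A minimal sketch of that check follows; the fake EDID bytes and the function name are illustrative only, and strictly speaking HDMI support is signalled by a vendor-specific data block inside the CEA extension, not by the extension's mere presence.

```python
def looks_like_hdmi(edid: bytes) -> bool:
    """Heuristic: does this EDID carry a CEA-861 extension block?"""
    if len(edid) < 128:
        raise ValueError("EDID base block is 128 bytes")
    ext_count = edid[126]          # byte 126: number of extension blocks
    if ext_count == 0 or len(edid) < 256:
        return False
    return edid[128] == 0x02       # first extension tagged CEA-861

# Fake EDIDs for illustration (header, descriptors and checksums zeroed).
dvi_edid = bytes(128)                                           # base block only
hdmi_edid = bytes(126) + bytes([1, 0]) + bytes([0x02]) + bytes(127)

print(looks_like_hdmi(dvi_edid))   # False
print(looks_like_hdmi(hdmi_edid))  # True
```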

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI, can X800 with VGA+DVI show boot as DVI?

Postby ruthan » 2019-4-09 @ 18:24

Thanks, which brand is your card?

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI, can X800 with VGA+DVI show boot as DVI?

Postby NJRoadfan » 2019-4-09 @ 18:50

My ATI All-in-Wonder X800XL PCIe boots off DVI-D by default, no problem. Right now it's connected to an HDTV via a DVI-D to HDMI adapter.
NJRoadfan
Oldbie
 
Posts: 940
Joined: 2012-5-26 @ 03:54
Location: Northern NJ

Re: Old videocards change primary video output from D-SUB (VGA-15 pin) to DVI, can X800 with VGA+DVI show boot as DVI?

Postby ruthan » 2019-4-09 @ 19:25

BTW, for which OSes are you using these X8xx cards? Win7 is unsupported if I'm not wrong, so only Windows 98 and XP are left as options. I would say that at least for older XP games it is too slow, so are you using it for Win98?
In another thread some people said that the Win98 compatibility of these cards sucks, especially in OpenGL; I'm not sure about that.
