VOGONS


First post, by ruthan

User metadata
Rank: Oldbie

Hello,
What I find annoying is that some old graphics cards use the D-Sub connector as the primary video output for boot, even when they also have DVI. Is there some solution for that, like a BIOS mod?

Last edited by ruthan on 2019-04-11, 20:28. Edited 3 times in total.

I'm an old, goal-oriented goatman; I care about facts and freedom, not about egos + prejudices. Hoarding = sickness. If you want respect, gain it by your behavior. I hate stupid SW limits; SW = virtual world, everything should be possible if you have enough raw HW.

Reply 2 of 28, by ruthan

User metadata
Rank: Oldbie
agent_x007 wrote:

DVI-I to VGA adapter ?

I need exactly the opposite. VGA to DVI needs an expensive active adapter, and even then the picture would be worse than a pure digital connection.


Reply 3 of 28, by Ozzuneoj

User metadata
Rank: l33t

I would recommend checking the small components around the DVI port. I had a couple of cards that appeared to do this, but only because they were damaged. The inductors (usually small black rectangles) and capacitors near the outputs are very susceptible to damage, because they get chipped off by the back plates of other cards when cards are stacked or heaped in a box carelessly.

Now for some blitting from the back buffer.

Reply 4 of 28, by ruthan

User metadata
Rank: Oldbie

So you are claiming that there is no such thing as D-Sub being the primary output by design, and that this is just my wrong assumption?

I always thought that it was a feature by design. At least my newer cards with a zillion outputs need some time to discover which output can be used for the BIOS screen, and, if more outputs are connected, which one is primary for the BIOS. I don't think this is a documented feature, even on newer cards; some send a picture to all or multiple outputs at boot, some don't.
I don't care about it too much if the outputs are digital, since you can convert one to the other, but with analog it's a problem. I also thought it was vendor-dependent.

I mainly have this problem with Radeon X300-X800 cards, GeForce 6 cards, Radeon X1300, etc. Even GeForce 6200 and 6600 cards behave differently, and there could also be some dependency on the monitor vendor; I have HP 2475W and ZR24W and Dell 2007FP screens. I know that sometimes the HDMI switch is the problem, but when I suspect such a situation I try a direct connection, and sometimes even that doesn't work.

Does this theory assume that only the boot output is broken on the card, or does it mean that DVI will not work at all? Because when I can't activate the second output from a modern OS, I always consider the card defective; but in my head, an output that doesn't work at boot is something different - I could be wrong.

I'm not saying that it can't be the result of card damage; I never had a brand-new GeForce 5/6/7 or Radeon Xxxx card.


Reply 5 of 28, by havli

User metadata
Rank: Oldbie

I'm not sure if I understand what your problem is.

You have two monitors connected (one DVI, one VGA) and you only see the BIOS on the VGA output? That is most likely by design and you can't change it. It is also possible that different cards have different output priorities. For example, on modern GeForce cards DP has a lower priority than DVI or HDMI, so in my case I can only see the BIOS on the DVI monitor and my DP monitor wakes up after Windows boots.
Radeons, on the other hand, clone the image to all outputs until the OS boots. Some of them, at least.

If you have just one screen connected, then of course it should display the BIOS and everything else, no matter how it is connected.

HW museum.cz - my collection of PC hardware

Reply 6 of 28, by ruthan

User metadata
Rank: Oldbie

Multiple displays are a different problem. I have cards with a 15-pin D-Sub and DVI; when I connect a monitor only to DVI I get no picture at all, but when I connect it to VGA I get one. So I always thought that some cards have D-Sub (VGA) as the default, and only working, connection when just one monitor is connected.


Reply 7 of 28, by havli

User metadata
Rank: Oldbie

Well, if only one monitor is connected, then the card should autodetect where it is and send the signal to that port. From what I remember, that worked for me most of the time. Both my GF3 Ti and GF4 Ti worked, and most other cards too. But I remember one GF 6800 GS which didn't work with a DVI monitor at all. I consider that a defect, not a feature.


Reply 8 of 28, by Tronix

User metadata
Rank: Member

It depends on what you mean by "old video cards". For example, many 486 / Pentium 1 era video cards have a feature connector. I don't know the technical details beyond the pinout, but it is possible that digital R, G, B, Hsync and Vsync signals go to it.

https://github.com/Tronix286/

Reply 9 of 28, by ruthan

User metadata
Rank: Oldbie

I only care about GeForce 5 / Radeon 9200 and more modern cards; these are, if I'm not wrong, the first cards with DVI. Matrox had something too. As for 3dfx, I never saw a Voodoo 4/5 with a digital output, and Voodoo 3 was definitely still analog only.
I looked at some GeForce 6 cards and saw that some manufacturers put 2x DVI on the GeForce 6600 - no such problem there - but most have DVI + VGA. GeForce 79xx has 2x DVI too, but there are some low-end cards like the GeForce 7300 which still have analog + DVI. For Radeon X8xx I only have cards with analog + DVI; I'm not sure if there were 2x digital variants. 2x DVI would be a solution to my problem too, but I would need to replace the card, if this is not a BIOS-related problem.

The truth is I never really tested a dual-monitor setup on such cards, so if DVI didn't work for boot, I used analog, but I'm not sure whether DVI worked at all; I will test it. I have, for example, a GeForce 5600 which boots with DVI but the picture is broken - some additional lines, and in Windows part of the picture is missing (so it's defective for sure) - but analog is fine. Other cards don't give me a DVI picture at boot at all; that is the problem.


Reply 10 of 28, by ruthan

User metadata
Rank: Oldbie

I have tested other X700 / X800 cards with VGA + DVI from various manufacturers, and I really think my theory is right - I never got a boot picture on the DVI port. Can someone confirm that it works for them? It could be some incompatibility with my monitors; I don't believe that all these cards have dead DVI ports, there are too many of them.

I know that there are some high-end Radeons with 2x DVI, but that is only a workaround.


Reply 11 of 28, by Tiido

User metadata
Rank: l33t

It depends on the particular BIOS used on the video card. Some of mine show the same image on both outputs, some show output on only one of them, and on some the VGA output is a fixed resolution scaled to the attached display's "native resolution", making things look blurry. Nearly all my ATI cards show an image on both outputs, while nVidia ones generally show it on only one.

T-04YBSC, a new YMF71x based sound card & Official VOGONS thread about it
Newly made 4MB 60ns 30pin SIMMs ~
what are you reading? you won't understand it anyway 😜

Reply 12 of 28, by spiroyster

User metadata
Rank: Oldbie
ruthan wrote:

I have tested other X700 / X800 cards with VGA + DVI from various manufacturers, and I really think my theory is right - I never got a boot picture on the DVI port.

If this were the case, no one would be able to access the BIOS unless they had a VGA monitor plugged in. I haven't used VGA on a modern setup for a while, and I can't remember ever not being able to access the BIOS.

Reply 13 of 28, by swaaye

User metadata
Rank: l33t++

X800 definitely outputs DVI on boot if it's connected. These single-link DVI cards can be picky about monitors though. There were compliance problems with the upper limits like 1920x1080/1200. Sometimes you don't get an image until the display driver loads. Seems like the display driver is much more capable of negotiating with the monitor than the card's firmware is.
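
For reference, here is a rough, hypothetical sketch of the arithmetic behind that single-link limit: single-link TMDS tops out at 165 MHz, and the pixel clock is horizontal total x vertical total x refresh rate, with blanking included. The timing totals below are standard CEA / CVT / CVT-RB figures rather than values read from any particular monitor, so treat this purely as an illustration of how tight the margin is at 1920x1200.

```python
# Sketch: why 1920x1200 sits right at the edge of what a single-link
# DVI output can drive. Timing totals are assumed standard values.
SINGLE_LINK_LIMIT_MHZ = 165.0  # single-link TMDS clock ceiling

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock = total horizontal pixels x total vertical lines x refresh."""
    return h_total * v_total * refresh_hz / 1e6

modes = {
    "1920x1080@60 (CEA, 2200x1125 total)":        (2200, 1125, 60.00),
    "1920x1200@60 CVT-RB (2080x1235 total)":      (2080, 1235, 59.95),
    "1920x1200@60 CVT full blanking (2592x1245)": (2592, 1245, 59.88),
}

for name, (ht, vt, hz) in modes.items():
    clk = pixel_clock_mhz(ht, vt, hz)
    verdict = "fits" if clk <= SINGLE_LINK_LIMIT_MHZ else "exceeds"
    print(f"{name}: {clk:.1f} MHz -> {verdict} the single-link limit")
```

Whether a given card's firmware actually negotiates the reduced-blanking timing is another question; this only shows that 1920x1200 needs reduced blanking to fit under single-link DVI at all.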

Reply 14 of 28, by havli

User metadata
Rank: Oldbie

Btw - X800 doesn't work with HDMI monitors (using DVI->HDMI adapter). But this is not related to this issue.


Reply 15 of 28, by ruthan

User metadata
Rank: Oldbie
havli wrote:

Btw - X800 doesn't work with HDMI monitors (using DVI->HDMI adapter). But this is not related to this issue.

For one of my monitors I'm using a DVI to HDMI cable into an HDMI switch, and from the switch an HDMI to DVI cable to a Dell 1600x1200 monitor (I'm using one monitor for multiple testbenches / machines); the second is a direct DVI connection to a 24" HP 1920x1200 monitor. I can test a direct connection for the first monitor, but I doubt it will work; I would at least expect some out-of-range signal message, but the monitor is completely ignored when it's connected to the X800.


Reply 16 of 28, by swaaye

User metadata
Rank: l33t++

I tested an X800XT AGP that I have here this evening, and it does output DVI and VGA at the same time. I was able to see POST and access the BIOS. Both outputs ran at the same resolution, so they must be linked until the driver loads.

HDMI could be problematic. You may need to use a DVI EDID emulator to hide the HDMI information from the Radeon. I have had success with this setup.
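
For anyone curious what such an emulator is actually hiding, here is a small hypothetical sketch (assuming you can get a raw EDID dump, e.g. from /sys/class/drm/.../edid on a Linux box or from an EDID reader) that checks whether the monitor advertises the HDMI vendor-specific block in its CEA-861 extension - the part that tells the card the sink is HDMI rather than plain DVI. The file path is only an example.

```python
# Sketch: scan a raw EDID dump for the HDMI Vendor-Specific Data Block
# (IEEE OUI 00-0C-03). A DVI EDID emulator presents an EDID without this
# block, so the card treats the sink as a plain DVI monitor.

def has_hdmi_vsdb(edid: bytes) -> bool:
    if len(edid) < 128 or edid[126] == 0:
        return False  # no extension blocks -> DVI-style EDID
    for n in range(1, edid[126] + 1):           # walk the extension blocks
        block = edid[128 * n:128 * (n + 1)]
        if len(block) < 4 or block[0] != 0x02:  # 0x02 = CEA-861 extension
            continue
        dtd_offset = block[2]                   # data blocks end where DTDs begin
        i = 4
        while i < dtd_offset:
            tag, length = block[i] >> 5, block[i] & 0x1F
            # tag 3 = vendor-specific block; OUI stored little-endian
            if tag == 3 and length >= 3 and block[i + 1:i + 4] == b"\x03\x0c\x00":
                return True
            i += 1 + length
    return False

if __name__ == "__main__":
    with open("/sys/class/drm/card0-DVI-D-1/edid", "rb") as f:  # example path
        print("HDMI sink advertised:", has_hdmi_vsdb(f.read()))
```

A plain DVI monitor's EDID (or the emulator's) simply has no such block, so the Radeon falls back to treating the link as ordinary DVI.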

Reply 17 of 28, by ruthan

User metadata
Rank: Oldbie

Thanks, which brand is your card?


Reply 19 of 28, by ruthan

User metadata
Rank: Oldbie

BTW, for which OSes are you using these X8xx cards? Win7 is unsupported, if I'm not wrong, so only Windows 98 and XP are left. I would say that at least for older XP games it's too slow... so are you using them for Win98?
In another thread some people said that the Win98 compatibility of these cards sucks, especially in OpenGL; I'm not sure about that.
