VOGONS


First post, by Pino


I'm trying to use my vintage computers on the same monitor I use with my modern computer. The monitor is an LG 32" QHD, 144 Hz, and it only has HDMI and DisplayPort inputs.

I bought a cheap DVI-to-HDMI adapter and use it in combination with a generic HDMI cable. Here is what happens:

- If I use an AGP GeForce 6200 on either of my vintage computers (a Pentium II running Win98SE and DOS, an Athlon XP running Windows XP), I get an image at boot, in the BIOS, in DOS, in Windows at 1024x768 32-bit, and even up to 1920x1080 on XP.

- If I use any other AGP card from my collection with DVI out (Radeon 9700 Pro, GeForce FX 5700 Ultra, GeForce4 Ti 4200), I get an image at boot, in the BIOS, and in DOS, but as soon as Windows starts the monitor goes black. I'm using the same driver for these GeForces as I use with the 6200.
If I boot into safe mode I get an image in Windows, but only at 640x480 8-bit; if I change to anything else, the monitor goes black.

Any idea on what could be going on?

Thanks

Reply 1 of 4, by progman.exe


I had PCI and AGP cards in a Win2k machine back in the day, and at least once I thought a reinstall had failed: Windows was doing the install on the AGP card's output, but on first boot it used the PCI card. That monitor was off, so I got just black screens...

Anyway, could it be that with the latter cards Windows doesn't know what they really are, so it is only enabling output (VGA?) on the VGA port? Or Windows does know, and is defaulting to a single monitor on the VGA output only.

Maybe you need to use the GUI on the VGA output to tick the box to extend the display, then change the primary, and un-extend from the VGA port. But you have no VGA monitor... Can you access useful display properties in safe mode? Or safe mode with VGA (which might not work).

Another option, maybe hardcore, is to figure out the display settings in the registry and edit them in safe mode.
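For what it's worth, on Win9x the current display mode lives under HKEY_CURRENT_CONFIG — a rough sketch of what such a registry tweak might look like (key and value names from memory, so verify against your own machine's export before importing anything):

```reg
REGEDIT4

; Hypothetical example for Win98SE: force a conservative mode from safe mode
; so the card (hopefully) comes back up on something the DVI output can sync.
; Back up the registry first.
[HKEY_CURRENT_CONFIG\Display\Settings]
"Resolution"="800,600"
"BitsPerPixel"="16"
```

On XP the equivalent values (DefaultSettings.XResolution etc.) sit under the hardware profile's VIDEO keys instead, which are per-adapter and messier to edit by hand.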

You can't be into retro computing without relishing the thrill of killing something (or the inverse, keeping 9x alive), so perhaps someone can export their registry settings for a DVI-only setup on the suitable cards/drivers, and you can bodge things into place?

Good luck

Reply 2 of 4, by Pino


Thanks Progman

I had some time this weekend to play with it, and I was able to get the FX 5700 Ultra working with a different ForceWare version. Strangely, as soon as I finished the driver installation it launched the TV connection wizard, as if I had both a TV and a monitor connected to the video card, which I don't. Not a big deal; I just went through the wizard, configured a single monitor at the end, and everything is fine. I can get up to 1920x1080 at 32-bit color.

The ATIs are a different story. I plugged a VGA monitor in alongside the HDMI monitor I want to use, and sure enough the ATIs treat the VGA output as primary. For some reason it doesn't like my DVI-to-HDMI adapter and keeps connecting and disconnecting the monitor on the digital port. I will have to try a different adapter/cable combination.

Reply 4 of 4, by Pino


Got a new DVI-to-HDMI cable to replace the DVI-HDMI adapter + HDMI cable combo, and all my problems are gone.

Not sure why the ATI video cards were more sensitive to the first combo I used than NVIDIA's; I always thought DVI-to-HDMI adapters were just passive adapters.

Anyway, lesson learned: not all cables are the same.