VOGONS


First post, by SirNickity

Rank: Oldbie

I am trying to set up all my retro-to-modern PCs with KVM switches. The older VGA + PS/2 computers all connect to a VGA KVM, then into an OSSC to HDMI. Works great; I just have to create profiles with appropriate timings for the different cards and resolutions.

I also have three P3 to Core 2 PCs connected to a DVI + USB + audio KVM. This feeds a Gofanco audio inserter (HDMI + analog audio in, HDMI out) via a DVI-to-HDMI cable. I figured this would be a piece of cake, but it has not been.

For one, my Radeon 8500 (Win ME, Catalyst 6.2) does not output over DVI until the driver loads (or it's incompatible with my TV for some reason). I have not looked into this one much, since it works OK once Windows is loaded.

The bigger problem is that my Radeon 9800 Pro (XP, Catalyst 10.2) does not seem to cope with switching inputs on the KVM. Once you disconnect the DVI output, you don't get it back. This is obviously a pretty serious showstopper. I tried bypassing the KVM and HDMI box with a DVI-to-HDMI cable and a passive coupler to the HDMI cable going to the TV. No luck. Once you unplug it, the signal is gone, and there is nothing to do but power off and restart.

There's gotta be some of you switching DVI. What are you using? Do you have any trouble with it?

Reply 1 of 2, by swaaye

Rank: l33t++

I've been messing with the 8500 lately, looking at the DVI quirks among other things. The card attempts to scale all output to the native resolution of the monitor. DVI output beyond 1600x1200 is troublesome for its firmware-based DVI functionality: you might see signal distortions or no image at all. It seems quite capable of 1920x1200 once a Windows driver loads, so I suppose the driver works around the problems for the most part. I have a DVI EDID emulator, and using that you can run through a variety of monitor configurations.
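For what it's worth, the "native resolution" the card latches onto comes from the monitor's EDID: the first detailed timing descriptor in the 128-byte base block is the preferred mode. Here is a minimal Python sketch of decoding it (field offsets per the VESA EDID 1.3 spec; the function name and the sample bytes are just mine for illustration):

```python
def parse_native_resolution(edid: bytes):
    """Extract the preferred (native) resolution from a 128-byte EDID block.

    Per VESA EDID 1.3/1.4, the first detailed timing descriptor
    (bytes 54-71) describes the monitor's preferred mode.
    """
    # Every EDID base block starts with this fixed 8-byte header.
    if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID base block")
    d = edid[54:72]  # first detailed timing descriptor
    # Pixel clock is stored little-endian in units of 10 kHz;
    # zero means this slot holds a display descriptor, not a timing.
    pixel_clock_khz = (d[0] | d[1] << 8) * 10
    if pixel_clock_khz == 0:
        raise ValueError("first descriptor is not a timing descriptor")
    # Active pixels: low 8 bits in one byte, high 4 bits in the
    # upper nibble of a shared byte.
    h_active = d[2] | (d[4] & 0xF0) << 4
    v_active = d[5] | (d[7] & 0xF0) << 4
    return h_active, v_active


# Synthetic example: a descriptor advertising 1920x1200.
edid = bytearray(128)
edid[:8] = b"\x00\xff\xff\xff\xff\xff\xff\x00"
edid[54], edid[55] = 0x28, 0x3C   # pixel clock: 15400 * 10 kHz = 154 MHz
edid[56], edid[58] = 0x80, 0x70   # horizontal active: 0x780 = 1920
edid[59], edid[61] = 0xB0, 0x40   # vertical active:   0x4B0 = 1200
print(parse_native_resolution(bytes(edid)))  # (1920, 1200)
```

So an EDID emulator sitting inline effectively pins those bytes to whatever monitor you want the card to believe is attached, which is why it lets you run through configurations without replugging.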

In contrast, a GeForce 3 has no problem with 1920x1200. Most later GeForce cards are OK I think, aside from some of the cheapies like the FX 5200. GeForce 256 and GeForce 2 cards with DVI are limited to about 1280x1024.

Later Radeons seem pretty good with 1920x1200 in general.

I have not tried a DVI switch, unfortunately. I have a monitor with HDMI, DVI, and DP, so lots of options are available.

Reply 2 of 2, by SirNickity

Rank: Oldbie

Interesting -- I had no idea it was scaling its output to the monitor's native resolution. I had an 8500 back in the day, but for a while I was using a ViewSonic VG150b (which did not have a DVI input), so I didn't know whether this was normal for the card or not. I have yet to connect it to more traditional PC LCD monitors; the place is a mess with builds-in-progress at the moment.

I believe my HDMI audio inserter has EDID handling options, but I don't recall to what extent -- I think it may just be to advertise 2-channel or multichannel audio support. I might have to look into an EDID stripper to put inline with that KVM input.

Thanks for the suggestion -- it gives me something to try, at least!