First post, by wacha
Hi guys.
Like many of you on this forum, I have a collection of vintage PC hardware that I like to use and tinker with incessantly. Last year I bought an AOC AGON 1440p 144Hz monitor for my main rig, driven by my two 1080 Tis in SLI. I made sure that it supported not only the latest standards but also VGA, so that I could connect my vintage DOS/Win98/WinXP era hardware natively, without adapters or dongles. The monitor has 2 DP, 2 HDMI and a VGA port. So far so good on the VGA side, as long as I stay at or below 1920x1080, the maximum resolution the monitor supports in that mode.
Before I describe my issue, I should say that my target is 1440p 60Hz on any GPU capable of it. The problem is that I have a bunch of cool cards from the 2005-2012 era that only have DVI ports. Some of the newer ones have an HDMI or mini-HDMI port, but it gets disabled in Quad SLI or 4-Way CrossFire modes. To solve the problem I bought one of those passive DVI to HDMI cables, and at first glance it seemed to work, but it doesn't: it tops out at 1080p 60Hz, and will only do 1440p 30Hz after forcing a custom resolution. By default it shows 1080p 60Hz.
I think I've pinpointed the problem: the Single Link DVI 165 MHz pixel clock limit. I tested this by forcing custom resolutions with Custom Resolution Utility 1.4.2. Anything beyond a 165 MHz pixel clock makes the image blurry and text unreadable. The connector is Dual Link, but I've since learned that half of the pins are dummies: HDMI is always single link, so those extra pins aren't connected to anything. The absolute maximum I've been able to achieve is 1440p 41Hz using the LCD-Reduced option. So my question is: can a DVI-D Dual Link output be converted to HDMI or DisplayPort? I've spent countless hours searching and reading misleading advertising, and have only been able to find this: http://www.thruput.co.uk/home/product/videopr … g/GFVTCTLA.html
Sadly, I can't find it in stock anywhere. Do any of you have this same setup? How did you resolve this issue?
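To sanity-check the 165 MHz theory, here's the back-of-the-envelope math I used (a rough Python sketch; the horizontal/vertical totals below are approximate CVT-RB and CEA-861 timings, so CRU's exact figures may differ slightly):

```python
# Rough pixel-clock check against single-link DVI's 165 MHz limit.
# Timings are approximate CVT-RB / CEA-861 totals, not necessarily
# identical to what CRU generates.

SINGLE_LINK_MAX_MHZ = 165.0

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame x frames per second."""
    return h_total * v_total * refresh_hz / 1e6

modes = {
    # name: (h_total, v_total, refresh)
    "1080p 60Hz (CEA-861)":        (2200, 1125, 60),
    "1440p 60Hz (CVT-RB)":         (2720, 1481, 60),
    "1440p 41Hz (reduced blank)":  (2720, 1481, 41),
}

for name, (ht, vt, hz) in modes.items():
    clk = pixel_clock_mhz(ht, vt, hz)
    verdict = "fits" if clk <= SINGLE_LINK_MAX_MHZ else "exceeds single link"
    print(f"{name}: {clk:.1f} MHz ({verdict})")
```

1080p 60Hz lands at 148.5 MHz (fits), 1440p 60Hz needs roughly 241.7 MHz (way over), and 1440p 41Hz works out to about 165.2 MHz, which matches the 41Hz ceiling I hit almost exactly.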
Now to the thing that completely baffles me: there is one case where the cable works at 1440p 60Hz. This is only possible with one of my GTX 295's. Installing the drivers autodetects the native resolution of the monitor and displays it automatically. No custom resolutions, no tinkering, it just works. I have 3 GTX 295's, and the other two will display nothing at this resolution. Mind you, it is the default resolution, so simply installing the drivers on either of those two cards nets you a black screen and a "no signal" message shortly afterwards. To use either of those cards with the adapter I have to install the drivers on the one that works, downgrade to 1080p, shut down, and finally connect one of the other cards.
There is one other case where I got this to work, though, by forcing a custom resolution with the CVT - Reduced Blank option selected: my two GTX 590's. Nothing works with any of my other cards. Here's a list of the behaviour of the different cards when attempting 1440p 60Hz:
-Gigabyte GTX 295 Single PCB: works out of the box, default resolution
-ZOTAC GTX 295 Single PCB: default resolution when installing drivers, but only shows a black screen; I have to downgrade to 1080p using the Gigabyte one
-Gainward 295 Dual PCB: same as the ZOTAC
-GTX 590's: defaults to 1080p, black screen when forcing 1440p, success with the CVT - Reduced Blank option selected
-HD 7990, 7950 GX2, HD 3870 X2, X1950 XTX, X850 Platinum Edition: defaults to 1080p, blurry image if pixel clock greater than 165 MHz, no success
That's all I've tested for now, which seems like enough. This situation leads me to believe that the cable itself might work, but I sincerely have no clue why it only does so with a single model of GTX 295. I've tried numerous driver versions, and it's always the same behaviour.
Thank you for your time. I hope we can resolve this and hopefully learn something in the process.