First post, by cde
TLDR: I patched the VGA BIOS ROM of my 9600 Pro to output 1280x1024@70.08 Hz instead of the VESA default of 75 Hz.
So basically I was looking for an LCD monitor that would let me play DOS games with the best fidelity, with overall disappointing results. Some monitors do not support 720x400@70Hz at all; some do, but will output it stretched to the full 1920x1080 even when the "preserve aspect ratio" option is enabled. Since the video card stretches VGA 320x200 into 720x400, visual distortion (doubled pixels) appears on 1280x1024 LCDs. However, the DVI output of the ATi cards I tested is much better, since 320x200 is stretched internally to 1280x1024 which is then output directly. While nVIDIA applies a lot of blur, ATi cards are almost pixel-perfect (see the Re: Widescreen monitors and 4:3 aspect ratio compatibility thread). DVI is an attractive option, especially for 4K screens that lack VGA input.
Unfortunately the VESA standard only requires 1280x1024@75 Hz among the preconfigured modes, so my 9600 Pro, when connected to my 4K ASUS VP28U screen, outputs 1280x1024@75 Hz. This is not a huge problem for many games, but some (such as Lotus III) tie their internal logic to the screen refresh rate, so at 75 Hz the music in Lotus III plays slightly too fast.
Now the VP28UGQ does support 1280x1024@70 Hz; this can be tested under Linux as follows:
xrandr --newmode "1280x1024@70" 126 1280 1328 1440 1688 1024 1025 1028 1066 +hsync +vsync
xrandr --addmode DVI-0 "1280x1024@70"
xrandr --output DVI-0 --mode "1280x1024@70"
More precisely, the exact pixel clock required for 70.08 Hz is 126.10 MHz as per http://www.epanorama.net/faq/vga2rgb/calc.html . But how can one force this resolution in DOS? One possibility would be to hack the monitor's EDID, disabling bit 0 of EDID byte 36 (see https://en.wikipedia.org/wiki/Extended_Displa … tification_Data) and adding a custom detailed timing descriptor. That's pretty difficult though as it might require disassembling the display and patching the EEPROM. Another possibility is to plug a fake DVI cable with a specific I2C EEPROM with my custom EDID, wait for the PC to boot, and then plug the real display. In any case there's no guarantee that the card's BIOS ROM would actually use this custom detailed timing descriptor.
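As a sanity check on that 126.10 MHz figure: the vertical refresh rate is just the pixel clock divided by the total raster size including blanking, i.e. the last horizontal and vertical values of the xrandr modeline above. A quick sketch:

```python
# Refresh rate = pixel clock / (horizontal total * vertical total).
# Totals taken from the xrandr modeline above ("... 1688 ... 1066").
pixel_clock_hz = 126.10e6  # exact clock for 70.08 Hz per the calculator linked above
h_total = 1688             # full scanline width, active + blanking
v_total = 1066             # full frame height, active + blanking

refresh_hz = pixel_clock_hz / (h_total * v_total)
print(f"{refresh_hz:.2f} Hz")  # -> 70.08 Hz
```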
Another idea was to use UniRefresh or VBEHz; however, this card only supports VBE 2.0, so that's out of the window. Yet another idea was, after DOS has booted, to issue the proper CRTC/PLL register writes to alter the pixel clock. While I believe this could work, after spending some time with the radeon drm driver in the Linux kernel, I found the CRTC and PLL code pretty complex and difficult to reimplement, so that would have required significant work.
Finally, I thought about patching the VGA BIOS ROM in order to alter the timings, and this approach eventually proved successful. After quite a bit of reverse-engineering, I was able to pinpoint the function responsible for parsing the EDID, and located after it a table of modelines. The first entry in this table is the mode that the ROM prefers (1280x1024@75 Hz) and contains in particular the pixel clock in the first two bytes:
BC 34 00 98 51 00 2A 40 10 90 13 00 00 00 00 00 00 1E
Above, the first two bytes are the pixel clock stored little-endian in units of 10 kHz: 0x34BC corresponds to 13500, i.e. the 135.00 MHz pixel clock of 1280x1024@75 Hz. So I changed it to 0x3142 (12610), i.e. a 126.10 MHz pixel clock, which gives a 70.08 Hz vertical refresh rate at 1280x1024:
42 31 00 98 51 00 2A 40 10 90 13 00 00 00 00 00 00 1E
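The table entry follows the EDID detailed-timing-descriptor layout, so the patch is just rewriting the first two bytes with the new clock in little-endian 10 kHz units. A minimal sketch of that conversion:

```python
# Sketch: patch the pixel clock in the 18-byte timing entry shown above.
# The clock is stored little-endian in units of 10 kHz,
# so 135.00 MHz -> 13500 -> bytes BC 34.
dtd = bytearray.fromhex("BC 34 00 98 51 00 2A 40 10 90 13 00 00 00 00 00 00 1E")

new_clock_mhz = 126.10
raw = round(new_clock_mhz * 100)      # 12610 == 0x3142
dtd[0:2] = raw.to_bytes(2, "little")  # first two bytes become 42 31

print(dtd.hex(" ").upper())  # -> 42 31 00 98 51 00 2A 40 10 90 13 00 00 00 00 00 00 1E
```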
This invalidates the ROM checksum however, so the patched image has to be fixed up. For this I used RaBiT 2.2.1 (Radeon ATI BIOS Tuner, attached), and I reflashed the card with atiflash 1.6 (also attached; newer versions do not detect this card).
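RaBiT handles the fix-up automatically, but for reference the PCI expansion ROM checksum is just an 8-bit sum: all bytes of the image must add up to 0 mod 256, so a single pad byte can be adjusted to compensate for the patch. A minimal sketch, assuming the whole file is one ROM image and its final byte is the free pad byte (a real flasher would parse the PCI ROM header to find the image length):

```python
def fix_rom_checksum(rom: bytes) -> bytearray:
    """Adjust the last byte so the 8-bit sum of the image is zero mod 256.

    Simplifying assumption: the final byte is a free checksum/pad byte.
    """
    fixed = bytearray(rom)
    partial = sum(fixed[:-1]) & 0xFF   # sum of everything except the pad byte
    fixed[-1] = (-partial) & 0xFF      # choose the pad so the total is 0 mod 256
    return fixed

# Toy example: a 5-byte "image" starting with the 55 AA ROM signature.
rom = fix_rom_checksum(bytes([0x55, 0xAA, 0x42, 0x31, 0x00]))
assert sum(rom) & 0xFF == 0
```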
The result is as expected: the card now outputs 70.08 Hz in DOS (as reported by the monitor itself) and Lotus III has perfect music! 😎
EDIT: the monitor has two HDMI inputs, and it appears only HDMI-2 supports 1280x1024@70 Hz; HDMI-1 reports "out of range".