VOGONS


First post, by dada

User metadata
Rank Member
Rank
Member

Hi all. I was wondering if anyone knows if it's possible to set arbitrary resolutions in Windows 3.1—for example, 639x479 or 1366x768 or any other uncommon resolution.

This is something that works fine on e.g. Windows 98 (in VMware Fusion) or Mac OS 9 in SheepShaver, but I don't think I've ever seen anyone demonstrate it on Windows 3.1 in any kind of emulator. Is it maybe something that would require a custom driver for the emulator/VM? It would be quite cool to just be able to size your screen to whatever you want and use it that way, especially since having nice widescreen support would rule.

Reply 1 of 8, by creepingnet

User metadata
Rank Oldbie
Rank
Oldbie

I heard of a guy managing to get widescreen resolutions out of 3.1x by hacking the video drivers for his video card. I wish I could find the link to where I saw that, but it might be gone by now; I read it over 10 years ago.

~The Creeping Network~
My Youtube Channel - https://www.youtube.com/creepingnet
Creepingnet's World - https://creepingnet.neocities.org/
The Creeping Network Repo - https://www.geocities.ws/creepingnet2019/

Reply 2 of 8, by superfury

User metadata
Rank l33t++
Rank
l33t++

On real hardware that is probably difficult, as every clock (and its resulting horizontal pixels) is generated using dividers. For pretty much all MDA and newer cards, those timings have a base divider of either 9 (MDA-compatible) or 8 (CGA-compatible, as well as EGA and (S)VGA) pixel clocks, meaning that everything (horizontally, at least) must be a multiple of either 8 or 9 pixels. Then there are constraints on the receiving hardware (at least for analog, i.e. VGA and up, perhaps digital as well), causing loss of sync (when the connected device can't keep up with the horizontal or vertical rates) or permanent damage (for example, feeding too high a vertical rate).
You could theoretically output a ~3,125,000 Hz vertical sync on a VGA card using a 25 MHz clock (8 pixels x 1 scanline per frame), but that would burn out the display if you tried to use it, assuming it could even keep up (and not lose sync, which it almost certainly would the moment it's activated). Most displays are made with VGA compatibility in mind, so anything using the horizontal and vertical sync rates of 640x480 in 16 colours should work. The same goes for 800x600 on basic SVGA monitors. Higher resolutions depend on the display itself for support.
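
For reference, the numbers are just successive divisions down from the pixel clock. A quick back-of-the-envelope sketch, using the commonly published 640x480@60 totals (800x525 pixels/lines including blanking); the extreme 25 MHz / 8 pixels / 1 scanline case above works the same way and lands at roughly 3.125 MHz "vertical":

    /* Sketch: deriving refresh rates from the pixel clock, using the
       standard VGA 640x480@60 timing as the example (totals taken from
       the usual published figures, not measured). */
    #include <stdio.h>

    int main(void)
    {
        double pixel_clock = 25175000.0; /* 25.175 MHz dot clock          */
        int h_total = 800;               /* 640 visible + blanking + sync */
        int v_total = 525;               /* 480 visible + blanking + sync */

        double h_rate = pixel_clock / h_total; /* ~31.47 kHz */
        double v_rate = h_rate / v_total;      /* ~59.94 Hz  */

        printf("horizontal: %.2f kHz\n", h_rate / 1000.0);
        printf("vertical:   %.2f Hz\n", v_rate);
        return 0;
    }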

Author of the UniPCemu emulator.
UniPCemu Git repository
UniPCemu for Android, Windows, PSP, Vita and Switch on itch.io

Reply 3 of 8, by dada

User metadata
Rank Member
Rank
Member
superfury wrote on 2021-12-16, 22:12:

On real hardware that is probably difficult, as every clock (and its resulting horizontal pixels) is generated using dividers. For pretty much all MDA and newer cards, those timings have a base divider of either 9 (MDA-compatible) or 8 (CGA-compatible, as well as EGA and (S)VGA) pixel clocks, meaning that everything (horizontally, at least) must be a multiple of either 8 or 9 pixels. Then there are constraints on the receiving hardware (at least for analog, i.e. VGA and up, perhaps digital as well), causing loss of sync (when the connected device can't keep up with the horizontal or vertical rates) or permanent damage (for example, feeding too high a vertical rate).
You could theoretically output a ~3,125,000 Hz vertical sync on a VGA card using a 25 MHz clock (8 pixels x 1 scanline per frame), but that would burn out the display if you tried to use it, assuming it could even keep up (and not lose sync, which it almost certainly would the moment it's activated). Most displays are made with VGA compatibility in mind, so anything using the horizontal and vertical sync rates of 640x480 in 16 colours should work. The same goes for 800x600 on basic SVGA monitors. Higher resolutions depend on the display itself for support.

Oh wow, really interesting, thanks for the write-up. I figured this would be tricky for various reasons, considering no one seems to have done it.

And yeah, I'm mainly thinking about using it for emulation/virtualization here, rather than trying to natively output arbitrary resolutions to an actual device (except maybe for more common resolutions that you could theoretically do in a sensible way). Like it would be a very nice addition to any kind of virtualization setup.

I wonder how VMware Fusion does this for Windows 98. I'm not sure if there's a way to tell exactly what kind of sync it's using for a given resolution. Maybe I can install an old version of PowerStrip and find out that way. Either way, I suppose actually writing a driver that can pull something like this off would require some very specialist knowledge.

Reply 4 of 8, by Jo22

User metadata
Rank l33t++
Rank
l33t++

I'm speaking under correction, but maybe VBE can partially solve the issue.
Well, at least in emulation. With a bit of hacking, too.

There are at least two modified Windows 3.1 SVGA drivers available that use VBE instead of chip-specific video modes.
Alas, there's a catch: the SVGA driver uses one hardcoded configuration permanently (1024x768 pels?, 256 colours, palettized).

By using video modes, it's essentially up to the VGA/VBE BIOS to take care of timings and colour depths.
And from what I see, there are unofficial and weird video mode numbers out there.

https://en.wikipedia.org/wiki/VESA_BIOS_Extensions
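
To make that concrete: from the software side, setting a VBE mode is a single BIOS call, and the video BIOS decides the timings for whatever mode number it advertises. A minimal DOS-side sketch, assuming a 16-bit real-mode compiler with a Borland-style int86() in <dos.h>, using mode 105h (the standard 1024x768, 256-colour VBE mode those hacked SVGA drivers rely on):

    /* Ask the video BIOS to set a VBE mode; the BIOS, not the caller,
       takes care of the CRTC timings for it. */
    #include <dos.h>
    #include <stdio.h>

    static int set_vbe_mode(unsigned int mode)
    {
        union REGS r;
        r.x.ax = 0x4F02;  /* VBE function 02h: set video mode           */
        r.x.bx = mode;    /* bit 14 would request a linear framebuffer  */
        int86(0x10, &r, &r);
        return r.x.ax == 0x004F;  /* 004Fh = supported and successful   */
    }

    int main(void)
    {
        if (!set_vbe_mode(0x0105))
            printf("VBE mode 105h not supported here\n");
        return 0;
    }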

Switching resolutions "on the fly" was a feature that officially came with Windows 98, maybe Windows 95 also.
However, I *believe* some drivers by ATI or ELSA provided that ability on Windows 3.1x, too.

Edit: Another idea would be to use emulation of an early video accelerator. Like TIGA, for example.
TIGA was very intelligent and high-level. The driver shipped with Windows 3.1 was merely an interface/helper driver.
The actual work was done by the TIGA software supplied with the TIGA cards.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 5 of 8, by superfury

User metadata
Rank l33t++
Rank
l33t++

I've just thought about it again. Then I remembered something about (S)VGA latches!
The 9-pixel divider can only effectively be used for text modes (the MDA/VGA text modes, to be exact, where the 8th pixel is repeated as a 9th for line-drawing characters and given colour number 0 for all others). It cannot work properly in graphics modes, since only the equivalent of 32 bits is latched and shifted out; a 9th pixel would require another 4 bits, which aren't latched. Shifting the 32-bit register 36 times instead shifts in zeroes, mixing 4 zero bits into the stream after every 32 bits. The result is a black vertical line after every 8 pixels in 16-colour modes, and the same corruption of the horizontal data in 8/16/24/32-bit modes, messing up the colours entirely.

That's why multiples of 8 pixels are the only option.
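
The back-of-the-envelope arithmetic for that, assuming a 16-colour (4-plane) mode:

    /* Why a 9-pixel character clock can't work in planar graphics modes:
       the four 8-bit latches hold exactly 8 pixels' worth of data. */
    #include <stdio.h>

    int main(void)
    {
        int planes       = 4;      /* 16-colour planar mode        */
        int latched_bits = 4 * 8;  /* four 8-bit latches = 32 bits */

        int pixels_per_fetch = latched_bits / planes; /* 8 pixels  */
        int bits_for_9       = 9 * planes;            /* 36 bits   */

        printf("one fetch covers %d pixels (%d bits)\n",
               pixels_per_fetch, latched_bits);
        printf("9 pixels would need %d bits, %d more than are latched\n",
               bits_for_9, bits_for_9 - latched_bits);
        return 0;
    }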

Author of the UniPCemu emulator.
UniPCemu Git repository
UniPCemu for Android, Windows, PSP, Vita and Switch on itch.io

Reply 6 of 8, by Jo22

User metadata
Rank l33t++
Rank
l33t++

Superfury, what you wrote makes sense to me.
Thanks a lot for the explanation, too.
It reminds me of my father's tech voodoo; he wrote a whole CP/M FDD driver for his Z80 PC when he was young.

Another approach: how about not using VGA-specific technology at all?
VirtualBox deprecated its synthetic "VGA" device (VBoxVGA) a while ago, anyway.
That's also why 3D acceleration broke for Win XP guests from v6.1 onwards.
The old VGA device became unsafe, so its pass-through feature was scrapped/disabled.
The newer releases feature a different device, or two strictly speaking (VBoxSVGA, VMSVGA).
https://forums.virtualbox.org/viewtopic.php?t=94795
https://forums.virtualbox.org/viewtopic.php?t=96010
https://superuser.com/questions/1403123/what- … a-in-virtualbox

Windows 3.1 itself isn't dependent on VGA, as long as non-Windows applications aren't used.
Also, Windows 3.0 started to have more flexible drivers, and Windows 3.1 improved upon this (GDI acceleration, the first GUI accelerators).
Windows 2.x drivers, by comparison, were still static and could only be selected during Windows installation.

With a *little bit* of tinkering and the help of the Win 3.x SDK/DDK, a matching Windows 3.1 driver could be compiled in the future.
Maybe even one that uses the communication channels of the VM.
That way it could draw an overlay output directly using the host's OpenGL, not sure.
If not, it could at least use the VM's frame buffer device directly, without any BIOS/VGA-specific mechanisms.
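
As one concrete example of that last idea (not something any stock driver does, just a sketch of the principle): the Bochs/QEMU "VBE dispi" interface lets the guest program the emulated card's resolution through an index/data port pair, with no mode numbers or CRTC timings involved, so sizes like 1366x768 become a matter of writing a few registers. A rough sketch, assuming Open Watcom's outpw() from <conio.h> and an emulator that actually exposes this device:

    /* Program the Bochs/QEMU "dispi" registers directly: disable,
       set X/Y/bpp, then re-enable with the linear framebuffer bit. */
    #include <conio.h>

    #define DISPI_INDEX  0x01CE  /* index port */
    #define DISPI_DATA   0x01CF  /* data port  */
    #define DISPI_XRES   0x01
    #define DISPI_YRES   0x02
    #define DISPI_BPP    0x03
    #define DISPI_ENABLE 0x04

    static void dispi_write(unsigned short idx, unsigned short val)
    {
        outpw(DISPI_INDEX, idx);
        outpw(DISPI_DATA, val);
    }

    void set_emulated_mode(unsigned short w, unsigned short h, unsigned short bpp)
    {
        dispi_write(DISPI_ENABLE, 0x00);  /* disable while reprogramming  */
        dispi_write(DISPI_XRES, w);       /* e.g. 1366                    */
        dispi_write(DISPI_YRES, h);       /* e.g. 768                     */
        dispi_write(DISPI_BPP, bpp);      /* e.g. 8 for 256 colours       */
        dispi_write(DISPI_ENABLE, 0x41);  /* enabled + linear framebuffer */
    }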

I'm speaking under correction, of course. I'm merely a layman, after all.

Edit: From what I remember, Wabi essentially loaded the Windows 3.x Enhanced-Mode kernel atop the Solaris/Linux kernel.
It successfully ran it without any VGA or DOS support.

https://en.wikipedia.org/wiki/Wabi_(software)
https://www.youtube.com/results?search_query=WABI+Caldera
Linux desktops with Wabi

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 7 of 8, by rmay635703

User metadata
Rank Oldbie
Rank
Oldbie

There were custom driver builds for the Matrox Mystique so you could support fixed-frequency screens. I owned one and it worked fine; you could center and resize everything in software, too.

I remember some packages let you set everything custom within the limitations of the card; they even had a utility to move the phase around in case the left- and right-hand edges of the image ended up in the middle.
Playing around with such a card, I was able to display 800x560 on a Tandy VGA monitor, though only at a low refresh rate.

I only really saw this capability under Windows 3.1 on older Matrox cards with custom drivers / a custom control panel. Without a custom BIOS, driving a fixed-frequency screen in plain DOS could be dangerous.