VOGONS


First post, by RetroMaster137

Rank: Newbie

Hoping I posted this in the right section and that it isn't a dumb question. I AM dumb when it comes to hardware.

I don't know how to check VBE support other than actually installing MS-DOS and giving some program a try. My time is quite limited, though, and I don't think I'll be able to test for another month or so, so I'd rather ask first.

Dedicated GPU is an AMD RX 580, integrated is a Radeon HD 8570D, both on an ASUS A55BM-K motherboard.

Would these support VBE, or are they just too new? Is there any other info I could provide?

Thanks for the time.

Reply 1 of 7, by Starej_Mraf

Rank: Newbie

Download this viewer: http://files.mpoli.fi/software/DOS/GRAPHICS/QPV17E.ZIP . Install it, or extract VESATEST.EXE from the INSTALL.DAT file. It will test VBE ...

One of Founding Fathers of the OldComp.cz

Reply 2 of 7, by Gmlb256

Rank: l33t

Both AMD GPUs are too new to get adequate VESA support. Nevertheless, Scitech Display Doctor for DOS comes with the VBETest utility which is capable of reporting the VBE version and performing tests.
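For context on what such a tool does under the hood: a VBE tester calls real-mode INT 10h with AX=4F00h, which fills a 512-byte VbeInfoBlock. The interrupt call itself can't be shown portably here, but parsing the returned block is simple. A minimal sketch in C, with the field offsets taken from the VBE spec (the helper name is my own):

```c
#include <stdint.h>
#include <string.h>

/* Parse the VBE version out of a raw 512-byte info block as filled in
   by INT 10h, AX=4F00h. The signature lives at offset 0, the version
   word (BCD, little-endian) at offset 4: 0x0200 means VBE 2.0.
   Returns 0 on success, -1 if the "VESA" signature is missing. */
static int parse_vbe_version(const uint8_t *block, int *major, int *minor)
{
    if (memcmp(block, "VESA", 4) != 0)
        return -1;                              /* no VBE at all */
    unsigned ver = block[4] | ((unsigned)block[5] << 8);
    *major = ver >> 8;
    *minor = ver & 0xFF;
    return 0;
}
```

VBETest and similar utilities essentially do this and then walk the mode list the block points to.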

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce3 Ti 200 64 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 3 of 7, by Jo22

Rank: l33t++
Gmlb256 wrote on 2023-07-08, 19:54:

Both AMD GPUs are too new to get adequate VESA support. Nevertheless, Scitech Display Doctor for DOS comes with the VBETest utility which is capable of reporting the VBE version and performing tests.

I think the same. Essentially, VBE in the early 21st century was nothing but a fallback.
Its main purpose was getting OSes like Linux to run when no native drivers were available.
VBE 3 makes this clear, I think: it focuses on desktop resolutions and colour depths.

The DOS-era VBE was VBE 1.x and 2.x, with 2.x adding a protected-mode API.
Many VBE 2.x implementations also kept supporting a few of the more popular VBE 1.x modes, despite their legacy status.
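As a side note on how programs ask for the 2.x information: the VBE 2.0 spec has the caller pre-load the info block's signature field with "VBE2" before invoking INT 10h AX=4F00h, and the BIOS overwrites it with "VESA" on return. A tiny sketch of that preparation step (the helper name is my own):

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Before calling INT 10h AX=4F00h, a VBE 2.x-aware program pre-loads
   the signature field with "VBE2" to request the extended (2.0+) info
   block; a VBE 2.x BIOS replaces it with "VESA" on return. */
static void prepare_vbe2_request(uint8_t *block, size_t len)
{
    memset(block, 0, len);        /* clear the whole 512-byte block */
    memcpy(block, "VBE2", 4);     /* request VBE 2.0+ information   */
}
```

A VBE 1.x BIOS simply ignores the pre-loaded signature, so the trick is backward compatible.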

So yeah, while some kind of VBE support in AGP/PCIe-era graphics cards is technically *there*, it's nothing more than a compatibility layer.
Same goes for the VGA core. It's a gimmick, a bonus, a freebie. Not something real work was put into.
It's maybe comparable to the CGA compatibility of VGA cards. It's there, but superficial (no MC6845 registers, unless an extra emulation mode is activated via a mode utility).

Personally, I think that VBE reached its height in ~1996.
That's when VBE/AF came out. While VBE 3 technically introduced useful 3D glasses support in 1998,
the 3D and VR movement of the 90s was already declining by that point.
An S3 Trio 32/64 or ViRGE with VBE 1.2, or a similar card, was supported by LCDBIOS and other 3D glasses libraries.

Anyway, I'm just a layman here. VBE 3 was interesting, but the heyday of VBE was VBE 1.2/2.0.
That's when things still mattered, before Windows 9x took over.

A few words about UniVBE... Scitech was a great company (their OS/2 Warp and Win 3.1x drivers were fascinating!), but UniVBE wasn't the final solution.
UniVBE isn't as good as native VBE support or the VBE TSRs provided by the graphics chip vendors.
If you can, please consider trying the vendors' VBE TSRs first.
They're old (VBE 1.x), but usually work properly. They don't crash applications as much as UniVBE does.
UniVBE is a last resort if a game needs VBE 2.x support, though.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 4 of 7, by zyzzle

Rank: Member
Jo22 wrote on 2023-07-09, 00:06:

So yeah, while some kind of VBE support in AGP/PCIe era graphics cards is technically *there*, it's nothing more than a compatibility layer.
Same goes for the VGA core. It's a gimmick, a bonus, a freebie. Not something that had been really put work in.
It maybe relates to things like the CGA compatibility of VGA cards. It's there, but superficial (no MC6845 registers, unless a extra emulation mode is activated via mode utility).

This "compatibility layer" is badly broken and/or missing entirely from "modern" video card BIOSes/firmware. Theoretically, it shouldn't be very complicated to add it back via a TSR on those broken modern cards (e.g. all modern Core i3/i5/i7 vBIOSes and all AMD Ryzen onboard graphics). This presupposes the system can even boot bare-metal DOS in the first place, but such a TSR could "magically" fix the broken and missing VGA and VBE compatibility, at least for CPUs from the 2nd/3rd-gen through the 10th-gen Core i series, provided the system BIOS has a CSM module or non-UEFI compatibility layer. Many such TSRs were written back in the day for all the chipsets popular then, but none has ever appeared to "fix" the myriad problems of the new onboard video ROMs.

I keep wondering how to do this, but very little information is available on the internals of the Intel vBIOS code, or on how to "remap" this broken code into generic VBE-compatible homebrew code, much like UniVBE and VESAFIX did back in the day for 3dfx cards, for example. The VESA high resolutions would need to be updated in a table and mapped, and one could use the card's generic refresh rate, probably 60 Hz, as a fallback to avoid worrying about custom timings.
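The remapping table described above could start as nothing more than a lookup from the VESA-defined mode numbers to resolutions, which such a hypothetical TSR would then translate to whatever the native firmware exposes. A sketch with only a handful of the classic modes filled in (mode numbers per the VBE spec; the structure and function names are my own):

```c
#include <stdint.h>
#include <stddef.h>

struct vbe_mode { uint16_t number; int width, height, bpp; };

/* A few of the classic VESA-defined mode numbers such a TSR would
   need to map onto the native firmware's mode set. */
static const struct vbe_mode vbe_modes[] = {
    { 0x101,  640, 480,  8 },   /* 640x480, 256 colours  */
    { 0x103,  800, 600,  8 },   /* 800x600, 256 colours  */
    { 0x105, 1024, 768,  8 },   /* 1024x768, 256 colours */
    { 0x111,  640, 480, 16 },   /* 640x480, 64K colours  */
    { 0x112,  640, 480, 24 },   /* 640x480, 16.8M colours */
};

/* Look up a VBE mode number; returns NULL if the table lacks it. */
static const struct vbe_mode *find_vbe_mode(uint16_t number)
{
    for (size_t i = 0; i < sizeof vbe_modes / sizeof vbe_modes[0]; i++)
        if (vbe_modes[i].number == number)
            return &vbe_modes[i];
    return NULL;
}
```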

Reply 5 of 7, by Azarien

Rank: Oldbie
zyzzle wrote on 2023-07-09, 20:51:

This "compatibility layer" is badly broken and / or missing entirely from "modern" video card bioses / firmware. Theoretically, it should be not very complicated to add it back in via TSR, if necessary on those modern broken cards. (eg, all modern core i3/i5/i7 vbioses and all AMD Ryzen onboard graphics).

What was the "last good" generation of Intel CPUs?

Reply 7 of 7, by LSS10999

Rank: Oldbie
RetroMaster137 wrote on 2023-07-08, 15:54:

Hoping I posted this in the right section and that it isn't a dumb question. I AM dumb when it comes to hardware.

I don't know how to check VBE support other than actually installing MS-DOS and giving some program a try. My time is quite limited, though, and I don't think I'll be able to test for another month or so, so I'd rather ask first.

Dedicated GPU is an AMD RX 580, integrated is a Radeon HD 8570D, both on an ASUS A55BM-K motherboard.

Would these support VBE, or are they just too new? Is there any other info I could provide?

Thanks for the time.

The impact of VBE breakage is program/game specific, as the methods games use to output graphics vary greatly. There are some tools to diagnose VBE functions in a generic manner, but only by actually running the game on the target system will you know whether it really works for you or not.

Although VBE should still work on your RX 580 (Ellesmere), it is no longer perfect and you may encounter glitches in some games. You might get better overall quality through your integrated graphics than your RX 580, but I cannot be sure. According to AMD, the cutline was around Southern Islands.

On the other hand, DisplayPort 1.2 support on video cards* appeared to coincide with the VBE cutline for both nVidia and AMD -- that would be Kepler and Southern Islands respectively. nVidia is likely similar to AMD in this matter -- the later the generation, the worse the VBE functionality.

* There's an exception on the AMD side -- a few Northern Islands workstation cards, the FirePro V5900 and V7900, had early DP 1.2 support, even though the rest of the Northern Islands cards are only DP 1.1 compliant.

leileilol wrote on 2023-08-02, 23:16:

One could say Haswell because that was just before Spectre/Heartbleed mitigations and officially supports Windows 7.

From what I've tested, LGA1150 (Haswell/Broadwell) does have DP 1.2 to a limited extent, though I didn't test VBE functionality as the machine wasn't being used as a DOS system... I couldn't find any reliable info regarding the VBE functionality cutline for Intel GPUs, so I'm not really sure.