VOGONS


First post, by stamasd

User metadata
Rank l33t

This came about in a discussion in sound/SBEMU and since it was off-topic there I am moving this here. Re: SBEMU: Sound Blaster emulation on AC97

Basically, Intel CPUs with iGPUs past HD 5xx/6xx have broken VGA compatibility, making some older games impossible to run. In my experience, Dune and Dune 2 do not run; same for Duke3D.

I have looked for solutions, but the older UniVBE binaries that I found on the Internet Archive only work for a specific range of hardware from the late 90s/early 2000s. They do nothing with hardware that was released 15+ years later.

Are there any recent projects aimed at patching buggy VBE implementations? I couldn't find any.

Also, how widespread is the problem? What other hardware is affected by it, besides Intel's iGPUs from generation 6 and newer?

I/O, I/O,
It's off to disk I go,
With a bit and a byte
And a read and a write,
I/O, I/O

Reply 1 of 25, by Start me up

User metadata
Rank Newbie

Are there any recent projects aimed at patching buggy VBE implementations?

Well, kind of. There is a Windows 2000 fan project which has a universal graphics driver. The driver has VGA support implemented and currently works on Intel's generation 7 low cost graphics (Gen7LC, Bay Trail). There are many similarities between the graphics generations, so once Gen7LC is implemented, other generations like 4, 6 or 12 are easy to add. The driver does not use the VESA BIOS Extension (VBE) at all, so it does not inherit bugs from the BIOS.

When a DOS game runs in Windows, its graphics calls eventually end up in the virtual DOS machine (VDM) routines of the graphics driver. The VDM routines are basically a VGA emulator, so it is possible to implement VGA support even if the graphics circuit does not support VGA at all. But most drivers have no VDM routines implemented, so Windows uses its own vga.sys for the VGA emulation.

vga.sys does a few checks but mostly just passes the accesses to the VGA ports through to the graphics circuit. So if Gen6 has VGA problems, then you will most likely experience them in the DOS game as well.

I am not sure what would happen if the game tried to access VBE directly, because the VDM routines only trap ports, not interrupts. VBE is accessed via interrupts, while VGA is accessed via ports.
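To make the port/interrupt distinction concrete, here is a toy model in C with made-up names: a VDM traps I/O *ports*, so writes to the VGA register range (3C0h-3DFh) land in the driver's emulator, while a VBE call is a software *interrupt* (int 10h) and never passes through a port trap at all.

```c
#include <stdint.h>

static int trapped_writes;

static int is_vga_port(uint16_t port)
{
    return port >= 0x3C0 && port <= 0x3DF;
}

/* What the VDM conceptually does for every OUT instruction the game
 * executes; non-VGA ports would be passed through or ignored. */
static void vdm_out(uint16_t port, uint8_t value)
{
    (void)value;
    if (is_vga_port(port))
        trapped_writes++;    /* emulate the VGA register write */
}
```

An int 10h call issued by the game simply never reaches `vdm_out`, which is why VBE bugs slip past a port-only trap.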

Intel CPUs with iGPUs past HD 5xx/6xx have broken VGA compatibility ... What other hardware is affected by it, besides Intel's iGPUs from generation 6 and newer?

I am not aware of any significant VGA problems in Gen6. Could you describe what you have noticed, please? This could be rather helpful for me. Thank you.

Reply 2 of 25, by LSS10999

User metadata
Rank Oldbie
Start me up wrote on 2023-04-05, 11:29:
Well, kind of. There is a Windows 2000 fan project which has a universal graphics driver. The driver has VGA support implemented […]

It's not just Intel iGPUs. All modern GPUs are breaking VGA/VBE functionality bit by bit; the later the generation, the worse the VGA BIOS.

I've been thinking about the same thing lately, as the VGA BIOSes on modern video cards are close to totally unusable, and in some cases turning on CSM can lead to a totally black screen until the OS loads.

Can you tell me more details about the fan project in question? I wonder if it's really possible to somehow patch/replace the broken VGA/VBE functionalities universally...

Reply 3 of 25, by Start me up

User metadata
Rank Newbie

It's not just Intel iGPUs. All modern GPUs are breaking VGA/VBE functionality bit by bit; the later the generation, the worse the VGA BIOS.

Well, first of all, please correct me if I am wrong, but to my knowledge the VBE is not part of the graphics card but part of the BIOS. The BIOS has a graphics driver which provides very basic support for a few graphics standards. This means that if you use a graphics circuit from a lesser-known manufacturer, VBE won't let you do much beyond what can be done with plain VGA.

The support is limited not only to those few graphics standards but usually also to mode setting only. So if a piece of software uses the BIOS Extension defined by VESA, there is usually no acceleration available: not for 2D drawing like area filling and line drawing, not for 3D, and not for video playback. With VBE 3.0, 2D acceleration was introduced in a limited form, but not every BIOS implemented it.

The BIOS also has a standardized interface to access the functions of its graphics driver; this interface is called VBE. So it's not a question of whether the graphics circuit supports VBE (it doesn't care who sends the instructions), but rather of whether the BIOS supports VBE and the graphics card, and how good the implementation of the BIOS extension is.
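For reference, that standardized interface is reached through int 10h function 4F00h, which fills a caller-supplied 512-byte buffer at ES:DI. Below is a sketch of that buffer's layout as given in the VESA VBE 2.0/3.0 specification; the C transcription (packed struct, field names) is mine, so treat it as illustrative rather than canonical.

```c
#include <stdint.h>
#include <stddef.h>

#pragma pack(push, 1)          /* the spec defines a byte-exact layout */
typedef struct {
    char     VbeSignature[4];   /* "VESA" on return; caller may put
                                   "VBE2" here to request 2.0+ info    */
    uint16_t VbeVersion;        /* BCD, e.g. 0x0300 = VBE 3.0          */
    uint32_t OemStringPtr;      /* real-mode far pointer (seg:off)     */
    uint32_t Capabilities;
    uint32_t VideoModePtr;      /* -> mode number list, 0FFFFh-terminated */
    uint16_t TotalMemory;       /* in 64 KB blocks                     */
    uint16_t OemSoftwareRev;    /* fields below exist from VBE 2.0 on  */
    uint32_t OemVendorNamePtr;
    uint32_t OemProductNamePtr;
    uint32_t OemProductRevPtr;
    uint8_t  Reserved[222];
    uint8_t  OemData[256];      /* scratch area for OEM strings        */
} VbeInfoBlock;
#pragma pack(pop)
```

The mode list behind `VideoModePtr` is exactly the table that keeps shrinking on modern video BIOSes.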

Another thing is that on modern hardware there is often no BIOS any more. The BIOS was replaced by some motherboard manufacturers with EFI around the year 2010, and not long afterwards EFI was replaced by UEFI. So, yes, support for VBE is dropping, because UEFI has a different graphics extension. Those new graphics extensions have a decreasing lifespan until they get replaced by something new.

... in some cases, turning on CSM could lead to totally black screen

When you referenced CSM, did you mean cascaded shadow mapping? If that technique isn't implemented in software, then you probably can't use it with VBE: first it would need to be part of the VBE definitions (my knowledge of the acceleration VBE 3.0 offers is limited), and second the BIOS programmer would need to implement it.

I wonder if it's really possible to somehow patch/replace the broken VGA/VBE functionalities universally...

Well, if you want to hook into VBE to fix it, then yes, it is possible. To call a VBE function, software triggers a software interrupt. The jump address (interrupt vector) can be read and overwritten, so a program doing the hook can read the jump address and then overwrite it; the next time the interrupt is triggered, a function of the hooking software is called instead. That software can then decide whether to satisfy the request itself or pass it on to the original VBE. You would effectively be creating a VBE emulator, similar to the VGA emulator some graphics drivers provide with their VDM routines. But you would need a universal graphics driver as a backbone to execute the requested functions, and that's where the driver from the fan project might come in in the future.
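The hook described above can be sketched in plain C so the control flow is visible. A real DOS TSR would read and overwrite the int 10h entry in the real-mode interrupt vector table (and work with actual CPU registers); all names here are made up for illustration.

```c
#include <stdint.h>

typedef struct {
    uint16_t ax;                 /* AH = function number, AL = argument */
} regs_t;

typedef void (*int_handler_t)(regs_t *);

static int_handler_t int10_vector;  /* stands in for the IVT entry      */
static int_handler_t old_int10;     /* saved original vector (the BIOS) */
static int bios_calls, emu_calls;

static void bios_int10(regs_t *r)   /* stands in for the video BIOS     */
{
    (void)r;
    bios_calls++;
}

static void hooked_int10(regs_t *r)
{
    if ((r->ax >> 8) == 0x4F) {     /* AH=4Fh: a VBE call -> emulate it */
        emu_calls++;
        r->ax = 0x004F;             /* VBE convention: AX=004Fh = success */
    } else {
        old_int10(r);               /* anything else: chain to the BIOS */
    }
}

static void install_hook(void)
{
    old_int10 = int10_vector;       /* read the old jump address...     */
    int10_vector = hooked_int10;    /* ...and overwrite it with ours    */
}
```

Plain VGA port I/O still bypasses this entirely, which is why the interrupt hook only covers the VBE half of the problem.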

Can you tell me more details about the fan project in question?

The graphics driver of this project supports multiple graphics standards (at the moment just VGA, but GenX is next in line). With this approach it can handle a lot of graphics circuits, because many support at least VGA plus one additional manufacturer-specific standard like Intel's GenX. The project releases a video presentation about its progress from time to time.

Presentation 001 was about how to get Windows 2000 up and running on modern devices.
Presentation 002 was about version 1 of the universal graphics driver.

The current version of the driver is version 12, which can already detect Gen7LC-capable devices (but only detect support for the graphics standard; it can't use it yet).

If you like, you can have a look at my Odysee channel, where I mirror low-quality versions of the presentations. But the driver is not a ready-to-use solution for gamers; development started not long ago, and features and support arrive bit by bit over time. Since the original poster asked about a recent project in this area, it is the closest I could think of.

https://odysee.com/@startmeup:2?view=playlists

Reply 4 of 25, by stamasd

User metadata
Rank l33t
Start me up wrote on 2023-04-05, 11:29:

I am not aware of any significant VGA problems in Gen6. Could you describe what you have noticed, please? This could be rather helpful for me. Thank you.

Black screen with a blinking cursor when attempting to run Dune, Dune 2 and Duke3D on a Dell OptiPlex 3050M with an i5-6600T CPU, using its iGPU (HD 530), under MS-DOS 6.22.

Last edited by stamasd on 2023-04-05, 19:46. Edited 1 time in total.


Reply 5 of 25, by stamasd

User metadata
Rank l33t
Start me up wrote on 2023-04-05, 17:43:

When you referenced CSM, did you refer to cascaded shadow mapping?

I think he may have been referring to the Compatibility Software Module of UEFI firmware.


Reply 6 of 25, by Start me up

User metadata
Rank Newbie
stamasd wrote on 2023-04-05, 19:42:
Start me up wrote on 2023-04-05, 11:29:

I am not aware of any significant VGA problems in Gen6. Could you describe what you have noticed, please? This could be rather helpful for me. Thank you.

Black screen with a blinking cursor when attempting to run Dune, Dune 2 and Duke3D on a Dell OptiPlex 3050M with an i5-6600T CPU, using its iGPU (HD 530), under MS-DOS 6.22.

Like we would see if the graphics circuit never switched from text mode to graphics mode? If this is the case, then Dune has noticed that the mode switch failed; otherwise Dune would start writing into video memory and cause gibberish text to appear on the display. But if Dune were using plain VGA directly, it would most likely not notice when the mode switch fails.

stamasd wrote on 2023-04-05, 19:45:
Start me up wrote on 2023-04-05, 17:43:

When you referenced CSM, did you refer to cascaded shadow mapping?

I think he may have been referring to the Compatibility Software Module of UEFI firmware.

Ok thanks. Yes, this makes more sense.

Reply 7 of 25, by stamasd

User metadata
Rank l33t
Start me up wrote on 2023-04-05, 20:38:

Like we would see if the graphics circuit never switched from text mode to graphics mode? If this is the case, then Dune has noticed that the mode switch failed; otherwise Dune would start writing into video memory and cause gibberish text to appear on the display. But if Dune were using plain VGA directly, it would most likely not notice when the mode switch fails.

That happens too with other games, like Frontier: First Encounters. I mean the gibberish writing to VGA memory.


Reply 8 of 25, by Start me up

User metadata
Rank Newbie

Well, if Dune gets stuck with a blinking cursor and Frontier writes gibberish then Frontier might be having a different problem.

About the Dune/Dune2/Duke Nukem problem: It sounds as if the loading process gets stuck in a very early stage. Why do you think that this problem is related to the graphics circuit?

Reply 9 of 25, by stamasd

User metadata
Rank l33t

They all work without issues on the previous generation iGPUs, HD4xxx on Ivy Bridge chips.


Reply 10 of 25, by Start me up

User metadata
Rank Newbie

Your i5-6600T, a 6th generation Core processor, has a 9th generation graphics circuit, alias Skylake graphics (HD 530).

Your other processor has a 7th generation graphics circuit alias Ivy Bridge (HD4000). It is very similar to the Gen7LC in my Intel Atom and also quite similar to Gen7.5.

I guess that you didn't just swap the processor but more or less the whole system when you tested the game on both systems. Is this right?

I would suggest to do some debugging to narrow down the problem. From what I read I doubt that this problem is caused by the graphics circuit.

Btw: Skylake still has VGA support.

Reply 11 of 25, by stamasd

User metadata
Rank l33t

Of course, because the chipsets in the two systems aren't compatible with the other's CPU. I was talking about the CPU generation, not the GPU generation, as in "the xth CPU generation and its associated iGPU".
How do you suggest I debug the problem? I have tried the VBE test program from UniVBE, and on the 6600T it crashes with garbage on the screen after the first 1-2 tests, when it tries to switch to modes higher than 320x200. It does not crash on the Ivy Bridge.

(Also, the same games fail with the same symptoms, and the VBE test program crashes in the same way, on a different system with an i5-7600T CPU and its HD 630 iGPU.)


Reply 12 of 25, by Start me up

User metadata
Rank Newbie

I see. Now I understand why you think the problem is VBE related. Yes, that sounds reasonable. The only way I know to fix it would be the VBE emulator approach I mentioned before, but I doubt that it exists for DOS.

Sorry, but I can't help you with the debugging. I have no experience with debugging in DOS. Also the newest graphics circuit I have supports nothing newer than Gen8.

Btw: the i5-7600T is Gen9.5 (Kaby Lake).

Have you tried running the game in a DOS emulator under Windows with an original driver from Intel?

Reply 13 of 25, by LSS10999

User metadata
Rank Oldbie

DOS emulators like DOSBox use their own emulation and usually render with modern APIs, so they don't really have anything to do with the VGA BIOS.

NTVDM uses Windows' default video driver to access the VGA BIOS, but it has its own limitations and there were patches to extend it.

I'm not an expert on the VGA BIOS and VBE... so I'm not sure how UniVBE implemented the VBE 2.0/3.0 functionality, even on older video cards that predated the newer standards. Can these functions be implemented entirely in software (thus truly independent of the hardware internals of different vendors)? Or does it ultimately require some interaction with the hardware internals that has to be coded explicitly for a specific hardware/vendor (so not truly "universal")?

PS (partially off-topic): As for breakage of the VGA BIOS... This thread is mainly about the nVidia ones, where it became apparent starting with Kepler. From what I tested with some games, Kepler only broke the function mentioned in that thread's OP, and it still has a good number of legacy video modes listed in the table, with some games still working okay overall. Later generations broke even more, to the point that very few resolutions are available in the table, which can cause certain mode-switching calls to fail (and it's up to the program to properly handle the call failure).

I have a game (Titus the Fox) which exhibits the following behaviors across various generations of nVidia cards:

Fermi and Kepler: works okay.
Maxwell: distorted colors in-game, but still playable.
Turing and Ampere: stays in text mode with some garbled stuff; unplayable. However, the game's logic still works and I can properly exit it via ESC. Most likely the mode-switching calls failed, but the game did not care.

I didn't test Pascal, but I expect it to be the same as Maxwell, if not worse. In general, VGA/VBE functionality only gets worse with newer video hardware, be it discrete or onboard.
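For what it's worth, the "handle the call failure" part is cheap: every VBE function returns a status word in AX, where AL=4Fh means the function is supported at all and AH=00h means the call succeeded (AH=01h is the usual failure code). A tiny sketch of the check a well-behaved program should make after an int 10h VBE call:

```c
#include <stdint.h>

/* Status word conventions from the VESA VBE specification. */
static int vbe_supported(uint16_t ax)
{
    return (ax & 0x00FF) == 0x4F;   /* AL = 4Fh: function exists      */
}

static int vbe_ok(uint16_t ax)
{
    return ax == 0x004F;            /* AH = 00h and AL = 4Fh: success */
}
```

Games like Titus the Fox that press on after a failed 4F02h mode set are exactly the ones that end up drawing into a screen that never left text mode.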

Reply 14 of 25, by Start me up

User metadata
Rank Newbie
LSS10999 wrote on 2023-04-06, 03:06:

DOS emulators like DOSBox use their own emulation and are usually rendered with modern APIs so they don't really have anything to do with the VGA BIOS

That's what I was hoping for.

Reply 15 of 25, by Falcosoft

User metadata
Rank Oldbie
LSS10999 wrote on 2023-04-06, 03:06:
... I'm not an expert of the VGA BIOS and VBE... so I'm not sure how UniVBE implemented the VBE 2.0/3.0 functionalities, even on […]

In those pre-VBE times the situation was rather different. The so-called 'SVGA cards' all implemented VGA+ resolutions in their own way. These new video modes were documented/discovered by trial and error and used the same real-mode int 10h interrupt as the standard video modes. So, e.g., a Trident 8900 could be set to 1024x768 256-color mode by calling:

MOV AX, 62h   ; AH=00h (set video mode), AL=62h (Trident's 1024x768x256 mode)
INT 10h

and Tseng ET4000 could be set to the same mode by calling:

MOV AX, 38h   ; AL=38h is the ET4000's number for the same mode
INT 10h

Since these VGA+ modes required more than 64K of memory, the cards also implemented bank/video-window paging in their own way (usually by writing VGA ports).
If you mapped these two procedures to the VESA standard functions 4F02h (set video mode) and 4F05h (video window paging), and also implemented 4F01h (return mode information), you basically had a VESA VBE 1.x compatible interface.
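The mapping just described can be sketched as a per-chip lookup table that translates a VESA mode number into the OEM int 10h mode number, so a function 4F02h request can be served by the card's own pre-VBE mode-set call. Only the two documented examples are real (Trident 8900 mode 62h and Tseng ET4000 mode 38h, both 1024x768 in 256 colors, i.e. VESA mode 105h); the chip IDs and structure names are illustrative.

```c
#include <stddef.h>
#include <stdint.h>

enum chip { CHIP_TRIDENT_8900, CHIP_TSENG_ET4000 };

typedef struct {
    uint16_t vesa_mode;  /* e.g. 105h = 1024x768, 256 colors          */
    uint8_t  oem_mode;   /* value for AL in "MOV AX, 00xxh / INT 10h" */
} mode_map_t;

static const mode_map_t trident_map[] = { { 0x105, 0x62 } };
static const mode_map_t et4000_map[]  = { { 0x105, 0x38 } };

/* Returns the OEM mode number, or -1 if the chip has no equivalent. */
static int oem_mode_for(enum chip c, uint16_t vesa_mode)
{
    const mode_map_t *map;
    size_t i, n;

    if (c == CHIP_TRIDENT_8900) {
        map = trident_map;
        n = sizeof trident_map / sizeof trident_map[0];
    } else {
        map = et4000_map;
        n = sizeof et4000_map / sizeof et4000_map[0];
    }
    for (i = 0; i < n; i++)
        if (map[i].vesa_mode == vesa_mode)
            return map[i].oem_mode;
    return -1;
}
```

A UniVBE-style driver is essentially many such tables (plus per-chip bank-switching code), which is why it was never fully hardware-independent.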

Today the situation is rather different, since the buggy Intel/nVidia video BIOS implementations have no known working real-mode routines that set these 'SVGA modes' (e.g. through int 10h). Only the buggy VESA VBE calls are known/documented.

Website, Facebook, Youtube
Falcosoft Soundfont Midi Player + Munt VSTi + BassMidi VSTi
VST Midi Driver Midi Mapper

Reply 16 of 25, by vanfanel

User metadata
Rank Newbie

Hi,

I am experimenting with Intel HD 500 (not past HD 500, but the actual HD 500).

I am finding some strong incompatibilities with the HD 500, even though I understood it should be mostly OK for DOS.

Games seem to have a hard time working. For example:

-SuperFrog looks like this:

[Attachment: report5.jpg (153.04 KiB, public domain)]

-Prince of Persia 1.3 works OK with EGA and CGA modes, but with VGA(MCGA) says:

"Graphics mode not available"

C'mon! EGA and CGA but not VGA?

-Duke Nukem 3D works perfectly well in 320x200, but does not work with any of the VESA modes.

So, is the HD 500 also bad for DOS? I thought it was still an OK option...

Reply 17 of 25, by digger

User metadata
Rank Oldbie

Sounds like a new open-source project should be started for a universal VBE 1.x/2.x/3.x driver for modern GPUs, in the same vein as UniVBE and SciTech Display Doctor.

Although that would only fix VESA VBE compatibility for later games that support higher-resolution VBE graphics modes. It wouldn't solve register-level legacy VGA compatibility problems; that would require some additional port-trapping hardware emulation magic. It might still be doable, though, especially now that there is a known way to trap I/O port access for both real-mode and protected-mode games, thanks to the breakthroughs made in the SBEMU project.

Reply 19 of 25, by vanfanel

User metadata
Rank Newbie

Well, VINFO doesn't report any video modes on the HD 500, but there's the "AdvanceCAB" set of utilities, available here:
http://mirrors.arcadecontrols.com/AdvanceMAME … b-download.html

The included utility called VGA.EXE can detect these modes on the HD 500:

[Attachment: report7.jpg (116.5 KiB, public domain)]

However, after loading this "VGA" TSR, Prince of Persia is still unable to find the VGA, and Aladdin still looks like this:

[Attachment: report7.jpg (116.5 KiB, public domain)]

...which leads me to think that the HD 500 is reeeeaaaally flawed for DOS, way beyond VESA mode availability.

Attachments: report8.jpg (118.18 KiB, public domain)