Windows XP runs poorly with 128MB of RAM, the QEMU default when no -m option is given on the command line. That is fine for Win98/ME, but XP needs more. Give the VM more memory, e.g. 1024MB: qemu-system-i386 -m 1024 ...
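A fuller invocation might look like this (the disk image name and CD-ROM option are illustrative, not from the original post):

```shell
# Give the XP guest 1024MB instead of the 128MB default
qemu-system-i386 -m 1024 -hda winxp.img -cdrom winxp_sp3.iso ...
```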
No one told you to include kernel-irqchip=off if you are using KVM on Linux. It only applies to WHPX on Windows 10, as a workaround. APIC emulation has worked on Linux KVM for years, and it is a key advantage of KVM in real-time characteristics and performance over other virtualization stacks such as Windows 10 WHPX.
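In other words, the accelerator options for the two hosts might look like this (a sketch; all other options elided):

```shell
# Windows 10 host: WHPX needs the workaround
qemu-system-i386 -accel whpx,kernel-irqchip=off -m 1024 ...

# Linux host: KVM's in-kernel irqchip/APIC works fine, leave it alone
qemu-system-i386 -accel kvm -m 1024 ...
```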
Just tried a fully updated XP SP3 VM with HVF on macOS, and while the OS runs fine in vanilla QEMU, with QEMU-3dfx + X11 things are freaking slow. Even FIFA 99 couldn't be played; the sound stutters a lot. Well, gotta wait for an M1 next year to see if things can improve.
@kjliew: Have you tested the qemu-3dfx performance loss when using a Linux VM under Parallels v17 on your M1? I'd say it would be better than using XQuartz on macOS. But then, I acknowledge it's kinda ridiculous to use a VM to run an emulator to run another legacy operating system for Glide games...
OK, checklist… the host has the patched QEMU and OpenGlide, the guest has the compiled wrapper files in their right locations… so what's missing? I only have Glide passthrough; what about MESA GL passthrough? How do I set it up so other games like CS 1.6 and GP3 can benefit? *scratches head*
The version of CS 1.6 I'm using is from a Brazilian group called CS Revo. It works just fine on XP under VMware, while in QEMU it crashes right when the main menu background appears.
I was also having problems setting up GP3 in XP (QEMU); no idea why I get a black screen (I can hear the intro sound playing). It works just fine in Win98, though there I am using software mode because I don't know how to set it up to benefit from passthrough.
In NFS2SE I notice some graphics glitches, like transparency on objects such as roads and nearby walls in the second sector of the default racetrack. It's the only game running nearly smoothly here, though.
FIFA 98 & 99 get passthrough, but I need to force them to a lower resolution because the menu graphics are blurred at higher resolutions. They're unplayable anyway, because they are currently really slow.
Anyway, that's the current status of the games I have right now. I have yet to try NBA LIVE 98 (the first version with 3dfx support) and Max Payne.
kjliew wrote on 2021-07-17, 01:37:
This is the problem. Without an Intel Mac at my side, I think you will be mostly on your own. Or, you will have to bear the grunts of C/C++ refresher course with me. 😉
Google is still your best friend; a few keyword suggestions to search for: "GLX OpenGL", "OpenGL context GLX", "glX* functions".
error: xp_attach_gl_context returned: 2
Here's my unproven theory of the error. OpenGlide was trying to steal a GLX context that was already attached to the native window. I am not an expert in OpenGL, and what follows is just my assumption about the behavior of the OpenGL implementation. When an implementation forbids or dislikes GL context reattachment, it should simply return the Visuals or FBConfigs already in use, regardless of the attributes requested by the call, or do some magic in the returned data so that the software ends up choosing the same one as the current context. The next call to make the GL context current can then simply be handled as a NOP, and everyone is happy. NVIDIA's OpenGL implementation is particularly good at this. When the software insists on its required GL context attributes, the implementation is left with two options: go ahead and do as the software demanded, likely resulting in unoptimized behavior, or simply do as "what Mr. Torvalds once showed NVIDIA" and call it quits. There has always been an ideological wrestle over how to handle such scenarios, especially since OpenGL has been inevitably vague about GL context creation.
So let's hope Intel's OpenGL still honors GL context reattachment as long as the same attributes are requested. These are the experiments that led to that search. You will have to recompile OpenGlide multiple times.
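Each experiment means editing window.cpp and rebuilding; assuming the usual autotools layout of the OpenGlide sources, the rebuild cycle would be roughly:

```shell
cd openglide
./autogen.sh        # first time only
./configure
make
sudo make install   # or copy the rebuilt library to wherever QEMU-3dfx expects it
```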
Case #1, the big hammer: fall back to the old glXChooseVisual() by making the if always false.
In openglide/platform/linux/window.cpp, line 129:

    if (0 && fbc && elements)
Here's the code for defining GL context attributes.
Case #2, completely remove the GLX_STENCIL_SIZE line and try different GLX_DEPTH_SIZE values. 24 already failed, so try 32 and 16 (decimal).
Case #3, completely remove both the GLX_STENCIL_SIZE and GLX_DEPTH_SIZE lines. Hopefully, they will come back the same as the current context.
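To make the three cases concrete, here is an illustrative sketch of the glXChooseFBConfig() attribute list being discussed. This is standard GLX usage, not the exact code from OpenGlide's window.cpp:

```c
/* Sketch only: the real attribute list in openglide/platform/linux/window.cpp
 * may differ. Case #2 drops GLX_STENCIL_SIZE and varies GLX_DEPTH_SIZE;
 * Case #3 drops both lines. */
static int fbAttribs[] = {
    GLX_RENDER_TYPE,  GLX_RGBA_BIT,
    GLX_DOUBLEBUFFER, True,
    GLX_RED_SIZE, 8,  GLX_GREEN_SIZE, 8,  GLX_BLUE_SIZE, 8,
    GLX_DEPTH_SIZE,   16,      /* Case #2: try 16 or 32 in place of 24 */
 /* GLX_STENCIL_SIZE, 8, */    /* Cases #2 and #3: removed */
    None
};

int elements = 0;
GLXFBConfig *fbc = glXChooseFBConfig(dpy, DefaultScreen(dpy),
                                     fbAttribs, &elements);
if (fbc && elements) {
    /* pick fbc[0] and create the context from the FBConfig */
} else {
    /* Case #1, the big hammer: fall back to legacy glXChooseVisual() */
}
```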
glidept: grSstWinOpen called, fmt 0 org 0 buf 2 aux 1 gLfb 0xde6fb000 FBConfig id 0x08f visual 0x0f6 swapUndef
If you will, I am interested to see that log line for each of the experiments in #2 and #3.
I'm gonna try Case #1 again (the hammer) and compare the performance. Then update this post again.
EDIT: Case #1
glidept: grSstWinOpen called, fmt 0 org 0 buf 2 aux 1 gLfb 0xde74a000 Fallback to glXChooseVisual()
I'm using Need for Speed II SE to benchmark. Both cases above rendered 30 fps, although the game felt a tad slower than I expected. The good news is that Case #2 renders better than Case #3: Case #3 causes some objects to appear transparent or badly drawn, while with Case #2 there are no badly drawn or transparent objects, though occasionally there was a small hole here and there on the track, nothing disturbing.
For Case #2 I used GLX_DEPTH_SIZE = 32. I didn't bother trying 16; should I? Could there be some improvement?
Case #1 was much slower than the other two, capped at 25 fps maximum. I'll definitely stick with Case #2 for the time being.
So you have finally gone back and re-read my post to be able to give back something to help yourself, well done 👍 Otherwise, I do not possess psychic powers to read one's mind... 🤣
I think I got the clues that Intel GPUs were really bad in the past, before they hired the AMD guy to start transforming the company. Linux open source somehow saved Intel GPUs from being utter trash, but only on Linux. It is the same story in the Windows camp: once the GPUs passed their supported lifetime, Intel stopped delivering driver updates unless there were security vulnerabilities that would put the company in liability for ignoring them. They even broke OpenGL support on Windows 10 when releasing a driver security update for HD Graphics 2000, which required hex patching to regain OpenGL acceleration.
So, forget about Case #3; that ended up with a pixel format without a depth buffer, which explains the rendering errors in OpenGlide (though I was surprised it even rendered anything at all). Forget about 32-bit depth size too: all of them are slow, judging from the intelglx.log you last provided. I don't know why 24-bit depth size failed, as that is the typical minimum requirement for an OpenGL depth buffer. So that leaves only 16-bit depth size on the table, which isn't quite ideal, but your intelglx.log seems to hint that a 16-bit depth buffer is preferred. I can assure you that the Apple M1 GPU does not have such a peculiarity... (hey, don't be an Apple salesguy 🤣).
Intel HD 4400 (Haswell GT2) in the Core i3-4010U and Intel HD Graphics (Sandy Bridge GT1) in the Celeron 847 both enjoy great modern Linux support. Both just recently received a modernized Mesa driver based on Gallium3D called "crocus". Otherwise it would take at least 5th-gen (Broadwell) Intel graphics to use the similar Gallium3D driver called "iris". Perhaps, if your attempts continue to be in vain, you may want to consider dual-booting Linux alongside macOS. Dual-boot isn't always an ideal solution due to the lack of "High Availability" that VMs, the modern way of computing, give you. But anyway, it is just VOGONS, and specifically for Marvin; no one cares... 🤣
For Case #2, setting 16-bit depth size... I couldn't feel a noticeable difference between 16-bit, 24-bit and 32-bit. Maybe 16-bit was a little better, but very marginally. NFS2SE remains the most playable game in the VM, FIFA 98 RTWC is almost playable, and FIFA 99 runs at half the speed of FIFA 98 RTWC. I still haven't tried Grand Prix 3 or Counter-Strike 1.6, for different reasons: for GP3 I have no clue how to make passthrough work, while CS 1.6 was somewhat laggy. (Yes, I am aware that I could play CS:GO on macOS, but that's not my goal atm.) Max Payne crashed as expected (even patched, the game was buggy as hell, from what I remember from 20 years ago).
I just need to know two things now: 1) how to make the MESA GL passthrough work, and 2) whether there is any way to reduce the resolution for these games. My Win 98 desktop is set to 1920x1080 windowed, but as soon as I fire up FIFA 98, for example, it gets resized to 1440x1080. I would rather play these games at their original resolutions (800x600? 640x480?) if I can't get them widescreen (VMware can do it, though).
@kjliew: I am still interested in knowing how you got your passthrough to work with Grand Prix 3, so I could try it out with both VMs (the Win 98 one and the HVF-accelerated Win XP one). GP3 is not a Glide game, but from what I remember it made use of more modern Voodoo cards like the Voodoo3 2000/3000 or the Banshee.
As it stands, GP3 only starts in software mode on both machines. I also have another issue: GP3 will not start on Windows XP if I use the VBEMP NT driver; the game only starts with the original Cirrus VGA driver. I needed VBEMP for a widescreen Windows desktop. The VBEMP 9X version works fine in the Windows 98 VM, no issues. Most impressive is that in the XP VM I get excellent processor occupancy while playing the game in software mode.
What am I missing? If there are instructions, where are they?
EDIT: The XP VM was still not using HVF at this point.
Last edited by Bruninho on 2021-09-21, 03:05. Edited 1 time in total.
I managed to boot an XP VM under HVF acceleration and X11, but it is still damn slow, probably slower than Win 98 was, which is very strange at this point. However, NFS2SE performance remains unchanged even with HVF on. Guess I will have to wait until I get my hands on an M1 Mac.
Anyone have any tutorials or links to help me get QEMU running on Windows 10? I am looking to set up a Win98 and a WinXP box for retro gaming (yes, I know about PCem and 86Box), but I would like to try this one, as everyone tells me it is great. I cannot work out how to use it on Windows; every tutorial seems to only be for Linux.