VOGONS


First post, by kjliew

Rank: Oldbie

Hi Dege, Happy New Year 2019 😀

I have observed poor performance from dgVoodoo2 Glide emulation in an accelerated VM (QEMU 2.12/3.1.0 with WHPX acceleration on Windows 10 x64 Build 1803). I cannot explain the slowdown other than to suspect that dgVoodoo2's Glide emulation does an internal frame-time calculation based on grBufferSwap(int swap_interval), or that the new FIFO buffering of Glide calls is messing up the timing within dgVoodoo2. OpenGlide and psVoodoo do not heed the swap_interval from grBufferSwap(); they just flush out the rendering as fast as they can. Hence both of them perform better than dgVoodoo2 on an accelerated VM, to the point of absurdity that they render the Unreal fly-by timedemo in a VM faster than dgVoodoo2 does natively 😲 This is unbelievable, and something must be going wrong within dgVoodoo2's Glide emulation on a very fast CPU.

Attachment: Unreal.png (1.82 MiB), "Unreal on QEMU with acceleration"

Check out the screenshot. This is psVoodoo on QEMU with acceleration. Both OpenGlide and psVoodoo achieve ~47 FPS on average, with the highest FPS hitting the V-Sync limit of 60 FPS, while dgVoodoo2 crawls at around 22 FPS on average *on the host machine*. Perhaps you can look into why dgVoodoo2 Glide emulation is so slow with Unreal even on the host machine. When I swapped dgVoodoo2 for OpenGlide or psVoodoo on the host machine, the Unreal fly-by timedemo was literally rendered at the V-Sync limit. The problem with dgVoodoo2 is evident at high resolutions, especially 1024x768. There are exceptions, though: so far Quake2 and Turok 2 do not have the slowdown problem with dgVoodoo2. Besides Unreal, NFS3 also has the same problem using the Glide2x Thrash driver from the retail CD.

FYI, all the testing was done on my desktop with a high-DPI display and Windows 10 display scaling at 125%. Not sure if this matters for dgVoodoo2. My GPU is an old NVIDIA G210 with Direct3D feature level 10_1 and WDDM 1.2.

Reply 1 of 5, by Dege

Rank: l33t

I'm afraid the GT210 is too old for dgVoodoo2 Glide. It may sound strange and stupid, but dgVoodoo2's Glide pixel shaders are much more complex (because of the way texturing is handled) than those of traditional Glide wrappers, even when D3DCompiler is available and specialized shaders are compiled. (That's why I was hesitant about enabling DX10.0 in dgVoodoo2.)

But I'll have a look at the grBufferSwap implementation. A fast CPU shouldn't be a problem, but I can't remember off the top of my head what the code there does. 😁

BTW, you could check the GPU usage with dgVoodoo while Unreal is running (with Process Explorer or something). If it's a timing bug then GPU usage should be low. If GPU usage is high, then the GPU is being overloaded.

Reply 2 of 5, by kjliew

Rank: Oldbie
Dege wrote:

I'm afraid the GT210 is too old for dgVoodoo2 Glide. It may sound strange and stupid, but dgVoodoo2's Glide pixel shaders are much more complex (because of the way texturing is handled) than those of traditional Glide wrappers, even when D3DCompiler is available and specialized shaders are compiled. (That's why I was hesitant about enabling DX10.0 in dgVoodoo2.)

Well, the GT210 is DX10.1 by the way, but yeah, I know it's old. 😢 Perhaps it's time to dip into my savings for a GPU upgrade..... sigh.

Dege wrote:

BTW, you could check the GPU usage with dgVoodoo while Unreal is running (with Process Explorer or something). If it's a timing bug then GPU usage should be low. If GPU usage is high, then the GPU is being overloaded.

The GT210 does not have a WDDM 2.0 driver on Windows 10, so the GPU usage monitor in Task Manager is not supported. I can use GPU-Z, though. Would the readings be the same? What's your estimate of typical GPU utilization? I don't remember seeing anything >50% in GPU-Z the last time I checked, rendering an NFS3 replay at 1024x768. It's usually around 20~25%, sometimes even lower. I thought 3Dfx Glide games were old stuff that wouldn't tax today's modern hardware.

Is there a way to measure the performance level of pixel shaders, to tell whether they are adequate for dgVoodoo2's use cases? Are AMD/Intel IGPs good enough?

Reply 3 of 5, by kjliew

Rank: Oldbie

Hey, you're right. Running Unreal at 1024x768 with dgVoodoo2 Glide emulation is really taxing my old GPU: 99% in GPU-Z and over 95°C (since I had the noisy GPU fan removed). It seems that DOSBox and QEMU TCG were never fast enough to drive the GPU shaders that hard. But with an accelerated VM, this is going to change......

OMG! 😲 Someone is going to kill their GPU by running dgVoodoo2 to play old games on a new system, most likely on a laptop whose heat dissipation has deteriorated as dust accumulated over time! Now I know how hard dgVoodoo2 hammers the GPU shaders...... NVIDIA/AMD must love you so much!

I've got to try this out on my laptop's Intel Haswell IGP, too.... What do you think about the Radeon Vega 11 on the Ryzen APU?

Reply 4 of 5, by Dege

Rank: l33t

Yes, but in return, the texture pipeline with even/odd sampling, detail factor, multibase mipmaps and all that kind of weird 3Dfx stuff is supported, plus texture uploading, palette updating and texture descriptor setup are low cost (free). Back in the day, that was my new-style implementation.

Reply 5 of 5, by Dege

Rank: l33t
kjliew wrote:

I've got to try this out on my laptop's Intel Haswell IGP, too.... What do you think about the Radeon Vega 11 on the Ryzen APU?

Uhmm… Unfortunately I have zero experience with AMD APUs, but I think they must perform better than their Intel counterparts.