VOGONS


dgVoodoo 2 for DirectX 11


Reply 102 of 3949, by Dege

User metadata
Rank l33t

MesaFX is a version of Mesa with integrated 3dfx hardware acceleration (using Glide, if I recall correctly) - effectively an open-source 3dfx OpenGL ICD (not counting the leaked 3dfx source code). By chaining MesaFX and dgVoodoo you'd have (yet another) OpenGL-to-Direct3D wrapper, passing through Glide along the way: OpenGL-to-Glide-to-Direct3D.

OK, I will have a closer look into that, but OGL->Glide wrapping has a practical problem on Windows (independently of the OGL implementation).
Terms like "rendering window" or "render target" are not part of the OpenGL API at all. I think this is because, back in the early days, OpenGL could be used to "remote draw" from a terminal to a central server. In that case the render target was the network itself, in the form of meta-drawing OGL commands. If the machine was able to render locally, then the render target was a bitmap in local video memory. OGL supports only "rendering contexts" with abstract render targets behind them.
The concrete Windows render target implementation relies on DirectDraw: the OGL driver allocates a double- or single-buffered ddraw surface for the render window and feeds the OGL ICD with that (much like D3D7 and earlier devices were created on top of ddraw).
Those system-allocated render targets and rendering contexts can be managed through the wgl* functions. The point is that the OGL ICD needs to be compatible with the ddraw "ICD" to "understand" what is behind the ddraw surfaces. In an OGL->Glide scenario, the OGL ICD is the Glide wrapper itself, fed by ddraw surfaces (and textures).
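For context, this is roughly the standard wgl* path a Windows OpenGL application goes through. It's a minimal, generic sketch (not dgVoodoo code), just to show where the system-allocated render target and the ICD enter the picture:

```cpp
// Minimal, generic sketch of the standard wgl* setup (not dgVoodoo code).
// The pixel format / HDC is where the ICD is selected and where the
// system-allocated (historically ddraw-backed) render target lives.
#include <windows.h>

bool CreateGLContext(HWND hWnd, HDC& hDC, HGLRC& hRC)
{
    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    hDC = GetDC(hWnd);
    int format = ChoosePixelFormat(hDC, &pfd);     // ICD gets picked here
    if (!format || !SetPixelFormat(hDC, format, &pfd))
        return false;

    hRC = wglCreateContext(hDC);                   // "rendering context" with an
    if (!hRC)                                      // abstract render target behind it
        return false;
    return wglMakeCurrent(hDC, hRC) != FALSE;
}

// After rendering a frame:  SwapBuffers(hDC);  // presents the back buffer
```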

Reply 104 of 3949, by Dege

User metadata
Rank l33t
swaaye wrote:

Has anyone tried Turok 2? Instead of 3D scenes I only get a solid color image. Title screens and such work though. I haven't changed any settings in DGVoodoo2.

Then it must be a bug. I'll check it as soon as I get a demo of the game. Thx for the feedback!

Reply 106 of 3949, by Dege

User metadata
Rank l33t
swaaye wrote:
Dege wrote:

Then it must be a bug. I'll check it as soon as I get a demo of the game. Thx for the feedback!

I tried it with a GeForce 650, Intel 4000, and Radeon 6950 with identical results.

It's because 95% of the "rendering calculations" are done in shaders, instead of relying on legacy GPU features (with potential fallback paths, according to the caps of the GPU) whose combinations can be handled differently (badly, poorly) by each driver.
It's more like a software emulation ported to the GPU. The only difference between GPUs should be the performance.
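Just to illustrate the idea (this is not dgVoodoo's actual shader code, only a hypothetical example): a fixed-function texture/colour combine stage written out as explicit per-pixel math, the way a pixel shader would express it, so the result no longer depends on how each driver wires up the legacy combiner:

```cpp
// Hypothetical illustration only: one "modulate texture by vertex colour,
// then add specular" combine stage, expressed as explicit per-pixel math.
// On real hardware this logic would live in a pixel shader; here it is
// plain C++ just to show that the calculation is fully specified, leaving
// nothing to per-driver fixed-function quirks.
#include <algorithm>

struct Color { float r, g, b, a; };

Color CombineStage(const Color& texel, const Color& vertexColor,
                   bool addSpecular, const Color& specular)
{
    Color out;
    out.r = texel.r * vertexColor.r;
    out.g = texel.g * vertexColor.g;
    out.b = texel.b * vertexColor.b;
    out.a = texel.a * vertexColor.a;
    if (addSpecular) {
        out.r = std::min(out.r + specular.r, 1.0f);
        out.g = std::min(out.g + specular.g, 1.0f);
        out.b = std::min(out.b + specular.b, 1.0f);
    }
    return out;
}
```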

Reply 107 of 3949, by swaaye

User metadata
Rank l33t++
Dege wrote:

It's because 95% of the "rendering calculations" are done in shaders, instead of relying on legacy GPU features (with potential fallback paths, according to the caps of the GPU) whose combinations can be handled differently (badly, poorly) by each driver.
It's more like a software emulation ported to the GPU. The only difference between GPUs should be the performance.

Yeah I read the readme and it is very impressive. An exciting approach.

I was also going to try Turok 1 but the game won't load without crashing (some Windows 7 issue with it). There's probably some fix I need to find.

Reply 109 of 3949, by Dege

User metadata
Rank l33t

I was also going to try Turok 1 but the game won't load without crashing (some Windows 7 issue with it). There's probably some fix I need to find.

Turok 1 works fine for me. I'm not sure whether I have the demo or the full game, but the only trick is that it can only be started via File\Start Turok in the menu bar.

I've got Turok 2, but can't install it. Its installer probably doesn't like 64-bit Windows, as usual with old games. I've been fiddling with the MS Compatibility Toolkit but with no success so far. (Maybe it would be the same with Turok 1.)

Even the alpha blending? Neat. Would it be possible to mock the alpha artifacts explained here?

No! Alpha blending, depth and scissor testing (and stenciling in Napalm (and dithering!)) are in the remaining 5%, I think. 😊 Those cannot be moved into shaders (except for scissor).
Only if rasterization itself (the triangles) were done by shaders could alpha blending (and the others) be handled "manually" in shaders as well. But that would really chop performance.

I think the author of the article about the alpha blending problem you mentioned has already answered his own question: 16-bit dithering. The problem with it appears when more than one polygon is blended onto another, because the frame buffer image gets dithered at each step. The ideal solution would be to render the frame buffer at higher quality (say, 32 bit...) and apply dithering only at the final step, when converting it to 16 bit. (So the idea of dithered rendering suffers from a theoretical flaw, I think, but it was sufficient for most cases in practice.)
Edit: as far as I know, the Riva TNT had a more sophisticated dithering algorithm than 3dfx (but maybe I'm wrong). Also, it is not the same whether blending is done at the texture-unit level (the TNT has 2 texture stages) in a single pass or by the alpha blender in two passes; in the latter case dithering takes place twice.

In dgVoodoo I always use 32-bit rendering. GPUs above the DX9 level do not like 16-bit rendering; 32 bit is the baseline. That's not a problem, because it made the implementation easier. Even so, the 3Dfx dithering levels (no dithering, 2x2, 4x4) are still emulated in some ways: no dithering means cutting pixels back to 16 bit, 2x2 is somewhat better, and 4x4 means the full 32 bits.
(I was thinking about adding an option for real dithering to make the appearance truly 3Dfx-like, but it would only be a nice extra and still wouldn't look exactly the same as a real 3Dfx, so I ditched the idea.)
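To make the "dither only at the final 16-bit conversion" idea concrete, here is a small sketch (my own illustration, not dgVoodoo source) of an ordered/Bayer dither applied once, when a 32-bit pixel is packed down to RGB565:

```cpp
// Illustrative sketch only (not dgVoodoo source): render at 32 bit and apply
// an ordered (Bayer) dither once, at the final packing to RGB565, instead of
// dithering after every blended polygon the way 16-bit hardware did.
#include <algorithm>
#include <cstdint>

static const int kBayer4x4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

// Convert one 8-bit-per-channel pixel to RGB565 with a 4x4 ordered dither.
uint16_t ToRGB565Dithered(uint8_t r, uint8_t g, uint8_t b, int x, int y)
{
    int t = kBayer4x4[y & 3][x & 3];               // threshold 0..15
    int r5 = std::min(255, r + t * 8 / 16) >> 3;   // 5-bit channel, step 8
    int g6 = std::min(255, g + t * 4 / 16) >> 2;   // 6-bit channel, step 4
    int b5 = std::min(255, b + t * 8 / 16) >> 3;
    return static_cast<uint16_t>((r5 << 11) | (g6 << 5) | b5);
}

// "No dithering" in the 3Dfx sense would simply truncate: (r >> 3, g >> 2, b >> 3).
```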

Reply 110 of 3949, by swaaye

User metadata
Rank l33t++
Dege wrote:

Turok 1 works fine for me. I'm not sure whether I have the demo or the full game, but the only trick is that it can only be started via File\Start Turok in the menu bar.

I've got Turok 2, but can't install it. Its installer probably doesn't like 64-bit Windows, as usual with old games. I've been fiddling with the MS Compatibility Toolkit but with no success so far. (Maybe it would be the same with Turok 1.)

My Turok 2 CD uses a 32-bit installer. Maybe the demo you're using is different, then.

Reply 111 of 3949, by Xenphor

User metadata
Rank Member

I'm having trouble getting good vsync in Sin with dgVoodoo2. Whether I enable it in-game or force it with dgVoodoo, I can't get a fully synced display. The tearing is gone, but there is stuttering instead, suggesting that it's still not in sync. I tried limiting the framerate to 60, 61 fps, etc., but that didn't help. And there seems to be no way to enable vsync in software mode.

It seems weird to use a Glide wrapper for OpenGL games, but the default OpenGL option crashed the game. Is there an old OpenGL -> new OpenGL wrapper, or does it have to be 3dfx?

Reply 112 of 3949, by Dege

User metadata
Rank l33t
Xenphor wrote:

I'm having trouble getting good vsync in Sin with dgVoodoo2. Whether I enable it in-game or force it with dgVoodoo, I can't get a fully synced display. The tearing is gone, but there is stuttering instead, suggesting that it's still not in sync. I tried limiting the framerate to 60, 61 fps, etc., but that didn't help. And there seems to be no way to enable vsync in software mode.

It seems weird to use a Glide wrapper for OpenGL games, but the default OpenGL option crashed the game. Is there an old OpenGL -> new OpenGL wrapper, or does it have to be 3dfx?

What game do you have problems with?
Perfect vsync has some practical troubles:

- If a game doesn't use vsync but simply drives the card at, say, a constant 60 Hz, tearing occurs on a real 3Dfx card too. It's worse if the rendering speed was tuned to early hardware (I mean brute-force rendering), because it runs far too fast on modern hardware and also hammers the GPU. Forcing vsync in dgVoodoo is meant for the latter case, but of course it will probably end up causing some lag in the animation.

- Most 3Dfx games use vsync, but they expect a given display refresh rate (60 Hz in general, but it can be 70, 72 or 75 Hz too). It can be specified when Glide is initialized, and that worked for all CRT displays, I think. Modern displays, however, support only one native refresh rate (just as they have only one native resolution). If it doesn't match the refresh rate Glide expects, that rate can only be approximated by frame skipping (all of which is handled by DGXI in dgVoodoo). The larger the difference between the two frequencies, the laggier the rendering is going to be, I think. A solution for this could be to let the display run at its native refresh rate independently of the Glide refresh rate, but that can also cause lag, because the assumed frequency does not match the real one (a problem analogous to texel mapping with forced resolutions). There was an option for this in the old dgVoodoo, but I didn't want to include it in the new version: it is confusing and doesn't really work in practice. For example, my display runs at 59 Hz, and when I let that be used, NFS3 looked awful; even the frame-skipping solution proved to be much better. (A rough sketch of the frame-skipping idea follows below.)
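The sketch below is only my guess at the general shape of such frame skipping (the hypothetical FramePacer is not DGXI code): when the game expects a higher Glide refresh rate than the display's native rate, presents are dropped so that on average the display rate is respected:

```cpp
// Rough guess at refresh-rate matching by frame skip (not DGXI source).
// The game swaps at glideHz; the display only gets a present when a whole
// native vblank's worth of time has been "earned".
#include <cstdio>

struct FramePacer {
    double glideHz;     // refresh rate the game expects from Glide (e.g. 70 Hz)
    double displayHz;   // the display's single native rate (e.g. 60 Hz)
    double credit = 0.0;

    // Call once per Glide buffer swap; returns true if this frame
    // should actually be presented on the display.
    bool OnGlideSwap() {
        credit += displayHz / glideHz;   // e.g. 60/70 of a display frame per swap
        if (credit >= 1.0) {
            credit -= 1.0;
            return true;                 // present this frame
        }
        return false;                    // skip: the display can't keep up at glideHz
    }
};

int main() {
    FramePacer pacer{70.0, 60.0};
    int presented = 0;
    for (int frame = 0; frame < 70; ++frame)
        if (pacer.OnGlideSwap()) ++presented;
    std::printf("%d of 70 Glide frames presented\n", presented);  // roughly 60
}
```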

Reply 113 of 3949, by F2bnp

User metadata
Rank l33t
swaaye wrote:
Dege wrote:

Turok 1 works fine for me. I'm not sure whether I have the demo or the full game, but the only trick is that it can only be started via File\Start Turok in the menu bar.

I've got Turok 2, but can't install it. Its installer probably doesn't like 64-bit Windows, as usual with old games. I've been fiddling with the MS Compatibility Toolkit but with no success so far. (Maybe it would be the same with Turok 1.)

My Turok 2 CD uses a 32-bit installer. Maybe the demo you're using is different, then.

I think I know what's wrong here. It's not because it uses a 16-bit installer. It's something I've noticed since Windows Vista, but it has become a lot more apparent under Windows 7 and 8: certain games that use 32-bit installers just don't work. When you double-click, a process is created, but it just sits there, as if it is never given CPU time. Sometimes it suddenly runs out of the blue; most of the time it won't.

I have no idea what causes this and no matter how much I've looked into it, I can't find a solution.

Reply 114 of 3949, by swaaye

User metadata
Rank l33t++
F2bnp wrote:

I think I know what's wrong here. It's not because it uses a 16-bit installer. It's something I've noticed since Windows Vista, but it has become a lot more apparent under Windows 7 and 8: certain games that use 32-bit installers just don't work. When you double-click, a process is created, but it just sits there, as if it is never given CPU time. Sometimes it suddenly runs out of the blue; most of the time it won't.

I have no idea what causes this and no matter how much I've looked into it, I can't find a solution.

I've discovered that Logitech Setpoint causes problems like this.

Reply 115 of 3949, by Dege

User metadata
Rank l33t

I think I know what's wrong here. It's not because it uses a 16-bit installer. It's something I've noticed since Windows Vista, but it has become a lot more apparent under Windows 7 and 8: certain games that use 32-bit installers just don't work. When you double-click, a process is created, but it just sits there, as if it is never given CPU time. Sometimes it suddenly runs out of the blue; most of the time it won't.

Exactly the same case for me. I'm not sure whether it happens only on 64-bit Windows or on the 32-bit version too. The process gets created, allocates some memory and GDI objects, and that's all; it stalls.
Maybe attaching a debugger to the process to see where it gets stuck would help, but... I thought a general solution for this had already been found by others.
I don't have Logitech SetPoint or anything similar installed, so it must be a native Windows speciality.

Reply 116 of 3949, by Dege

User metadata
Rank l33t
Xenphor wrote:

The game I was trying was Sin.

I tried nGlide and that seems to work the way I expected; I left v-sync on in-game and off in nGlide, and I didn't notice any stutter, using the same refresh rate/resolution. I always set my display to 60 Hz for v-sync no matter what game. The only problem is that it does seem to drop frames during explosions or other scenes, maybe because of lighting effects(?).

But if I just stare at the ground and pan the camera in circles, there doesn't appear to be any stutter.

Is there a reason why games using the same engine, such as Quake 2, can run natively in OpenGL but others, like Sin, can't? Does Nvidia add special exceptions to their drivers?

OK, I'll add this game to the "to be examined" list. 😀