VOGONS


dgVoodoo 2 for DirectX 11

Topic actions

Reply 2780 of 3949, by KainXVIII

User metadata
Rank Oldbie

Tried Xanadu Next with dgVoodoo. There's heavy stuttering and slowdowns if it's run without it (a common problem with DX7/DX8 games on Windows 8-10?), but with dgVoodoo it runs fine (Alt-Tab may be broken, but I can live with that).
Don't know about graphical issues though; it needs more testing.

Reply 2781 of 3949, by lowenz

User metadata
Rank Oldbie

I have no problem with cutscenes in SC using a mix of the latest TAG widescreen mod for SC and dgVoodoo2 (remember NOT to use the TAG d3d files).
The TAG mod forces a borderless fullscreen mode for everything, cutscenes included.

Reply 2782 of 3949, by SeraphTC

User metadata
Rank Newbie
lowenz wrote:

I have no problem with cutscenes in SC using a mix of the latest TAG widescreen mod for SC and dgVoodoo2 (remember NOT to use the TAG d3d files).
The TAG mod forces a borderless fullscreen mode for everything, cutscenes included.

The TAG widescreen mod forces them to display at native res, though, which is very small on my desktop and tiny on the 3K screen of my laptop. I asked TAG about it, but he said he'd never managed to get them to display zoomed (as he had with SCPT).

EDIT: Also, it *still* drops to windowed mode for me.

EDIT: Without the widescreen patch, and with dgVoodoo, it runs OK if I force windowed mode. I'm not forcing resolution, just windowed mode. It remains to be seen whether that will cause issues with the water rendering. (NOW TESTED - water seems fine.)

Is it possible, either with dgVoodoo or with the game's ini files, to specify the position of the window when the application starts? It would be great to be able to have it start up centred on my screen rather than near the left edge and halfway off the bottom!

(Incidentally, I tried to set windowed mode using the settings in the ini file instead of forcing, but with dgVoodoo in place the game would not display correctly - most of the window was black whilst the game was horizontally 'squashed' to about a quarter of the width of the window).

Reply 2783 of 3949, by VicRattlehead

User metadata
Rank Newbie

In Sacrifice, I'm experiencing flickering ground textures while panning the camera left and right. It's most noticeable when lightmaps are disabled and detail is set to lowest. EDIT: Flickering doesn't occur when both detail textures and bump detail maps are disabled.

EDIT: I think I figured out the issue. The version of the game I'm playing is the GOG release, which uses its own D3D wrapper called "Shiny Direct3D wrapper". Playing without dgvoodoo2 produces the same flickering so I thought maybe it's this Shiny wrapper that's at fault. So I downloaded the demo version of the game and tried dgvoodoo2 with it and what do you know, no more flickering. Looks like I'll have to get the original release to get around this issue.

EDIT 2: Apparently the demo also uses this "Shiny Direct3D wrapper" as stated in the System Information tab in the game's Options menu, I misinterpreted what it meant. Anyway, I circumvented having to download the whole game by just downloading Patch #3 of the game and extracting the original .exe and .dll files from the patch installer. The files turned out to be all the same as in the GOG release except for the .exe. After replacing the GOG .exe with the Patch #3 .exe and testing the game with dgvoodoo2, I STILL get the flickering. So I don't think the GOG release is at fault. Patch #3's changelog tells me that it brings a few changes to the game's rendering engine. Perhaps the flickering issue only becomes present after these changes. Whether or not this issue exists with configurations contemporary to the game is something I don't know.

Here is a pastebin of the text file that comes with Patch #3. The changes to the rendering engine are under "General Upgrades".
http://pastebin.com/ZqQSSQYy

Last edited by VicRattlehead on 2016-11-21, 10:32. Edited 2 times in total.

Reply 2784 of 3949, by gameragodzilla

User metadata
Rank Newbie
Dege wrote:
gameragodzilla wrote:

I honestly have no idea either, and it sucks, because otherwise it works really well. The cutscenes are at native resolution when originally they were only 640x480 (thanks to being able to force resolution), and the buffer lighting is still pretty beautiful even to this day.

I had thought it was because resolution was forced, but the brief stutter when activating thermal vision alongside playing the game is still there, just with the game stuck at low resolution.

I have no idea. Is it because I'm using the GOG version?

EDIT: I also modified the ini files to be widescreen prior to applying this fix. Could that be a factor? I'm honestly just at a loss as to why it stutters like this. If I can get rid of it, I might even start using this fix for Pandora Tomorrow rather than the Komat fix, considering it doesn't seem to have the lighting glitch in the opening level.

Hmm... I don't think it has anything to do with the GOG version (but who knows...), although I can't speak from experience as I have the original game. Also, I applied the widescreen fix too, and it works without performance problems.

lowenz wrote:

Stutter=D3D11 compiling complex shaders?

They are compiled once, when needed, and then cached for further use.
The first time they are compiled, a short pause in gameplay can occur, but it doesn't happen again after that point. Stuttering would indicate a continuous performance drop, for some reason.

Oh, so that's normal? It doesn't always stutter for me, it just does so when:

1. First time I start up a level.

2. First time I switch to thermal vision

3. Whenever I Alt-Tab out and back in

4. Sometimes when entering a new area.

It's not a constant stutter, and most of the time it's relatively smooth, but it's just that there's still enough stuttering here and there to bother me, that's all. The Komat fix for Pandora Tomorrow doesn't have this, so is there a way you could make it so that everything is cached on loading?

Reply 2785 of 3949, by lowenz

User metadata
Rank Oldbie

It seems to be a case of shader compilation spiking CPU usage... it is expected.

Reply 2786 of 3949, by gameragodzilla

User metadata
Rank Newbie
lowenz wrote:

It seems to be a case of shader compilation spiking CPU usage... it is expected.

Seems so. I'd like to see if there's a way to mitigate it. I don't know why the Komat fix for Pandora Tomorrow doesn't have this problem, though.

Reply 2787 of 3949, by Nucleoprotein

User metadata
Rank Member
gameragodzilla wrote:
lowenz wrote:

It seems to be a case of shader compilation spiking CPU usage... it is expected.

Seems so. I'd like to see if there's a way to mitigate it. I don't know why the Komat fix for Pandora Tomorrow doesn't have this problem, though.

But that fix does not use DX11. dgVoodoo emulates DX8 shaders using DX11 shaders, so they can be complex. If you use AMD hardware, you can force the shader cache on for the SC main executable.

Reply 2788 of 3949, by gameragodzilla

User metadata
Rank Newbie
Nucleoprotein wrote:
gameragodzilla wrote:
lowenz wrote:

It seems to be a case of shader compilation spiking CPU usage... it is expected.

Seems so. I'd like to see if there's a way to mitigate it. I don't know why the Komat fix for Pandora Tomorrow doesn't have this problem, though.

But that fix does not use DX11. dgVoodoo emulates DX8 shaders using DX11 shaders, so they can be complex. If you use AMD hardware, you can force the shader cache on for the SC main executable.

I use Nvidia. Oh well, I guess I'll just deal with it unless there's a way to mitigate it there.

Reply 2789 of 3949, by Dege

User metadata
Rank l33t

In short, the process is the following for shaders:

D3D8 (1.x) shader bytecode (coming from the app) -> HLSL (translated by dgVoodoo) -> D3D11 (4.x) shader bytecode (D3DCompiler) -> GPU-specific code (translated by the display driver)

The most time-consuming part is compiling from HLSL to 4.x shader bytecode with D3DCompiler (independent of the GPU).
Translating to GPU code could also be critical, which is why pre-warming shaders is generally recommended to avoid a first-use performance penalty,
but I never experienced such a (measurable) effect with dgVoodoo, so I don't think the lack of pre-warming is the issue here.

The process above can be done entirely at creation time for vertex shaders (when the app creates the D3D8 shader), but not for D3D8 pixel shaders.
For those, only an HLSL template can be generated, from which multiple concrete D3D11 shader instances are generated later, during rendering, as needed according to the logical D3D8 pipeline state (deferred creation).
That's why some lag occurs when entering new areas, looking at certain objects, etc.
When the D3D8 shaders are destroyed by the game (such as when leaving a level), the cached instances are destroyed along with them.
So when a new level is loaded and the shaders are recreated, the process repeats.
(When returning from Alt-Tab, the game may recreate its shaders; I don't know.)

With the Komat fix, pure D3D8 is used, so compilation by D3DCompiler is skipped entirely; the driver consumes 1.x shader code directly.

I have plans to avoid the compile bottleneck for the fixed-function pipeline and Glide, since dgVoodoo has precompiled shaders for those, but that won't help with D3D8 shaders, unfortunately.
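The deferred-creation scheme described above can be sketched in C++. This is an illustrative toy, not dgVoodoo's actual code: the class names, the pipeline-state fields, and the string stand-in for compiled bytecode are all assumptions; a real wrapper would call D3DCompile at the point marked below and store ID3D11PixelShader objects instead of strings.

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <unordered_map>

// Hypothetical sketch of deferred pixel shader creation: one HLSL template
// per D3D8 pixel shader, with concrete D3D11 instances compiled lazily at
// draw time, keyed by the pipeline state that affects code generation.

struct PipelineStateKey {
    uint32_t textureFormats;   // packed per-stage texture formats (assumed)
    bool fogEnabled;
    bool alphaTestEnabled;
    bool operator==(const PipelineStateKey& o) const {
        return textureFormats == o.textureFormats &&
               fogEnabled == o.fogEnabled &&
               alphaTestEnabled == o.alphaTestEnabled;
    }
};

struct KeyHash {
    size_t operator()(const PipelineStateKey& k) const {
        return k.textureFormats * 31u + (k.fogEnabled ? 2u : 0u) +
               (k.alphaTestEnabled ? 1u : 0u);
    }
};

class D3D8PixelShader {
public:
    explicit D3D8PixelShader(std::string hlslTemplate)
        : hlslTemplate_(std::move(hlslTemplate)) {}

    // Called at draw time: returns a cached instance, or "compiles" a new one.
    // In a real wrapper this miss path is where D3DCompile runs (the stutter).
    const std::string& getInstance(const PipelineStateKey& key) {
        auto it = cache_.find(key);
        if (it == cache_.end()) {
            ++compileCount_;
            it = cache_.emplace(key, specialize(key)).first;
        }
        return it->second;
    }
    int compileCount() const { return compileCount_; }

private:
    // Stand-in for: fill template -> HLSL -> D3DCompile -> D3D11 bytecode.
    std::string specialize(const PipelineStateKey& key) const {
        return hlslTemplate_ + (key.fogEnabled ? "+fog" : "") +
               (key.alphaTestEnabled ? "+atest" : "");
    }
    std::string hlslTemplate_;
    std::unordered_map<PipelineStateKey, std::string, KeyHash> cache_;
    int compileCount_ = 0;
};
```

Note that the cache lives inside the shader object, so when the game destroys its D3D8 shaders on level exit, the compiled instances go with them, which matches the repeat-on-reload behaviour described above.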

Reply 2790 of 3949, by gameragodzilla

User metadata
Rank Newbie
Dege wrote:

In short, the process is the following for shaders:

D3D8 (1.x) shader bytecode (coming from the app) -> HLSL (translated by dgVoodoo) -> D3D11 (4.x) shader bytecode (D3DCompiler) -> GPU-specific code (translated by the display driver)

The most time-consuming part is compiling from HLSL to 4.x shader bytecode with D3DCompiler (independent of the GPU).
Translating to GPU code could also be critical, which is why pre-warming shaders is generally recommended to avoid a first-use performance penalty,
but I never experienced such a (measurable) effect with dgVoodoo, so I don't think the lack of pre-warming is the issue here.

The process above can be done entirely at creation time for vertex shaders (when the app creates the D3D8 shader), but not for D3D8 pixel shaders.
For those, only an HLSL template can be generated, from which multiple concrete D3D11 shader instances are generated later, during rendering, as needed according to the logical D3D8 pipeline state (deferred creation).
That's why some lag occurs when entering new areas, looking at certain objects, etc.
When the D3D8 shaders are destroyed by the game (such as when leaving a level), the cached instances are destroyed along with them.
So when a new level is loaded and the shaders are recreated, the process repeats.
(When returning from Alt-Tab, the game may recreate its shaders; I don't know.)

With the Komat fix, pure D3D8 is used, so compilation by D3DCompiler is skipped entirely; the driver consumes 1.x shader code directly.

I have plans to avoid the compile bottleneck for the fixed-function pipeline and Glide, since dgVoodoo has precompiled shaders for those, but that won't help with D3D8 shaders, unfortunately.

Ah, that's an interesting explanation. So I'm guessing the Komat fix just translates the code into something compatible with DirectX 9+, while yours actually recompiles things, and that's why there's a brief stutter? Why not use Komat's method then, if it seems to work fine without the brief lag? Though it seems Komat's fix is only compatible with Pandora Tomorrow and not Splinter Cell 1's buffer shadows, so I guess there's a difference between the two games?

Reply 2791 of 3949, by Dege

User metadata
Rank l33t
gameragodzilla wrote:

Why not use Komat's method then if it seems to function fine without the brief lagging?

D3D11 only accepts shader bytecode of version 4.x/5.0, so old D3D8/9 shaders (1.x - 3.x) cannot be used without recompiling.

gameragodzilla wrote:

Though seems like Komat is only compatible with Pandora Tomorrow rather than Splinter Cell 1's buffer shadows so I guess there's a difference between the two games?

Fixing the shadow buffers (SC1) could be possible through pure D3D8, but the fix would then need additional logic on top of MS D3D8.

Reply 2792 of 3949, by gameragodzilla

User metadata
Rank Newbie
Dege wrote:
gameragodzilla wrote:

Why not use Komat's method then if it seems to function fine without the brief lagging?

D3D11 only accepts shader bytecode of version 4.x/5.0, so old D3D8/9 shaders (1.x - 3.x) cannot be used without recompiling.

gameragodzilla wrote:

Though seems like Komat is only compatible with Pandora Tomorrow rather than Splinter Cell 1's buffer shadows so I guess there's a difference between the two games?

Fixing the shadow buffers (SC1) could be possible through pure D3D8, but the fix would then need additional logic on top of MS D3D8.

Ah, I see. Well, I guess I'll just deal with it, as the brief stutters here and there are worth putting up with for the pretty lighting, though I do wish there were a version done through pure D3D8 so the stutter didn't exist at all.

Reply 2793 of 3949, by daniel_u

User metadata
Rank Member
Dege wrote:
gameragodzilla wrote:

Why not use Komat's method then if it seems to function fine without the brief lagging?

D3D11 only accepts shader bytecode of version 4.x/5.0, so old D3D8/9 shaders (1.x - 3.x) cannot be used without recompiling.

gameragodzilla wrote:

Though seems like Komat is only compatible with Pandora Tomorrow rather than Splinter Cell 1's buffer shadows so I guess there's a difference between the two games?

Fixing the shadow buffers (SC1) could be possible through pure D3D8, but the fix would then need additional logic on top of MS D3D8.

Can the use of pure D3D8 fix the issue with the color of the light in SC2? If this pure-D3D8 approach fixes more things, it could be worth your effort.

EDIT: I know the Komat dll has the same issue, but maybe he is missing something?

Reply 2794 of 3949, by JJXB

User metadata
Rank Newbie

On the shader generation: is it possible to store and reuse already-generated shader code across launches, to save on said stutters? Or do the generation routines spit out different code for the same scene, making that impossible?
I know some emulators do something similar (async shader generation in the Ishiiruka build of Dolphin), but I don't know how feasible that would be for this use case, since it relies on threaded operation, and the PC shader generation might not make that approach feasible anyway.
Not trying to sound stupid here, just throwing some ideas forward.

Reply 2795 of 3949, by Dege

User metadata
Rank l33t
daniel_u wrote:
Dege wrote:
gameragodzilla wrote:

Why not use Komat's method then if it seems to function fine without the brief lagging?

D3D11 only accepts shader bytecode of version 4.x/5.0, so old D3D8/9 shaders (1.x - 3.x) cannot be used without recompiling.

gameragodzilla wrote:

Though seems like Komat is only compatible with Pandora Tomorrow rather than Splinter Cell 1's buffer shadows so I guess there's a difference between the two games?

Fixing the shadow buffers (SC1) could be possible through pure D3D8, but the fix would then need additional logic on top of MS D3D8.

Can the use of pure D3D8 fix the issue with the color of the light in SC2? If this pure-D3D8 approach fixes more things, it could be worth your effort.

EDIT: I know the Komat dll has the same issue, but maybe he is missing something?

No, that's a different story... (the imprecision of old GPUs, which comes from the fixed-point representation of shader variables, would have to be emulated for that; if that's possible at all, it's strongly hardware-dependent).

JJXB wrote:

On the shader generation: is it possible to store and reuse already-generated shader code across launches, to save on said stutters? Or do the generation routines spit out different code for the same scene, making that impossible?
I know some emulators do something similar (async shader generation in the Ishiiruka build of Dolphin), but I don't know how feasible that would be for this use case, since it relies on threaded operation, and the PC shader generation might not make that approach feasible anyway.
Not trying to sound stupid here, just throwing some ideas forward.

No, it's not a stupid approach at all. The main problem with emulation is that exactly which shader is needed only turns out at rendering time (as opposed to non-emulated cases, like native games, where all the needed shaders can be pre-generated before rendering).

Async shader generation on background threads is possible if the shader can be substituted with a general precompiled shader until the specific compiled one is ready. That is not a viable route for D3D8 shaders, because a precompiled general ubershader for them would be way too complicated (and inefficient), so the only solution that improves the situation is caching all compiled shader code forever. But it's still not that simple, because the number of shaders is potentially infinite ( 😀, OK, say, too large), so a limit is needed, meaning some of them must be destroyed after all once the limit is reached.
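The "cache everything, but cap the total" idea can be sketched as a bounded cache. Again an illustrative toy under assumptions, not dgVoodoo internals: keys and bytecode are plain strings here, and the eviction policy (least recently used) is my choice, not necessarily what dgVoodoo would pick.

```cpp
#include <cassert>
#include <list>
#include <string>
#include <unordered_map>

// Illustrative bounded shader-bytecode cache: keep every compiled shader
// until a size limit is hit, then evict the least recently used entry.
class BoundedShaderCache {
public:
    explicit BoundedShaderCache(size_t capacity) : capacity_(capacity) {}

    // Returns cached bytecode, or compiles (via the supplied function) on miss.
    template <typename CompileFn>
    const std::string& get(const std::string& key, CompileFn compile) {
        auto it = index_.find(key);
        if (it != index_.end()) {
            // Cache hit: move the entry to the front (most recently used).
            lru_.splice(lru_.begin(), lru_, it->second);
            return it->second->second;
        }
        if (index_.size() >= capacity_) {
            // Limit reached: drop the least recently used entry.
            index_.erase(lru_.back().first);
            lru_.pop_back();
        }
        lru_.emplace_front(key, compile(key));
        index_[key] = lru_.begin();
        return lru_.front().second;
    }
    size_t size() const { return index_.size(); }

private:
    size_t capacity_;
    std::list<std::pair<std::string, std::string>> lru_;
    std::unordered_map<std::string,
        std::list<std::pair<std::string, std::string>>::iterator> index_;
};
```

The trade-off Dege describes shows up directly: an evicted shader that is needed again must be recompiled, so a too-small limit just moves the stutter around rather than removing it.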

Reply 2796 of 3949, by Dege

User metadata
Rank l33t
SeraphTC wrote:

(Incidentally, I tried to set windowed mode using the settings in the ini file instead of forcing, but with dgVoodoo in place the game would not display correctly - most of the window was black whilst the game was horizontally 'squashed' to about a quarter of the width of the window).

SC1 always gets dropped to windowed mode; it's a known issue. The game exe has to be patched to avoid that (it's incompatible with DXGI).

SeraphTC wrote:

Is it possible, either with dgVoodoo or with the game's ini files, to specify the position of the window when the application starts? It would be great to be able to have it start up centred on my screen rather than near the left edge and halfway off the bottom!

No, it's currently impossible through dgVoodoo.

---
@VirtuaIceman & Others: thanks for the reports! When I switch to bugfixing mode then I'll check them!

Reply 2797 of 3949, by gameragodzilla

User metadata
Rank Newbie
Dege wrote:

No, it's not a stupid approach at all. The main problem with emulation is that exactly which shader is needed only turns out at rendering time (as opposed to non-emulated cases, like native games, where all the needed shaders can be pre-generated before rendering).

Async shader generation on background threads is possible if the shader can be substituted with a general precompiled shader until the specific compiled one is ready. That is not a viable route for D3D8 shaders, because a precompiled general ubershader for them would be way too complicated (and inefficient), so the only solution that improves the situation is caching all compiled shader code forever. But it's still not that simple, because the number of shaders is potentially infinite ( 😀, OK, say, too large), so a limit is needed, meaning some of them must be destroyed after all once the limit is reached.

Is it possible to increase the limit so that at least an entire level's worth of shaders is cached up front at load time, so there's no stutter for at least that level?

Sorry if it sounds like I'm talking out of my ass.

Reply 2798 of 3949, by VicRattlehead

User metadata
Rank Newbie

I have a question related to the flickering issue I have in Sacrifice but it's also about dgvoodoo2 in general. If a visual bug is observed in a game using the WARP renderer, does that rule out any possibility of the bug being hardware or driver-specific? I've been trying to determine if the aforementioned flickering is specific to my GPU (AMD HD 6670) or driver (Catalyst 15.7) because I find it odd that I can't find any posts on the internet mentioning the bug (well, it is rather subtle after all.) So I tried the WARP renderer, and with that renderer I observed the same flickering issue.

Reply 2799 of 3949, by lowenz

User metadata
Rank Oldbie

WARP is not a "true" reference implementation, so the answer is "maybe" 😁

Try WARP without dgVoodoo2 (remove the dgVoodoo2 dll, disable the GPU in the control panel, and launch the game).